lilypond-user

Re: Bug in articulate.ly


From: Flaming Hakama by Elaine
Subject: Re: Bug in articulate.ly
Date: Fri, 27 Feb 2015 17:14:09 -0800


From: "H. S. Teoh" <address@hidden>

On Wed, Feb 25, 2015 at 08:58:07AM +1100, Peter Chubb wrote:
[...]
> BTW, articulate was developed as a hack for the Artemis robot
> instrument challenge; changes in Lilypond since then mean that it's
> rather out of date.  Some of the functionality is now already in the
> Lilypond C++ core (shortening non-legato notes); it'd be nice to clean
> it up.  Especially as at the time I taught myself scheme and lilypond
> internals enough to create the script; there are lots of things that
> are sub-optimal.  So it really needs a complete rewrite, using some of
> the ideas, but not much of the code.
[...]

I'm very interested in this! While I know that "lilypond is not a
sequencer" and all that, I'd like to be able to leverage lilypond's IMO
superior representation of music to drive nice (or rather, just
tolerable?) performances of my pieces. I'm willing to write my own
scripts, etc., to achieve what I want, but if much of the functionality
is already available in the articulate script, that would save me a
lot of work.

I've recently been using lilypond with articulate.ly to produce MIDI versions of pieces, and I have to say that, despite its limitations, I've been getting results that are not terrible.

Among other things, the outcome depends on the virtual instruments you use to render the sound.  Bad instruments will never sound good.

Decent results are only possible if you include a lot of detailed articulations and dynamics, perhaps more than you would put in a part intended for a human (though some schools of thought say that level of detail is good for printed parts, too).

Also, it requires tweaking the parameters in articulate.ly to match the style of music.  (To get optimal results, these will likely have to vary for each different virtual instrument.)
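To make the idea of per-instrument tuning concrete, here is a minimal Python sketch of keeping one table of duration factors per virtual instrument.  This is purely illustrative: articulate.ly itself is Scheme, and the names and values below are invented, not its actual variables.

```python
# Hypothetical per-instrument articulation tables (illustration only;
# not articulate.ly's actual parameters or API).
# Each value is the fraction of the written duration that actually sounds.
FACTORS = {
    "strings_legato_patch": {"staccato": 0.40, "normal": 0.85, "tenuto": 1.0},
    "piano_patch":          {"staccato": 0.50, "normal": 0.80, "tenuto": 0.95},
}

def sounding_duration(instrument, articulation, written_beats):
    """Scale a written duration by the instrument-specific factor,
    falling back to that instrument's 'normal' factor."""
    table = FACTORS[instrument]
    factor = table.get(articulation, table["normal"])
    return written_beats * factor

# A staccato quarter note (1 beat) on the piano patch sounds for 0.5 beats,
# while the same note on the strings patch sounds for only 0.4 beats.
```

The point is just that the factors live in a per-instrument table, so tuning one patch does not disturb another.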

If you do all that, you can certainly get tolerable representations of scores.


A few things I noticed while using it:

I had to add a "fullValue" articulation definition, since articulate.ly expects tenuto to mean full value and also uses that definition for how long to hold tied notes.  This was straightforward to change, so now I can set tenuto notes to less than full value while still keeping tied notes at full value.

It treats tied notes as two notes.  So, given something like

  d4-. d-. d-. r | r8 d4-. d8-. ~ d d4-. r8

with a staccato setting of 50%, the quarter notes will all sound at 50% of their written duration.  But the 8th tied to an 8th sounds at 100% for the first note (because it is tied), and then at whatever percentage an unmarked note would be held.  If that unmarked setting were 80%, the tied pair would sound at 90% of its combined value.
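To check the tie arithmetic, here is a plain-Python sketch of the behavior described above (an illustration of the math, not articulate.ly's code): every note of a tied chain except the last is held at 100%, and the last is held at the unmarked factor.

```python
# Illustration of the tie behavior described above (not articulate.ly's code).
UNMARKED = 0.8   # unmarked notes sound for 80% of their written value

def tied_group_sounding_fraction(durations):
    """durations: written lengths (in beats) of a chain of tied notes.
    Every note but the last is held at 100%; the last is held at the
    unmarked factor.  Returns the sounding fraction of the combined value."""
    written = sum(durations)
    sounding = sum(durations[:-1]) + durations[-1] * UNMARKED
    return sounding / written

# 8th tied to 8th: the first half sounds fully, the second at 80%,
# so the pair sounds for 90% of its combined written value.
print(tied_group_sounding_fraction([0.5, 0.5]))
```

With these numbers the result is 0.9, matching the 90% figure above.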

The main thing a more sophisticated script could add is support for trigger notes (notes below the playable range of the instrument, often called keyswitches), which virtual instruments use to select articulations.  This would require turning notes into chords, or adding notes to existing chords, whenever the articulation changes.  The mapping of particular notes to particular articulations would need to be configurable per staff (or voice?), since there is no standard.
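The keyswitch idea above can be sketched in a few lines of Python.  Everything here is hypothetical: the pitch numbers, the articulation names, and the mapping are invented for illustration, since (as noted) there is no standard and the real mapping would be configured per staff.

```python
# Hypothetical sketch of keyswitch ("trigger note") insertion as described
# above -- not an existing articulate.ly feature.  Pitches are MIDI note
# numbers; the articulation -> keyswitch mapping is purely illustrative.
KEYSWITCHES = {"staccato": 24, "legato": 25, "pizzicato": 26}  # below range

def add_keyswitches(notes):
    """notes: list of (pitches, articulation) pairs.  Whenever the
    articulation changes, prepend the matching keyswitch pitch, turning a
    single note into a chord (or extending an existing chord)."""
    out, current = [], None
    for pitches, artic in notes:
        if artic != current and artic in KEYSWITCHES:
            pitches = (KEYSWITCHES[artic],) + tuple(pitches)
            current = artic
        out.append((tuple(pitches), artic))
    return out

melody = [((60,), "staccato"), ((62,), "staccato"), ((64,), "legato")]
# Keyswitch 24 joins the first note and 25 joins the third; the second
# note is left alone because its articulation did not change.
```

In a real implementation the KEYSWITCHES table would be a per-staff (or per-voice) setting, since every sample library uses different trigger pitches.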


Thanks for your contributions!

David Elaine Alt
415 . 341 .4954                                           "Confusion is highly underrated"
address@hidden
self-immolation.info
skype: flaming_hakama
Producer ~ Composer ~ Instrumentalist
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
