Patch: Setting pan, chorus and reverb in MIDI


From: Heikki Tauriainen
Subject: Patch: Setting pan, chorus and reverb in MIDI
Date: Fri, 20 Sep 2013 22:01:56 +0300
User-agent: Internet Messaging Program (IMP) H3 (4.1.3)

Hi,

The discussion thread on lilypond-user a few weeks ago ("can I set
panning in midi", started by Karl Hammar on 6 September, about the
options that are currently [not] available for adjusting the pan
position of MIDI channels in LilyPond-generated MIDI files) came so
close to thoughts I've had about more explicit control over MIDI
parameters from within LilyPond source files that I could no longer
resist studying the source code to try to learn more about how the
interaction between context properties and translators works.

As a result, I have a series of patches to add functionality to control
additional MIDI parameters (besides the MIDI instrument) directly from
within the LilyPond source (namely, pan position, and reverb and chorus
effect levels).  I post and describe them here for anyone else who might
be interested.

(About a year ago the project team was kind enough to accept some MIDI
event handling patches of mine into the official sources and to create an
issue for tracking their status – should this happen again, I'm happy to
answer any questions about what I've tried to do.)


In short, these patches will add the following three context properties
for controlling additional MIDI parameters:

    Staff.midiPanPosition   [an integer from 0 to 127]
         Pan position (0 = hard left, 64 = center, 127 = hard right)
    Staff.midiReverbLevel   [an integer from 0 to 127]
         Reverb effect level (0 = no effect, 127 = full effect)
    Staff.midiChorusLevel   [an integer from 0 to 127]
         Chorus effect level (0 = no effect, 127 = full effect)

The properties can be changed like Staff.midiInstrument (using \set).
Just like MIDI instruments, these properties are MIDI channel specific:
which channel is actually affected in a given context depends on the
midiChannelMapping mode.  For simplicity, the integer values are passed
directly into the generated MIDI files without any calculations on them
(apart from range checking): for example, I didn't implement any scaling
of the values (which admittedly differs from how MIDI volume is handled
with midiMinimumVolume and midiMaximumVolume).

A simple example which demonstrates changing the pan position (whether
the effect can actually be observed will of course depend on the
capabilities of the MIDI synthesizer; for example, timidity handles the
pan effect just fine; I've also successfully tested the reverb and chorus
effects with FluidSynth + qsynth + pmidi):

====

\version "2.17.26"

\score {
  <<
    \new Staff \with { midiInstrument = #"oboe" } {
      \new Voice <<
        { c'1.~ c'1. }
        {
          \set Staff.midiPanPosition = #0 s1
          \set Staff.midiPanPosition = #64 s1
          \set Staff.midiPanPosition = #127 s1
        }
      >>
    }
    \new Staff \with { midiInstrument = #"oboe" } {
      \new Voice <<
        { g1.~ g1. }
        {
          \set Staff.midiPanPosition = #127 s1
          \set Staff.midiPanPosition = #64 s1
          \set Staff.midiPanPosition = #0 s1
        }
      >>
    }
  >>
  \midi { }
}

====

I divided the changes into a series of four separate patches; here's a
description of each patch with a link to the corresponding diff (against
development version 2.17.26):


1. First, a small fix to a possible mistake in a default value
   definition in ly/performer-init.ly, where it looks like the Score
   context is (to my understanding) supposed to define a default MIDI
   instrument to use for any MIDI channel for which no instrument has been
   specified explicitly in the input .ly file.

   Namely, instead of defining a value for the "midiInstrument" property
   (which controls the current instrument in MIDI), the context defines
   the "instrumentName" property, which seems rather odd, since the latter
   property is relevant only (?) for layout.  (Also, the string to which
   the instrumentName property is initialized matches one of the MIDI
   instrument names listed in the Notation Reference.)

   I think this could be the reason why generating MIDI output from a
   LilyPond source file that contains no explicit definition for
   midiInstrument produces a MIDI file that includes no Program Change
   events; depending on the synthesizer used, such a file could then be
   heard played in different ways (instrument-wise) if it's played after
   another file which leaves the channel instrument settings at values
   that differ from their (synthesizer) defaults.

   The patch simply changes the name of the property in
   ly/performer-init.ly (a rough sketch of the change is shown after this
   item).  Note that the following patches assume this patch has already
   been applied: in particular, the final patch will otherwise fail to
   apply in part since it touches the same section of code.
   (Apart from this point, the other patches do not depend on this one in
   any other way, so this patch can easily also be skipped if I've
   misinterpreted the purpose of the property definitions in
   ly/performer-init.ly.)

   ----
   Fix definition of default MIDI instrument in performer-init.ly
   <http://koti.welho.com/htauriai/lilypond/default-midi-instrument.diff>
   ----
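
   For illustration, here's a minimal sketch of the renaming (not the
   actual diff; the instrument string below is only a placeholder for
   whatever default value the existing file uses, and the surrounding
   context definition is abbreviated):

       \context {
         \Score
         %% before:  instrumentName = #"bright acoustic"
         %% after:
         midiInstrument = #"bright acoustic"
       }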


2. The second patch defines new Audio_items and Midi_items for controlling
   MIDI parameters without yet adding any code which actually uses them.

   (I decided to follow the practice of defining each concrete item as a
   distinct C++ type; however, since the parameters are handled so
   similarly, the distinct types could easily be combined into a single
   class representing any MIDI controller change – storing also the
   controller number in the items as a member – to reduce the duplication
   of nearly identical code.  A rough sketch of such a combined type
   follows after this item.)

   ----
   New data types for controlling pan position, reverb and chorus levels
   <http://koti.welho.com/htauriai/lilypond/audio-and-midi-items.diff>
   ----
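
   To illustrate the combined-class idea, here is a hedged, self-contained
   C++ sketch (it is not the actual patch, which follows LilyPond's
   Audio_item/Midi_item hierarchy instead; the type and member names below
   are made up for illustration):

       #include <string>

       struct Midi_control_change_sketch
       {
         int channel_;  // MIDI channel, 0..15
         int control_;  // controller number: 10 = pan, 91 = reverb, 93 = chorus
         int value_;    // controller value, 0..127

         // Raw bytes of the corresponding MIDI control change message:
         // status byte 0xB0 | channel, then the controller number and value.
         std::string to_string () const
         {
           std::string str;
           str += static_cast<char> (0xB0 | (channel_ & 0x0F));
           str += static_cast<char> (control_ & 0x7F);
           str += static_cast<char> (value_ & 0x7F);
           return str;
         }
       };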


3. The third patch is needed to make it possible to register event
   listeners from performers (or translators) without using the
   DECLARE_TRANSLATOR_LISTENER, IMPLEMENT_TRANSLATOR_LISTENER, and
   ADD_TRANSLATOR macros.  The patch declares the functions
   Translator::connect_to_context and Translator::disconnect_from_context
   virtual (making the functions virtual is actually already suggested in
   a comment in the original code); a sketch of how a performer could then
   override them follows after this item.

   The need for this change comes from the straightforward idea of
   handling changes to MIDI parameters with an event listener which would
   respond to some appropriate type of stream events generated by \set
   commands.  However, implementing such a listener turned out not to be
   as straightforward as I'd first thought (a report of my experiences
   follows):

   * Looking at the listener registration code in lily/context.cc, I
     gathered that \set commands in an input file would possibly generate
     "SetProperty" events when the file is processed.  However, I soon
     found out that the name of this event class cannot be used
     successfully as an event class ID in the above-mentioned macros,
     since these macros (in effect) assume that all event class names end
     with the suffix "-event": this suffix always gets appended to the
     macro parameter representing the ID of the event class to listen to
     (when registering a listener for the event in the
     Translator::add_translator_listener function).

   * My first attempt was to try to work around this limitation by
     extending Translator::add_translator_listener to support all event
     classes defined in scm/define-event-classes.scm.  In this way I could
     use the DECLARE_TRANSLATOR_LISTENER, IMPLEMENT_TRANSLATOR_LISTENER,
     and ADD_TRANSLATOR macros to obtain a listener which seemed to
     respond to SetProperty events as I'd wanted, and even compile a
     working executable: however, for some reason which I haven't yet
     completely understood, the use of the "SetProperty" event class name
     broke the build of the internals manual (with a Texinfo error message
     about an undefined reference to a node called "SetProperty").

   * This led me to suspect that there might be something special about
     some of the event classes defined in scm/define-event-classes.scm
     (such as SetProperty and UnsetProperty) so that, in the worst case,
     these events might not actually be meant to be listened to at all in
     custom translators.  In this case I think that the patch probably
     shouldn't be accepted since there is a good chance that I've broken
     something with my modifications...

     (The Internals manual mentions SetProperty and UnsetProperty only in
     a [Scheme] list of event classes, but has no other documentation
     about them; instead, there is more documentation about entities
     called PropertySet and PropertyUnset [note the different spelling],
     which however do not match any event class names.  Do these
     identifiers have anything at all to do with each other?)

   * The failure to use the TRANSLATOR_LISTENER macros to implement a new
     performer which would detect MIDI property changes as they occur in
     the input event stream finally led me to implement the listener using
     the "lower level" DECLARE_LISTENER, IMPLEMENT_LISTENER, and
     GET_LISTENER macros (lily/include/listener.hh), similarly to how they
     are used to implement listeners for the property
     set/unset/override/revert events in lily/context.cc.  However, since
     I thus had to bypass the Translator interface for registering
     listeners, I needed to be able to override the implementations of the
     Translator::connect_to_context and
     Translator::disconnect_from_context functions – hence this patch.

   Why then the need to listen to SetProperty events in the first place –
   couldn't the MIDI parameter changes be handled by listening to some
   other type of events (such as note events) instead, similarly to how
   changes in MIDI instruments are already handled?  This is because I
   wanted to see whether I could implement support for constructs such as
   those used in the above example, where MIDI parameters may vary even
   between individual notes.  (This could possibly also be useful if one
   were to try to implement more fine-grained control over MIDI volume for
   (de)crescendos.)  Since there's no note event between the tied notes, a
   performer listening to such events wouldn't be able to react to the
   change in the MIDI parameter at the proper moment in time.  (Maybe the
   same end result could alternatively be achieved by listening also to
   skip events.  Hmm.)

   ----
   Allow customizing implementations of Translator::connect_to_context and
   Translator::disconnect_from_context
   <http://koti.welho.com/htauriai/lilypond/translator.diff>
   ----
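
   To make the idea concrete, here is a rough sketch (not the actual diff)
   of the essence of the change and of how a derived performer could use
   it; the class name Midi_effect_performer comes from patch 4 below, and
   the listener-registration details are only indicated in comments since
   the exact macro usage lives in the patch itself:

       // lily/include/translator.hh: the two hooks become virtual.
       class Translator
       {
       public:
         // ...
         virtual void connect_to_context (Context *c);
         virtual void disconnect_from_context (Context *c);
         // ...
       };

       // A performer can then override them to register its own listener,
       // bypassing the "-event" suffix hard-coded into the
       // TRANSLATOR_LISTENER macros.
       class Midi_effect_performer : public Performer
       {
       public:
         virtual void connect_to_context (Context *c)
         {
           Performer::connect_to_context (c);
           // ...register a listener for "SetProperty" stream events here,
           // analogously to the listeners set up in lily/context.cc...
         }

         virtual void disconnect_from_context (Context *c)
         {
           // ...remove the listener registered above...
           Performer::disconnect_from_context (c);
         }
       };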


4. The final patch in this set adds code to introduce new context
   properties for changing the additional MIDI parameters, and to actually
   handle them in the input (by defining a new performer which will listen
   to SetProperty events as described above).

   Again, my first attempt at doing this was to try to make
   Staff_performer itself listen to SetProperty events and then add the
   relevant new Audio_items to the performer's Audio_staff objects in
   Staff_performer::acknowledge_audio_element, just as is done for note
   events.  (Handling the property change events requires determining the
   MIDI channel similarly to MIDI instrument changes, and access to an
   Audio_staff, hence the interest in
   Staff_performer::acknowledge_audio_element.)

   In the end (likely due to my still very poor understanding of the
   interaction of the various functions that get called in the translation
   process), I wasn't able to get this to work since I couldn't get
   Staff_performer to acknowledge audio elements that it had announced
   itself.

   For this reason, I created a new Midi_effect_performer class whose job
   is to announce the MIDI controller changes: just as with note events,
   Staff_performer seems to have no problems acknowledging elements
   announced by this separate performer (after adding it to every Staff
   context defined in ly/performer-init.ly).  As a result, however, the
   handling of the property changes is distributed between
   Midi_effect_performer and Staff_performer – I still wonder whether
   this could somehow be avoided.

   Furthermore, I am open to any suggestions about better default values
   to use for the MIDI reverb and chorus levels (in ly/performer-init.ly)
   without altering previous behaviour too much (if this behaviour has
   been well-defined).  The patch initializes both levels to their
   "50 %" values by default; a small usage sketch for overriding these
   defaults per staff follows after this item.

   ----
   Add support for setting MIDI pan position, reverb level, and chorus
   levels
   <http://koti.welho.com/htauriai/lilypond/handle-midi-effects.diff>
   ----
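
   As a small usage sketch (assuming the patches above are applied; the
   instrument and the level values are arbitrary), the defaults could be
   overridden per staff like this:

       \version "2.17.26"

       \score {
         \new Staff \with { midiInstrument = #"clarinet" } {
           \set Staff.midiReverbLevel = #0
           \set Staff.midiChorusLevel = #100
           c'1
         }
         \midi { }
       }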


In summary, this series of patches appears to succeed in implementing the
new functionality (I've personally used it with success), but there might
still be room for many simplifications in the implementation.

Also, the changes made in the patches will alter the output of some of the
existing regression tests since the contents of generated MIDI files will
differ from before (the files will have additional controller events
to set default initial values for the new MIDI parameters, and possibly
also additional Program Change events to set the MIDI instrument to the
default one if it was not originally specified in the input file).  I've
run comparisons for the regression tests: besides these additions, the
only other changes I could see were some shuffling of MIDI events occurring
at the same moment in time, which I believe is harmless (as long as any
new "note on" events still appear last among these events).


Thank you for your attention,
Heikki Tauriainen




