Oct 10

Expressive MIDI Performance

Some say that the key to a realistic-sounding MIDI performance is the micro-quantizing of notes to create a human timing element. The positions of notes are certainly important, and their position relative to a grid depends very much on the style of music you’re writing, just like in real life. But there’s something more important than timing that gives you an expressive, human-sounding performance, and that is the use of expression controllers. At the simplest level this is typically note-on velocity, referred to by most DAWs simply as velocity. Even the most basic virtual instruments understand and react to this controller, varying the volume, and sometimes the timbre, of the instrument being played based on the note-on velocity value. Moving on from velocity, though, we’ll find that more advanced virtual instruments understand and react to other controllers too. Many at least respond to cc1 (modulation) as vibrato, and cc11 (expression) as a continuous volume control.
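To make the distinction concrete, here’s a minimal sketch of what these three controllers look like as actual MIDI messages. I’m using the Python mido library here purely as my own choice of illustration; the key point is that velocity is fixed at the moment a note starts, while cc1 and cc11 are continuous messages that can change while the note sounds.

```python
import mido

# Note-on velocity: a per-note value (0-127), fixed at the instant
# the note starts.
note = mido.Message('note_on', channel=0, note=60, velocity=96)

# cc1 (modulation): most virtual instruments map this to vibrato.
vibrato = mido.Message('control_change', channel=0, control=1, value=64)

# cc11 (expression): a continuous volume control that can be varied
# while the note is sounding, unlike velocity.
expression = mido.Message('control_change', channel=0, control=11, value=100)

# Send them to the default MIDI output (availability depends on your system).
with mido.open_output() as port:
    port.send(expression)  # set a level before the note starts
    port.send(vibrato)
    port.send(note)
```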

Using expression controllers to their best requires an understanding of the programming of the virtual instrument you’re using, an understanding of how the real instrument would work, and an understanding of the style and period of the music you’re writing, as well as of the instruments you’re using. A soprano doesn’t use the same style of vibrato as a classical trumpet player, and a classical trumpet player doesn’t use the same style of vibrato as a jazz solo trumpet player. So in order to use cc1 properly, you need to understand what a real player performing the same music would do, as well as how to interpret that through instructions to your virtual instrument.

Here’s an example of how expression controllers make a big difference – it’s the same example used in my demonstration of a virtual instrument called The Trumpet. In this first sample, I’m using some band instruments from the Kontakt 4 sample library – a grand piano and a trumpet.

I’ve used no expressive data in this performance. Each note has the same note-on velocity value throughout. A pretty horrendous performance, I’m sure you’ll agree.
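If you want to reproduce that kind of flat, inexpressive version from an existing performance, it amounts to overwriting every note-on velocity with a single value. A short sketch, again with mido, where the file names are placeholders of my own:

```python
import mido

# Flatten every note-on to the same velocity, producing the kind of
# "no expressive data" performance described above.
mid = mido.MidiFile('performance.mid')

for track in mid.tracks:
    for i, msg in enumerate(track):
        # A note_on with velocity 0 is really a note-off, so skip those.
        if msg.type == 'note_on' and msg.velocity > 0:
            track[i] = msg.copy(velocity=80)

mid.save('performance_flat.mid')
```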

I can improve this significantly by changing to higher-quality samples. The Kontakt 4 ones came bundled with Kontakt 4… there are lots of them, so it’s not surprising the quality isn’t really up to much. I’ll change to better sample libraries now – NI’s Akoustik Piano and Sample Modeling’s The Trumpet. Still no performance controllers are used (except the sustain pedal on the piano, and a starting expression value on the trumpet, which it requires to make a sound).
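For an instrument like this that stays silent until it receives expression data, the fix is simply to send a cc11 value before the first note-on. The sustain pedal, for its part, lives on cc64 in standard MIDI. A minimal sketch under those assumptions, with the channel assignments being my own illustration:

```python
import mido

with mido.open_output() as port:
    # The trumpet makes no sound at cc11 = 0, so set a starting
    # expression level before the first note arrives.
    port.send(mido.Message('control_change', channel=0, control=11, value=90))
    port.send(mido.Message('note_on', channel=0, note=67, velocity=80))

    # The piano part uses the sustain pedal: cc64, where a value of
    # 64 or above means pedal down, below 64 means pedal up.
    port.send(mido.Message('control_change', channel=1, control=64, value=127))
```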

Wow, what a difference, eh? But this is still a very static performance, despite the fact that there was no quantizing involved – it was simply a live performance in two takes: one for piano, one for trumpet.

If I now restore the original dynamic performance of the instruments (note-on velocity), and add expression data to the trumpet (cc11) as well as vibrato (cc1 – modulation), based on my knowledge of this ballad style of jazz improvisation and on how I know the VI responds to controller data, we get something much closer to a real performance…
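To show what that kind of controller data looks like in practice, here’s a sketch that draws a cc11 swell and a delayed cc1 vibrato rise under a single held note – vibrato arriving late on a long note being one common jazz-ballad gesture. The curve shapes and all the numbers are illustrative assumptions of mine, not values from the actual performance:

```python
import math
import mido

mid = mido.MidiFile(ticks_per_beat=480)
track = mido.MidiTrack()
mid.tracks.append(track)

NOTE_TICKS = 4 * 480   # one held whole note
STEPS = 32             # controller resolution across the note

# Starting expression level, then the note itself.
track.append(mido.Message('control_change', control=11, value=70, time=0))
track.append(mido.Message('note_on', note=67, velocity=90, time=0))

for i in range(1, STEPS + 1):
    t = i / STEPS
    # cc11: swell towards the middle of the note, then ease back off.
    expr = int(70 + 40 * math.sin(math.pi * t))
    # cc1: no vibrato at first, then bring it in over the second half.
    vib = int(max(0.0, t - 0.5) * 2 * 90)
    track.append(mido.Message('control_change', control=11,
                              value=min(expr, 127), time=NOTE_TICKS // STEPS))
    track.append(mido.Message('control_change', control=1,
                              value=min(vib, 127), time=0))

track.append(mido.Message('note_off', note=67, velocity=0, time=0))
mid.save('trumpet_phrase.mid')
```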

…and here’s an alternative version using a harmon mute on the trumpet.