I think that the limits of the MIDI protocol in music applications are
essentially three:
1) the data format is limited to 7-bit integers, giving a range of only
128 steps per parameter (very coarse for most computer music
applications); the usual workaround, pairing two controllers into one
coarse/fine value, is sketched just after this list.
2) the slow hardware transfer rate of 31250 bps, which translates into a
theoretical maximum bandwidth of about 1000-1500 messages per second (in
real cases much lower): with 10 bits per byte on the wire and 3 bytes in
a typical channel message, 31250 bps gives roughly 1000 messages per
second, somewhat more with running status. This rate can be enough for
keyboard solos, but it is completely insufficient for complex gestural
playing of instruments like woodwinds, strings and guitars.
3) a limited number of channels (16) and message types (128 control
changes, aftertouch, note on/off, etc.). Most control messages are
global to the channel; only polyphonic aftertouch is provided for
changing a value independently for each note on the same channel.
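
The first limit can be overcome with the standard MIDI coarse/fine
controller pairing: two 7-bit control changes are read separately and
combined into one 14-bit value. Here is a minimal orchestra sketch of
the principle (controller numbers 7 and 39 on channel 1 are only an
arbitrary example pairing, not something fixed by the opcodes):

        instr 1
kmsb    ctrl7   1, 7, 0, 127         ; coarse controller (MSB)
klsb    ctrl7   1, 39, 0, 127        ; fine controller (LSB)
k14     =       kmsb * 128 + klsb    ; single 14-bit value, 0 - 16383
        printk  0.1, k14             ; print it ten times per second
        endin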
To overcome the first limit I implemented some Csound opcodes, such as
14-bit and 21-bit data slider controls. Within a short time I will
release a Visual Basic program (I'm sorry for the owners of other
platforms) which can communicate via MIDI with these opcodes, giving the
user a GUI for moving sliders with the mouse in real time. This program
will support 14-bit data as fixed-point floats.
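
To give an idea of the intended use, here is a rough sketch in the
ctrl14 style (the exact opcode names and argument order in the release
may differ): the GUI program sends each slider as an MSB/LSB controller
pair, and the orchestra reads the pair back as one high-resolution
value, here mapped to a filter cutoff:

        instr 1
  ; controllers 7 (MSB) and 39 (LSB) on channel 1 read as one 14-bit
  ; value, rescaled to 100 - 5000 Hz (numbers are arbitrary examples)
kcut    ctrl14  1, 7, 39, 100, 5000
asig    oscil   10000, 220, 1        ; f1 must be a sine table in the score
afilt   tone    asig, kcut
        out     afilt
        endin

Since each slider still travels as ordinary control change messages,
this works over any standard MIDI interface and driver.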
Gabriel Maldonado
http://www.agora.stm.it/G.Maldonado/home2.htm