Hi All,

(I'm cross-posting to the blue users list so they know what I'm up to, but if you have technical comments, please respond on the csound-devel list, since this concerns Csound API usage and other developers using the API may have the same interests.)

I was working on some music tonight with blue, with a MIDI keyboard hooked up, and was trying to come up with ways to improve my composing experience. Ideally I'd like to be able to do the things commonly found in MIDI-based environments, which I'd break down as follows:

- The audio engine is always on.
- The user can turn playback of the project on or off; the program sends events to the ongoing instruments and effects that feed audio to the engine.
- All recorded events and event data are held in the host; the host sends the data to the engine.
- Live events are sent to the host; the host sends the data to the engine.

For blue/csound, I am thinking:

- blue leaves the bare orchestra running in the Csound engine, with no note events whatsoever.
- blue holds copies of all note data; when play is pressed, blue, in sync with the Csound engine, sends notes to the engine via the API.
- blue intercepts MIDI events, maps them according to rules given by the user, and sends notes to the Csound engine via the API.
- If an instrument/effect changes, or the topology of the signal graph changes, user intervention is required to restart the engine.

To achieve this, I think blue needs a way to keep the engine always on. A method to turn all notes off will also be necessary: blue can feed notes, but if the user moves the playback head or stops the score, blue can send "all notes off" and Csound can clear the notes queued up on its side. blue would then have to keep its own event list ticking along with Csound's (not too bad, as blue is already calling Csound's performKsmps, and in that loop blue can tick its own event list as well).
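To make the last point concrete, here is a minimal sketch (in Python, not blue's actual Java code) of a host-side event list ticking in lockstep with the engine's k-cycles. The names HostEvent and tick are hypothetical illustrations; the commented-out csound.InputMessage / csound.PerformKsmps calls stand in for where the real API calls would go:

```python
# Hypothetical sketch of a host event list that ticks alongside
# Csound's performKsmps() loop.  Not blue's actual implementation.

SR = 44100      # sample rate assumed for the sketch
KSMPS = 64      # samples per k-cycle

class HostEvent:
    def __init__(self, start, sco):
        self.start = start   # start time in seconds
        self.sco = sco       # score text to send, e.g. "i1 0 2"
        self.sent = False

def tick(events, now):
    """Dispatch every event whose start time has been reached.
    Returns the score statements sent this k-cycle."""
    sent = []
    for ev in events:
        if not ev.sent and ev.start <= now:
            # In the real host this would be an API call, e.g.:
            # csound.InputMessage(ev.sco)
            ev.sent = True
            sent.append(ev.sco)
    return sent

events = [HostEvent(0.0, "i1 0 2"),
          HostEvent(0.5, "i2 0 1"),
          HostEvent(2.0, "i1 0 0.5")]
dispatched = []
now = 0.0

# Stand-in for: while csound.PerformKsmps() == 0: ...
for _ in range(int(1.0 * SR / KSMPS)):   # one second of k-cycles
    dispatched += tick(events, now)
    now += KSMPS / SR                    # advance host clock one k-cycle

# After one second, the events at t=0.0 and t=0.5 have been sent;
# the event at t=2.0 is still pending in the host's list.
```

Stopping playback or moving the playback head would then just mean clearing the pending entries on the host side and sending the all-notes-off to clear Csound's side.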
Because some instruments need to stay always on, I would need a way to turn off all notes for every instrument except those I mark as always-on.

For the concern of live events, there are a couple of approaches I'm thinking of:

- blue intercepts MIDI and simply relays it to Csound. This actually seems rather easy using the API callbacks for MIDI input and output.
- blue intercepts MIDI and maps the data, then sends standard Csound SCO to Csound. This seems much more powerful: users can map MIDI notes to other values via Scala scales, send amp data mapped to different numerical ranges, or use an interface to modify in realtime what other pfield values will be sent (building on the blue convention of template notes as a foundation).

Either way, there will be some need for the user to modify their instruments to work with MIDI or with SCO. My thought is that realtime SCO would be the most flexible, and the basic note template could be kept as part of the instrument definition to make it portable (i.e. download an instrument from blueShare and have the immediate ability to just play it with a MIDI keyboard without any other work).

Now, I'm trying to sort out whether this is all possible already, or whether there's anything that just isn't possible yet. I'm thinking this through as I go along, so please bear with me:

- I could run the project with "f0 3600" and no other notes.
- I could craft an "all notes off" instrument using turnoff2 that hits all the instruments in my project that are not always-on. (Actually, it might be better to make an all-notes-off instrument that works on a single instrument: if the user then disables an instrument in blue, the notes would be turned off for just that instrument, and if a global all-notes-off is necessary, blue can send multiple notes, each with a different instrument number as a pfield.)

Is there anything I'm missing?
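The per-instrument "all notes off" idea might look something like the following sketch (instrument number 999 is an arbitrary choice for illustration; p4 carries the target instrument number):

```csound
; Sketch of a per-instrument "all notes off" instrument.
instr 999
  ; mode 0: turn off all active instances of instrument p4;
  ; release flag 1: let their envelopes release naturally
  turnoff2 p4, 0, 1
endin
```

The host would then send one short event per non-always-on instrument via the API, e.g. "i 999 0 0.1 1", "i 999 0 0.1 2", and so on.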
I figure the way to go about it is:

- Get the host/API stuff set up to send notes in realtime instead of using Csound's internal SCO (these would be the "normal" notes).
- Work with MIDI note data, mapping that information and sending "realtime" notes.
- Work with MIDI controller data, mapping it to blue parameters, which would automatically work since blue parameters map to Csound data via the API.
- Later allow for other inputs, i.e. OSC, user scripts, etc., which starts to bring in more of blueLive and other compositional systems.

I think this will probably be the next big evolution of blue/csound interaction and will build further on the API integration that was introduced with 0.124.0. I think, too, that this would take things as far as one could go with Csound's current design. The next step after that would introduce a finer level of changing instruments in realtime (i.e. insert/remove/reconnect opcodes), which is something we've talked about here on the csound dev list before, but I think that is too much for my brain to handle at the moment. =P

I'd appreciate any advice and thoughts people have to give on this. Just writing through this has already helped to figure out the issues and possible solutions going forward, and I think I may go ahead and start to experiment with this.

Thanks!
steven

_______________________________________________
Csound-devel mailing list
Csound-devel@lists.sourceforge.net