Many thanks for your thoughtful and detailed explanations. It now looks more
likely that the Quasimodo UI class could be used to implement my Java kernel;
it certainly seems worth investigating. When I perused the sources, I thought
UI had something to do with a GUI, but now it seems to be just the API for
Csound.
The threading issues seem tractable to me. Either Quasimodo can handle all of
this thread management internally, or we could break its threads out to Java
and let only Java spawn and destroy them. Perhaps you could define classes
that expose the thread routines but not thread creation and destruction; your
concrete UI class would then handle creation and destruction, and a Java
version of the concrete UI class could do the same.
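To make that concrete, here is a very rough sketch of the kind of split I have
in mind. Every name in it (KernelThreadRoutines, JavaThreadHost, the three
routine methods) is hypothetical; the real division of labor would depend on
how Quasimodo's threads are actually organized.

// Hypothetical sketch only: none of these names exist yet. The kernel
// exposes the bodies of its worker threads as plain routines; only the
// host (here, Java) ever creates or destroys the threads themselves.
interface KernelThreadRoutines
{
    void runDspSimulator();    // body of the DSP simulator thread
    void runRealTimeEvents();  // body of the real-time event thread
    void requestStop();        // asks both routines to return
}

public class JavaThreadHost
{
    private final KernelThreadRoutines routines;
    private Thread dspThread;
    private Thread eventThread;

    public JavaThreadHost(KernelThreadRoutines routines)
    {
        this.routines = routines;
    }

    // Only Java spawns the threads...
    public void start()
    {
        dspThread = new Thread(new Runnable() {
            public void run() { routines.runDspSimulator(); }
        }, "dsp-simulator");
        eventThread = new Thread(new Runnable() {
            public void run() { routines.runRealTimeEvents(); }
        }, "real-time-events");
        dspThread.start();
        eventThread.start();
    }

    // ...and only Java tears them down again.
    public void stop() throws InterruptedException
    {
        routines.requestStop();
        dspThread.join();
        eventThread.join();
    }
}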
I welcome your interest in my proposal. I will try to make a rough cut of
integrating Quasimodo into my design, modifying the design if it seems
reasonable, and get it up on the list soon.
In the meantime, I have modified my design quite a bit after going over and
over the JavaSound APIs. I have appended the new design, which is still not
quite complete. At the lowest level, the actual native interfaces are all
declared in the following class (and there are not many native methods after
all); on the native side, this becomes a straight C++ class in the kernel code.
/**
 * An implementation of SynthesizerKernel
 * based on a platform-independent,
 * re-entrant, multiply instantiable,
 * double-precision rewrite of Csound with
 * plugin unit generators and function tables.
 */
public class CsoundKernel extends SynthesizerKernel
{
    public native long createKernel();
    public native void destroyKernel(long kernel);
    public native boolean read(long kernel, String filename);
    public native boolean compile(long kernel, String xml);
    public native void sendCommand(long kernel, String command);
    public native int loadAllPlugins(long kernel, String directory);
    public native int loadPlugin(long kernel, String filename);
    public native boolean midiSendOpen(long kernel);
    public native boolean midiSendClose(long kernel);
    public native boolean isMidiSendOpen(long kernel);
    public native void midiSend(long kernel, byte[] midiSendBytes);
    public native boolean midiReceiveOpen(long kernel);
    public native boolean midiReceiveClose(long kernel);
    public native boolean isMidiReceiveOpen(long kernel);
    public native synchronized int midiReceive(long kernel, byte[] midiInBytes, Object monitor);
    public native int getAudioSampleFramesPerSecond(long kernel);
    public native void setAudioSampleFramesPerSecond(long kernel, int value);
    public native int getAudioSampleFramesPerControlSample(long kernel);
    public native void setAudioSampleFramesPerControlSample(long kernel, int value);
    public native int getAudioInputChannelCount(long kernel);
    public native void setAudioInputChannelCount(long kernel, int value);
    public native boolean audioInputOpen(long kernel);
    public native boolean audioInputClose(long kernel);
    public native boolean audioInputIsOpen(long kernel);
    public native synchronized int audioInputWrite(long kernel, float[] audioInputBuffer, int offset, int length, Object monitor);
    public native int getAudioOutputChannelCount(long kernel);
    public native void setAudioOutputChannelCount(long kernel, int value);
    public native boolean audioOutputOpen(long kernel);
    public native boolean audioOutputClose(long kernel);
    public native boolean audioOutputIsOpen(long kernel);
    public native int audioOutputRead(long kernel, float[] audioOutputBuffer, int offset, int length);
}
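To show how I picture driving this class from the Java side, here is a rough
usage sketch. The native library name, the control-rate value, and the
semantics of audioOutputRead (blocking, returning zero or less when the
performance is finished) are only my assumptions at this point, not part of
the design.

public class CsoundKernelExample
{
    static {
        // Hypothetical native library name.
        System.loadLibrary("csoundkernel");
    }

    public static void main(String[] args)
    {
        CsoundKernel csound = new CsoundKernel();
        long kernel = csound.createKernel();
        try {
            csound.setAudioSampleFramesPerSecond(kernel, 44100);
            csound.setAudioSampleFramesPerControlSample(kernel, 16);
            csound.setAudioOutputChannelCount(kernel, 2);
            if (!csound.read(kernel, args[0])) {
                System.err.println("Could not read " + args[0]);
                return;
            }
            csound.audioOutputOpen(kernel);
            float[] buffer = new float[4096];
            int framesRead;
            // Assumption: audioOutputRead blocks until audio is available
            // and returns zero or less when the performance is finished.
            while ((framesRead = csound.audioOutputRead(kernel, buffer, 0, buffer.length)) > 0) {
                // Hand the samples to a JavaSound SourceDataLine, write
                // them to a file, etc.
            }
            csound.audioOutputClose(kernel);
        } finally {
            csound.destroyKernel(kernel);
        }
    }
}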
One thing I have done is to make the synthesizer interface abstract, so that
it could be implemented by any number of different synthesizer "kernels", such
as a rewrite of Csound, Quasimodo, or SAOL. In other words, my proposal is
for an adapter class that fits general-purpose, academic, powerful software
synthesizers seamlessly into the JavaSound framework, which, the more I work
with it, seems reasonably flexible and useful. Only time will tell whether the
implementation of the JavaSound APIs lives up to their good appearance.
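As a rough illustration of the adapter idea, the abstract base might look
something like the sketch below. The methods shown are placeholders borrowed
from CsoundKernel above, not a final interface, and QuasimodoKernel is purely
hypothetical.

// Placeholder sketch of the abstract base class; the real interface is
// still in flux.
public abstract class SynthesizerKernel
{
    public abstract long createKernel();
    public abstract void destroyKernel(long kernel);
    public abstract boolean compile(long kernel, String xml);
    public abstract int audioOutputRead(long kernel, float[] audioOutputBuffer, int offset, int length);
}

// A second kernel adapter, say one wrapping Quasimodo, would extend the
// same base, so the JavaSound-facing code never needs to know which
// engine is underneath:
//
//     public class QuasimodoKernel extends SynthesizerKernel { ... }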
From: Paul Barton-Davis
To: Michael Gogins
Cc: csound@maths.ex.ac.uk; music-dsp@shoko.calarts.edu; qm-dev@exp.firstpr.com.au
Date: Sunday, June 27, 1999 12:53 PM
Subject: Re: Proposal for a new version of Csound
>>with Gtk and other Linux type stuff, it's not clear where, if at all, the
>>GUI can be severed from the kernel.
>
>It's as completely separate as I could make it. There are just two
>"connections":
>
>1) in nmain.cc:
> UI *ui;
>
> if ((ui = ui_factory (&argc, &argv)) == NULL) {
> error << "Cannot start Quasimodo UI" << endmsg;
> return 1;
> }
>
>UI is an abstract class that declares the following functions:
>
> virtual bool running () = 0;
> virtual void quit () = 0;
> virtual void request (RequestType) = 0;
> virtual ModuleSet &default_module_set () = 0;
> virtual void new_module_set () = 0;
> virtual int nstate_query (size_t nstates, const char *prompt, ...) = 0;
> virtual void queue_error_message (Transmitter::Channel, const char *) = 0;
> virtual void run (Receiver *) = 0;
>
>Quasimodo is written so that the code to handle various tasks runs in a
>different thread than the DSP simulator. This is the key to making it
>able to use multiple processors. Although the DSP thread could be
>thought of as the Csound kernel, that's not really true in Quasimodo:
>to implement all that Csound does, Quasimodo uses 3 threads: the DSP
>simulator, the real-time event thread, and some other thread (which
>might be running the UI) to handle input/loading and parsing of scores
>and orchestras.
>
>This doesn't look like a single function, and if you used it for Java,
>you'd presumably need a Java wrapper that invoked the 2 "other"
>threads, and took care of the tasks currently done by the UI thread.
>
>It's going to get more complex too: I am about to change things so that
>there is one DSP thread per ModuleSet, and then a final (quick)
>mixdown of the output of each DSP before it goes off to the
>soundcard. This allows me to use N>2 CPU systems efficiently, and even
>N=2 CPUs *more* efficiently, since when Quasimodo is running, it's
>almost always just the DSP thread that's active, leaving a second
>processor sitting idle until "something else" happens. It also opens
>the door to sound output from Quasimodo coming from things other than
>Quasimodo DSP threads, if you see what I mean.
>
>I don't know how this is sounding so far. I do think that Quasimodo
>could really benefit from trying to be made into something close to
>what you described.
>
>--p