
Re: compiler

Date: 1998-07-07 01:04
From: Michael Gogins
Subject: Re: compiler
>I think this is a great project and one that is quite useful for
>the long-term development of the field.  This is one of the
>goals for the MPEG-4 Structured Audio project as well,
>and I would encourage you to examine and support that
>work for your purpose.

Let me deal with the most important question first. Suppose I took SAOLC,
the reference implementation, and without causing it to cease to implement
MPEG-4, gave it additional opcodes, a realtime scheduler, and a DirectShow
filter graph driver model, so that it would work as a plugin inside Cakewalk
Pro Audio or could be played from a MIDI keyboard. Could I then turn around
and just sell this thing? Or could I license it under the GNU license or the
GNU library license?

I have examined SAOLC from time to time over the past few years, including a
few weeks ago. It doesn't appear to have a phase vocoder or other built-in
time/frequency representation of sound like Csound's pvoc or adsyn. I think
they could probably be done using the existing opcodes; has anyone done it?
Are there any plans for time/frequency opcodes beyond fft and ifft?
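
To make the fft/ifft question concrete, here is a minimal sketch (in C++,
not SAOL, and not taken from SAOLC) of the analysis/resynthesis round trip
that a pvoc-style time/frequency opcode has to perform: window a frame,
transform it, expose magnitude and phase, and transform back. The frame
size, the window, and the naive DFT are illustrative assumptions only; a
real opcode would build on the fft/ifft opcodes or an FFT routine.

    // Sketch only: one analysis/resynthesis frame, with a naive DFT
    // standing in for fft/ifft.  Frame size and window are arbitrary.
    #include <cmath>
    #include <cstdio>

    const int    N  = 64;                 // analysis frame size (illustrative)
    const double PI = 3.14159265358979323846;

    // Naive DFT: O(N^2) but self-contained; a real opcode would use an FFT.
    void dft(const double *x, double *re, double *im)
    {
        for (int k = 0; k < N; ++k) {
            re[k] = im[k] = 0.0;
            for (int n = 0; n < N; ++n) {
                double w = -2.0 * PI * k * n / N;
                re[k] += x[n] * std::cos(w);
                im[k] += x[n] * std::sin(w);
            }
        }
    }

    void idft(const double *re, const double *im, double *x)
    {
        for (int n = 0; n < N; ++n) {
            x[n] = 0.0;
            for (int k = 0; k < N; ++k) {
                double w = 2.0 * PI * k * n / N;
                x[n] += re[k] * std::cos(w) - im[k] * std::sin(w);
            }
            x[n] /= N;
        }
    }

    int main()
    {
        double x[N], re[N], im[N], y[N];

        // Test frame: a sinusoid on bin 5, shaped by a Hann window.
        for (int n = 0; n < N; ++n)
            x[n] = std::sin(2.0 * PI * 5.0 * n / N)
                 * 0.5 * (1.0 - std::cos(2.0 * PI * n / N));

        dft(x, re, im);

        // The magnitude/phase pairs are the time/frequency representation
        // a pvoc- or adsyn-style opcode would let the instrument manipulate.
        for (int k = 0; k < N; ++k) {
            double mag = std::sqrt(re[k] * re[k] + im[k] * im[k]);
            double ph  = std::atan2(im[k], re[k]);
            re[k] = mag * std::cos(ph);       // left unmodified here
            im[k] = mag * std::sin(ph);
        }

        idft(re, im, y);

        // Untouched spectra should resynthesize the frame to rounding error.
        double err = 0.0;
        for (int n = 0; n < N; ++n)
            err += std::fabs(y[n] - x[n]);
        std::printf("total resynthesis error: %g\n", err);
        return 0;
    }

A real pvoc opcode would of course overlap-add successive frames and track
phase between them; the point here is only which representation the opcodes
would have to expose to the instrument.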

>Having an open standard available which has the advanced capabilities you
>desire is a good way to speak to industry
>concerns for this functionality.  There will soon be industry
>development for MPEG-4 support, and it would be a shame
>if you did a lot of work on a Csound COM server only to
>find it didn't work with new, advanced hardware which
>supported MPEG-4 instead.


Actually, the Csound COM server would probably be a DirectShow filter, or a
set of filters, and so could SAOLC or other implementations of MPEG-4.

I am starting to think about how best to structure these filter graphs as a
means of representing input, mixing, synthesis, remixing, and output signal
paths in a standard yet flexible way. I invite input.
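
As a starting point for discussion, here is a hedged sketch of what building
one such signal path might look like through the DirectShow graph-building
interfaces. The graph and pin calls (IGraphBuilder, AddFilter, ConnectDirect,
IMediaControl) are real DirectShow; the Csound synthesis and remix filters
themselves are assumptions, placeholders for filters that do not exist yet.

    // Sketch only: the Csound filters named here are hypothetical.
    #include <dshow.h>
    #include <cstdio>

    // Return the first pin on `filter` with the requested direction.
    IPin *firstPin(IBaseFilter *filter, PIN_DIRECTION wanted)
    {
        IEnumPins *pins = 0;
        IPin *pin = 0, *found = 0;
        if (FAILED(filter->EnumPins(&pins)))
            return 0;
        while (!found && pins->Next(1, &pin, 0) == S_OK) {
            PIN_DIRECTION dir;
            pin->QueryDirection(&dir);
            if (dir == wanted)
                found = pin;             // caller releases
            else
                pin->Release();
        }
        pins->Release();
        return found;
    }

    int main()
    {
        CoInitialize(0);

        IGraphBuilder *graph = 0;
        CoCreateInstance(CLSID_FilterGraph, 0, CLSCTX_INPROC_SERVER,
                         IID_IGraphBuilder, (void **)&graph);

        // Hypothetical filters: a Csound (or SAOL) synthesis filter and a
        // remixing filter.  Neither CLSID exists yet; a real graph would
        // create them with CoCreateInstance just like the graph itself.
        IBaseFilter *synth = 0;   // CoCreateInstance(CLSID_CsoundSynth, ...)
        IBaseFilter *remix = 0;   // CoCreateInstance(CLSID_CsoundRemix, ...)

        if (graph && synth && remix) {
            graph->AddFilter(synth, L"Csound synthesis");
            graph->AddFilter(remix, L"Remix");

            // One explicit signal path: synthesis -> remix.  Input and
            // output filters (wave source, DirectSound renderer) would be
            // added and connected the same way.
            IPin *out = firstPin(synth, PINDIR_OUTPUT);
            IPin *in  = firstPin(remix, PINDIR_INPUT);
            if (out && in)
                graph->ConnectDirect(out, in, 0);

            // Run the whole path from the graph's controlling interface.
            IMediaControl *control = 0;
            graph->QueryInterface(IID_IMediaControl, (void **)&control);
            if (control) {
                control->Run();
                control->Release();
            }
        }

        if (graph) graph->Release();
        CoUninitialize();
        return 0;
    }

The appeal of the graph model is exactly that input, mixing, synthesis,
remixing, and output are each just filters, so the same mechanism covers
every arrangement of them.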

>As you note these things -- all excellent suggestions I think -- you may be
>interested to know that MPEG has recently decided to
>have a second version of MPEG-4.  All of the capabilities I've
>described previously will be in the first version; the second version
>will notably add Java capability and interfaces for manipulating
>the multimedia data.


What does "manipulating the multimedia data" mean? Reading soundfiles and
getting MIDI input in real time? This is what would be gained by having a
realtime scheduler and a filter graph architecture.
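
To be concrete about the realtime scheduler half of that: its core is no
more than a time-ordered event queue that the audio processing loop drains
before computing each block of samples, whether the events come from a score
file or arrive live over MIDI. A minimal sketch, with an invented event
structure and an assumed block size, neither taken from Csound or SAOLC:

    #include <queue>
    #include <vector>
    #include <cstdio>

    // Illustrative event: "start instrument `instr` at time `when`".
    struct Event {
        double when;      // seconds
        int    instr;
        double freq;
    };

    // Order the queue so the earliest event is always on top.
    struct Later {
        bool operator()(const Event &a, const Event &b) const {
            return a.when > b.when;
        }
    };

    int main()
    {
        const double sr = 44100.0, block = 64.0;   // assumed block size
        std::priority_queue<Event, std::vector<Event>, Later> pending;

        // These could come from a score file or a live MIDI port.
        pending.push(Event{0.00, 1, 440.0});
        pending.push(Event{0.25, 1, 660.0});
        pending.push(Event{0.10, 2, 220.0});

        // The audio loop: before rendering each block, dispatch every
        // event whose time has come, then compute `block` samples.
        for (double now = 0.0; now < 0.5; now += block / sr) {
            while (!pending.empty() && pending.top().when <= now) {
                Event e = pending.top();
                pending.pop();
                std::printf("t=%.3f  start instr %d at %.0f Hz\n",
                            now, e.instr, e.freq);
            }
            // ... render one block of audio here ...
        }
        return 0;
    }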