
Re: [Csnd] [CUD] line event problem

Date: 1999-11-04 22:20
From: Stefan Kersten
Subject: Re: [Csnd] [CUD] line event problem
BTW (1),
I followed your really detailed description earlier on the list (thanks
:), and it is no problem at all to work with the v_midi dev and OSS. The
performance however (when running csound with keykit) rather
disappointed me, maybe I have to invest in a faster machine ;)

BTW (2) [OT],
I also fiddled a bit with line events the last few days and still lack a
deeper understanding, thus these questions:
Is there a portable (os independent) way to use files/devices for
interaction between csound and another process? It works fine on Linux
to use an ordinary file as i/o buffer, but not so on Windows. Does there
exist something equivalent to the concept of pipes?
I tried to use this principle while working on a csound-plugin for
netscape, which is intended to react on external events such as
javascript callbacks and thereby create realtime score events. Which
leads me directly to question #2:
I also thought of getting into Csound's internals and using a function to
insert realtime events into the current event list, but as the sources are
*really* extensive ;), I'm quite stuck right now. Maybe someone more
experienced could point me to which functions to use and how, especially
what kind of initialization the different structs (e.g. LEVTBLK and
EVTBLK) need? For now, I'm thinking of 

linevent.c: static void Linsert(LEVTBLK *curp) /* insert blk into the
time-ordered linevent queue */
and
insert.c: int insert(int insno, EVTBLK *newevtp)  /* insert an instr
copy into active list */

but as I stated above I have no clue how to make use of them. The main
idea is to be able to construct bundles of score events which could then
be triggered otherwise.

I'm also sending this to the csound list, since I assume there might be
someone who could help me.

Thanx in advance, Stefan.

Paul Barton-Davis wrote:
> 
> >You da man, Eric ! That did it, now on to making it work with KeyKit.
> >I'm testing Larry Troxler's KeyKit code to drive Csound. As far as I can
> >tell, it's going to require a hard-coded f-table. Can I just use ftgen
> >(or whatever it's called) and put it in the orc ? I'll know soon...
> 
> dave - are you still using OSS ? why not just use the v_midi device,
> and then i don't think you need to do anything in particular to get
> keykit and csound to work together ...

__________________________________________________________________________
K-Labz [a K-Hornz subdivision] - steve-k@gmx.net - http://w3.to/K-Hornz
--

Date: 1999-11-04 23:18
From: Larry Troxler
Subject: Re: [Csnd] [CUD] line event problem
On Thu, 4 Nov 1999, Stefan Kersten wrote:

> BTW (1),
> I followed your really detailed description earlier on the list (thanks
> :), and it is no problem at all to work with the v_midi dev and OSS. The
> performance however (when running csound with keykit) rather
> disappointed me, maybe I have to invest in a faster machine ;)
> 

A faster machine doesn't seem like the right answer, though. Even a slow
machine should be fast enough for MIDI. So it sounds like there's some
scheduling problem somewhere. Maybe the v_midi driver is buggy? 

Larry Troxler


--

Date: 1999-11-05 00:18
From: Michael Gogins
Subject: Re: [Csnd] [CUD] line event problem
I had implemented realtime line events in AXCsound, by hacking as you had
contemplated. It worked fine. I decided to change AXCsound back from my
hacked version of the Csound sources (shared library, re-entrant, etc.) to a
plain old shell again (runs good old consound or Winsound in a separate
process). The reason? I didn't want to have to keep maintaining my code in
synch with the "canonical" csound code. If the Csound community would like
Csound to be a software component with usable hooks, amenable to use in
different contexts such as Java, ActiveX, shared libraries, I've done half
the work and have an idea how to do the rest, but I'm not willing to keep
doing it over and over. I know that John ffitch is not an expert Windows
programmer and does not have an up to date Windows development system, so
perhaps the present situation is understandable, but it is not necessary.

If the community desires to make Csound more usable the following steps can
be taken:

Csound changes from an executable program with a main() function, into a
static library with various functions: sco and orc file readers, audio
input, audio output, command input, score line input, midi event input, midi
event output, console message output. This library is written ENTIRELY in
lowest-common-denominator runtime-library-only C++. The code is re-entrant
and multiply instantiable (several instances of Csound can run inside the
same process or address space, at the same time). The API for Csound
becomes:

//	C S O U N D
//	Copyright (c) 1999 by Michael Gogins. All rights reserved.
//	P U R P O S E
//	Declares a low-level interface for controlling Csound,
//	which is designed to be useful in as many different contexts as possible,
//	such as the JavaSound API, VST 2 plugins, Buzz machines, and others.
//	Therefore, Csound depends ONLY upon the builtin facilities of the ANSI C++ language;
//	the synthesizer is supposed to be COMPLETELY platform-independent.
#ifndef Csound_h
#define Csound_h
#pragma warning(disable: 4786)
#include <string>

//	An abstract object for synchronizing access to objects
//	that are shared by different threads.

class AbstractMonitor
{
public:
	virtual ~AbstractMonitor(){};
	virtual void wait()=0;
	virtual void notifyAll()=0;
	virtual void enterMonitor()=0;
	virtual void exitMonitor()=0;
};

class Csound
{
public:
	virtual ~Csound(){};
    virtual long createKernel()=0;
    virtual void destroyKernel(Csound *kernel)=0;
	virtual void loadPlugins()=0;
    virtual bool read(std::string xmlFile)=0;
    virtual bool start()=0;
    virtual bool pause()=0;
    virtual bool resume()=0;
    virtual bool stop()=0;
    virtual bool commandInputOpen(AbstractMonitor *monitor)=0;
    virtual bool commandInputClose()=0;
    virtual bool commandInputIsOpen()=0;
    virtual void commandInputWrite(std::string command)=0;
    virtual bool midiInputOpen(AbstractMonitor *monitor)=0;
    virtual bool midiInputClose()=0;
    virtual bool midiInputIsOpen()=0;
    virtual int midiInputWrite(signed char *event, int size)=0;
    virtual bool midiOutputOpen(AbstractMonitor *monitor)=0;
    virtual bool midiOutputClose()=0;
    virtual bool midiOutputIsOpen()=0;
    virtual int midiOutputRead(signed char *buffer, int size)=0;
    virtual int getAudioSampleFramesPerSecond()=0;
    virtual bool setAudioSampleFramesPerSecond(int value)=0;
    virtual int getAudioSampleFramesPerControlSample()=0;
    virtual bool setAudioSampleFramesPerControlSample(int value)=0;
    virtual int getAudioInputChannelCount()=0;
    virtual bool setAudioInputChannelCount(int value)=0;
    virtual bool audioInputOpen(AbstractMonitor *monitor)=0;
    virtual bool audioInputClose()=0;
    virtual bool audioInputIsOpen()=0;
    virtual int audioInputWrite(float *audioInputBuffer, int count)=0;
    virtual int getAudioOutputChannelCount()=0;
    virtual bool setAudioOutputChannelCount(int value)=0;
    virtual bool audioOutputOpen(AbstractMonitor *monitor)=0;
    virtual bool audioOutputClose()=0;
    virtual bool audioOutputIsOpen()=0;
    virtual int audioOutputRead(float *audioOutputBuffer, int count)=0;
};

#endif	//	Csound_h

The AbstractMonitor thing is modeled on Java monitors and can be a proxy for
them, but can be implemented on any operating system; it's what allows the
Csound engine to synchronize input and output streams when running in a VST
or ActiveX plugin.

The csound executables, and specific implementations of AbstractMonitor, are
written separately for each platform and contain all system calls for
feeding data into and out of the Csound library calls.

The Csound library also can easily be given an ActiveX and JavaSound
interface with only the thinnest layer of glue, and the library is designed
to make it as easy as possible to write VST or ActiveX plugins using Csound
as an engine. I got line events and JavaSound MIDI input working using this
approach, but my code was diverging so far from the canonical sources I had
nightmares about what would happen to me if I tried to keep maintaining it.

Making these changes not only would enable Csound to become all kinds of
useful goodies, it would also make Csound itself much easier to maintain and
develop further, because the "engine" would be identical on all platforms.

Based on experience with AXCsound, I estimate that producing the "engine"
library is about 2 months of part-time work for one person, and each
platform's "consound" executable about 1 or 2 weeks of part-time work for
one person.


--

Date: 1999-11-05 22:00
From: Paul Barton-Davis
Subject: Re: [Csnd] [CUD] line event problem
[ ... michael's csound object elided ... ]

why is MIDI and Audio part of the interface for your Csound class ?
wouldn't they be better as one or more separate objects ?

i consider it really wrong that MIDI exists at the core of Csound, but i
have to admit that i haven't expunged it altogether from
Quasimodo. however, IMHO, it belongs only in plugins that interface
to/from MIDI and send/receive non-MIDI events from the engine. i don't
think that anything within the engine should even think about MIDI.

BTW Michael - do you use libsigc++ ? if you haven't taken a look at
it, its really worthwhile. est and myself would both lobby for its
inclusion into the next definition of the C++ language itself, if we
honestly thought there was any chance of it coming to pass. its just
totally sublime as a way of connecting objects without violating
object independence.

--p
--

Date: 1999-11-06 03:14
From: Michael Gogins
Subject: Re: [Csnd] [CUD] line event problem
Thanks for your response.

Why are MIDI and audio part of my Csound class? Actually, there ARE separate
MIDI and audio objects feeding data into and out of my Csound class, I just
didn't show them. What I showed is simply the lowest-level interface where
the data does somehow have to get into and out of Csound. In some cases the
objects would be quite simple, a single loop to feed audio into or out of a
file, in other cases (DirectX, VST, JavaSound) there would be separate
threads, extra buffering, signals and events, etc. But that would all be at
the operating system level, not the "Csound kernel" level.

You are right that it is debatable whether MIDI should exist at the core of
Csound. It would be better if it did not. That, however, would require a
fundamental redesign of Csound in which regular score events could
optionally have "off" events and associated "control" events, so that an
external MIDI driver could produce score "on" events and score "off" events.
But if there is no fundamental redesign, then why not bring MIDI data in?
The opcodes are already there, and it is simpler that way.
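To make the redesign idea concrete: with explicit "on"/"off" phases, an external MIDI driver could emit an "on" event at note-on and a matching "off" at note-off, with no duration known in advance. A sketch (all names here — ScoreEvent, translateMidi — are illustrative, not actual Csound internals):

```cpp
// Sketch: translating raw MIDI note messages into paired on/off score events.
struct ScoreEvent {
    enum Kind { On, Off } kind;
    int instr;      // channel + 1, following the usual MIDI-to-instr mapping
    int tag;        // key number, used to pair an Off with its On
    double velocity;
};

// status 0x9n = note-on, 0x8n = note-off; a velocity-0 note-on counts as off.
inline bool translateMidi(const unsigned char msg[3], ScoreEvent &out) {
    const unsigned char status = msg[0] & 0xF0;
    const int instr = (msg[0] & 0x0F) + 1;
    if (status == 0x90 && msg[2] > 0)
        out = ScoreEvent{ScoreEvent::On, instr, msg[1], double(msg[2])};
    else if (status == 0x80 || (status == 0x90 && msg[2] == 0))
        out = ScoreEvent{ScoreEvent::Off, instr, msg[1], 0.0};
    else
        return false;           // not a note message
    return true;
}
```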

I'm not familiar with libsigc++. Where does it live?



--

Date: 1999-11-06 05:28
From: Paul Barton-Davis
Subject: Re: [Csnd] [CUD] line event problem
>Why are MIDI and audio part of my Csound class? Actually, there ARE separate
>MIDI and audio objects feeding data into and out of my Csound class, I just
>didn't show them. What I showed is simply the lowest-level interface where
>the data does somehow have to get into and out of Csound. 

your API shows the Csound object opening and closing MIDI and audio
devices. this is the part that seems wrong to me. or perhaps i'm
misinterpreting the function names ?

>You are right that it is debatable whether MIDI should exist at the core of
>Csound. It would be better if it did not. That, however, would require a
>fundamental redesign of Csound in which regular score events could
>optionally have "off" events and associated "control" events, so that an
>external MIDI driver could produce score "on" events and score "off" events.
>But if there is no fundamental redesign, then why not bring MIDI data in,
>the opcodes are already there and it is simpler that way.

right, the opcodes are already there. i just think that the MIDI data
should stop right at the opcodes. but you're right: as long as Csound
scores are based on the idea that note duration is known ahead of
time, MIDI has to exist in its own world, penetrating deep into the
core. 

>I'm not familiar with libsigc++. Where does it live?

http://www.ece.ucdavis.edu/~kenelson/libsigc++/

this is the new C++ "signal" system at the heart of the next version
of Gtk--

--

Date: 1999-11-06 16:55
From: Michael Gogins
Subject: Re: [Csnd] [CUD] line event problem
The open and close functions are for opening and closing the Csound side of
these interfaces - not for opening and closing the operating system devices.
This relates to synchronizing streams and so on. The client (the actual
Csound application) of the Csound object would (a) open an operating system
device, typically in an independent thread, (b) create an AbstractMonitor
object using OS synchronization primitives, (c) open the Csound object side
of the interface, (d) run the Csound object. The IO threads synchronize the
buffering of data between the OS devices and the Csound object by signaling
the monitor.

Thanks for the libsigc++ URL. This looks like useful stuff for writing the
OS device handlers themselves, but is not required in the C++ class for
Csound I am discussing here.


--