
[Csnd] From gesture to sound

Date: 2024-11-08 16:29
From: Enrico Francioni <00000005323c8739-dmarc-request@LISTSERV.HEANET.IE>
Subject: [Csnd] From gesture to sound
Hello everyone!
I'd like to raise the topic of gesture and sound, in the broadest sense (360 degrees).

I was wondering what the possible ways might be to make the (physical) gestures of a performer interact, directly or indirectly, with Csound.

Let me explain: with what tools could I capture the gestures of my body (of the arms, for example, or via a headband, since I am a double bassist), turn them into data, and then use that data to control sound parameters within the Csound code, or to do something else in the .csd?

I have seen a few examples, such as the fantastic performance by Richard Boulanger at the ICSC in Cagli in 2019, or the arm-mounted sensors for feedback control used by Øyvind Brandtsegg, plus an idea realized with Pd and Processing, and honestly not much else...

I read on this page: https://flossmanual.csound.com/other-communication/csound-and-arduino

that perhaps a first approach could be to put Csound in communication with Arduino through Processing... Later I read about the Csoundo library, recently developed by Rory Walsh, for communicating between Processing and Csound.

I have read several posts here on the list about sensors, and I wonder which path, already traveled by others, would in fact be worth taking again.

Thanks for any advice!
Enrico

Csound mailing list
Csound@listserv.heanet.ie
https://listserv.heanet.ie/cgi-bin/wa?A0=CSOUND
Send bug reports to
        https://github.com/csound/csound/issues
Discussions of bugs and features can be posted here

Date: 2024-11-08 18:58
From: Partev Sarkissian <0000060b2ef1338e-dmarc-request@LISTSERV.HEANET.IE>
Subject: Re: [Csnd] From gesture to sound

The Centre for Digital Music (C4DM) at Queen Mary University of London (QMUL)
has done interesting work in this area; it might be worth looking into and applying
it to a Csound project.


-Partev 



On Friday, November 8, 2024 at 04:30:13 PM GMT, Enrico Francioni <00000005323c8739-dmarc-request@listserv.heanet.ie> wrote:



Date: 2024-11-08 19:49
From: Lovre Bogdanić
Subject: Re: [Csnd] From gesture to sound
Hi Enrico! You could also try this with a computer-vision library like MediaPipe and its pose detection: https://github.com/google-ai-edge/mediapipe/blob/master/docs/solutions/pose.md

Here is a link where I posted an example of how to track hands/fingers using MediaPipe and control a Csound instrument with it: https://forum.csound.com/t/short-latency-sound-control/1802/3
You could build on that by tracking the full pose instead of just the hands and, of course, making a more fitting Csound instrument.

Best,
Lovre




On Fri, Nov 8, 2024 at 7:58 PM Partev Sarkissian <0000060b2ef1338e-dmarc-request@listserv.heanet.ie> wrote:


Date: 2024-11-11 14:11
From: Tetsuya Miwa
Subject: Re: [Csnd] From gesture to sound
Hi Enrico,

You don't need Processing or Pd for communication between Arduino and Csound.
With the opcodes arduinoRead/arduinoReadF, serial data from Arduino can easily be handled as k-rate values.
You can use any sensors with Arduino to capture the gestures.
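A minimal sketch of that setup follows. The serial-port name, baud rate, and stream number are assumptions (they depend on your system and on your Arduino sketch), and the Arduino must be running a companion sketch that speaks the protocol these opcodes expect, such as the one in the FLOSS manual chapter Enrico linked:

```csound
<CsoundSynthesizer>
<CsOptions>
-odac
</CsOptions>
<CsInstruments>
sr = 44100
ksmps = 32
nchnls = 2
0dbfs = 1

; open the serial port (device name and baud rate are system-specific)
giArd arduinoStart "/dev/ttyUSB0", 9600

instr 1
  kRaw arduinoRead giArd, 1          ; raw sensor value (0-1023) from stream 1
  kCps = 200 + kRaw * 800 / 1023     ; map it to a 200-1000 Hz pitch range
  kCps port kCps, 0.05               ; smooth the control signal
  aSig poscil 0.3, kCps
  outs aSig, aSig
endin
</CsInstruments>
<CsScore>
i 1 0 3600
</CsScore>
</CsoundSynthesizer>
```

The mapping to pitch is arbitrary; the same kRaw signal could just as well drive filter cutoff, amplitude, or any other parameter.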

Another easy option is to use the accelerometer and gyroscope of an iPhone with TouchOSC and Csound.
Check the template file "Sensors" in TouchOSC.
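On the Csound side, the TouchOSC route can be handled with the OSCinit/OSClisten opcodes. A sketch under stated assumptions: the port number must match TouchOSC's outgoing settings, /accxyz is the address TouchOSC uses for accelerometer data (check your version), and the mapping values are arbitrary:

```csound
giOSC OSCinit 8000                  ; must match TouchOSC's outgoing port

instr 1
  kx init 0
  ky init 0
  kz init 0
  ; receive three floats per accelerometer message
  kGot OSClisten giOSC, "/accxyz", "fff", kx, ky, kz
  ; map tilt to level and pitch
  aSig poscil 0.2 + 0.1 * abs(kz), 440 + 220 * kx
  outs aSig, aSig
endin
```

The kx/ky/kz variables must be initialized before OSClisten, which then overwrites them whenever a matching message arrives.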

Best,
Tetsuya

Date: 2024-11-11 16:13
From: Charles Berman <00000559a66aba27-dmarc-request@LISTSERV.HEANET.IE>
Subject: Re: [Csnd] From gesture to sound
If you are running on Intel/NVIDIA under Windows, you might be interested in trying NVIDIA's Maxine, which uses a regular video feed (a webcam or even stored video) for body and face tracking. It is not a complete Kinect replacement, but it does quite a bit.

I've been experimenting with it in out-of-the-box TouchDesigner (which is, IMO, a great platform), but it could be adapted to other platforms as well.

Some references: 

Hope this is helpful

On Nov 11, 2024, at 9:11 AM, Tetsuya Miwa <izc07036@NIFTY.COM> wrote:



Date: 2024-11-20 15:34
From: Andreas Bergsland
Subject: Re: [Csnd] From gesture to sound
Hi Enrico
Over the years I've been working on getting data from human gestures/movements (mostly dancers) and translating it into sound in different ways, including computer-vision techniques like Kinect, always using Csound for the translation into sound.
Lately I've mostly been using IMU sensors, basically the same as in your phone, which typically give you data on acceleration, angular velocity, and orientation. The NGIMU sensors I use send OSC messages over Wi-Fi, which makes them very easy to use. They are also low-latency (down to ca. 10 ms).
Here's a video from some recent experiments I did with a dancer and choreographer, using 5 NGIMU sensors and an 8-channel setup: https://youtu.be/EK7bkLWfuwY
The sounds are made from a five-second running buffer containing the raw sensor data, played back at different transpositions, plus some simple processing. Much of the time the different dimensions (x, y, z) were routed to separate channels (though you can't hear that in the stereo version online).
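A rough sketch of that running-buffer idea, for the curious. To be clear, this is a reconstruction, not Andreas's actual patch: a k-rate "sensor" stream (faked here with a slow LFO; in practice it would come from OSClisten or arduinoRead) is written into a roughly five-second table, while a faster read head scans the same table as an audio-rate signal:

```csound
sr = 48000
ksmps = 30                          ; kr = 1600, so 8192 points ~ 5 seconds
nchnls = 2
0dbfs = 1

giBuf ftgen 0, 0, 8192, 2, 0        ; zeroed circular buffer table

instr 1
  ; stand-in for a real sensor stream: a slow LFO
  kSensor poscil 1, 0.4
  ; write head: one pass through the table every ~5 seconds
  kWrite  phasor kr / ftlen(giBuf)
  tablew  kSensor, kWrite, giBuf, 1
  ; read head: scan the buffer ~100x faster -> audible playback
  aRead   phasor 100 * kr / ftlen(giBuf)
  aSig    table3 aRead, giBuf, 1
  outs aSig * 0.3, aSig * 0.3
endin
```

Changing the read head's speed ratio gives the different transpositions; running several read heads at once, each routed to its own output channel, would approximate the multichannel setup described above.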
Best, Andreas

On 08/11/2024, 17:30, Enrico Francioni <00000005323c8739-dmarc-request@LISTSERV.HEANET.IE> wrote:





