Re: [Csnd] From gesture to sound
Date: 2024-11-08 16:36
From: Enrico Francioni <00000005323c8739-dmarc-request@LISTSERV.HEANET.IE>
Subject: Re: [Csnd] From gesture to sound
Correction: headband = the bow

Csound mailing list
Csound@listserv.heanet.ie
https://listserv.heanet.ie/cgi-bin/wa?A0=CSOUND
Send bug reports to https://github.com/csound/csound/issues
Discussions of bugs and features can be posted here
Date: 2024-11-08 17:07
From: Giovanni Bedetti
Subject: Re: [Csnd] From gesture to sound
Hi Enrico,

Some years ago I used Kinect cameras to track human bodies very easily. They are now out of production, but there are alternatives available, such as the new ORBBEC 3D camera.

I used Unity and a very cheap package, Kinect v2 Examples with MS-SDK. The recent camera I mentioned above should work with this package. I had very limited knowledge of Unity at the time, and it was still very easy to get started. The advantage of using Unity is that you get a fully integrated Csound wrapper you can use, CsoundUnity.

Of course there are other approaches using open-source tools like Processing. Before buying a Kinect v2 and using it with Unity, I did lots of experiments with a Kinect v1 in Processing, and that was pretty easy too. I think I used this package: OpenKinect-for-Processing.

Using Arduino would be a lot harder, I guess, as you would need to do the wiring yourself and have a solid grounding in electronics.

Hope this helps,
gio
Date: 2024-11-08 17:12
From: Shane Byrne
Subject: Re: [Csnd] From gesture to sound
Hi Enrico,

I have used the Kinect in the past to interact with Csound via Processing. I read the incoming skeletal data with OpenNI (I think) and sent those values to Csound via OSC. The key for me, in terms of workflow, was to do all the normalisation and mapping in Processing, so that when I received the values in Csound I could put my composer hat on and forget about the parsing. Here's an example of one of the outcomes: https://www.shanebyrne.xyz/conatus

That was all with the Kinect v1, though; you need to be a Windows user to access the SDK for the Kinect v2, AFAIK. There was an unreleased LEAP Motion library that a friend of mine developed years ago, which I made use of in the past to capture physical gestures. I can dig it out if you'd like.

I have also used the Intel Edison (no longer supported) and the ESP8266 to send IMU sensor data via OSC to Csound for a number of projects. Here's an early example that used an IMU to measure wave motion and trigger a generative engine in Csound: https://www.shanebyrne.xyz/soundwaves

I'm happy to elaborate on the mechanics of these projects if you think it might help :)

Shane.
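The "normalise and map before sending over OSC" workflow Shane describes can be sketched in a few lines. The sketch below uses Python rather than Processing, and the tracker range, target frequency range, and OSC address are illustrative placeholders, not values from his actual projects:

```python
# Minimal sketch of normalising a raw tracker value and mapping it to a
# musical range before it ever reaches Csound. The 0..480 pixel range and
# the 110..880 Hz target are assumed example values, not Shane's settings.

def normalise(value, lo, hi):
    """Scale a raw sensor reading into 0..1, clamped to that range."""
    x = (value - lo) / (hi - lo)
    return max(0.0, min(1.0, x))

def map_range(norm, out_lo, out_hi):
    """Map a 0..1 value onto a musically useful output range."""
    return out_lo + norm * (out_hi - out_lo)

# e.g. a hand-height coordinate from the skeleton tracker
hand_y = 120
norm = normalise(hand_y, 0, 480)      # 0.25
freq = map_range(norm, 110.0, 880.0)  # 302.5

# A real Processing sketch would now send `freq` as an OSC message
# (e.g. to a hypothetical address like "/gesture/freq"), so that the
# Csound side only ever receives ready-to-use parameter values.
```

Doing this scaling on the sending side is what lets the Csound instrument stay a plain synthesis patch, with no knowledge of pixel coordinates or sensor ranges.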