
[Csnd] Csound-based DATA Mapping and Sonification Models

Date: 2025-08-06 12:40
From: "Dr. Richard Boulanger"
Subject: [Csnd] Csound-based DATA Mapping and Sonification Models
Hello Csounders,

I am preparing new pieces, possibly in the MetaVerse with CsoundUnity, for the next ICSC and, if everything falls into place, for a premiere in Krakow, Poland, in late November.

The project involves real-time sonification (mapping) of brainwave, bio-sensor, and other sensor data, as well as non-real-time sonification of brainwave and sensor data collected in spreadsheets. Finally, I am also working with real-time and non-real-time data from water sensors and other sources (including data gathered online).

Do any of you have Csound-based work along these lines, whether created yourselves, collected over the years, or studied, that you might share with me?

Perhaps there are some links, readings, or references that you might suggest I study, too.

Thank you in advance for your suggestions, recommendations, and inspiration.

- Dr.B


Dr. Richard Boulanger

Professor

Electronic Production and Design

Berklee College of Music

Professional Writing & Technology Division

Csound mailing list: Csound@listserv.heanet.ie (https://listserv.heanet.ie/cgi-bin/wa?A0=CSOUND). Send bug reports to https://github.com/csound/csound/issues; discussions of bugs and features can be posted there.

Date: 2025-08-06 13:15
From: Michael Rhoades
Subject: Re: [Csnd] Csound-based DATA Mapping and Sonification Models
Hello Dr. B.

Your work sounds fascinating. Congrats!

At the Csound 2013 conference at Berklee, I presented a talk about several Large Hadron Collider sonifications I composed. I do not know if they would be of any assistance to you, but below are links to the paper and to the sound files.

https://perceptionfactory.com/writing/lhc/
https://perceptionfactory.com/hadronized-spectra-2013/

I am currently working on a sonification project using Max (sorry). It involves mapping lists of 3D pixel values, generated in real time at 60 fps by an Intel RealSense depth camera, onto several variables in a granular synth that I created. It is quite challenging in that the data is not stable: since the camera is constantly attempting to triangulate the depth values, the pixels are moving targets. The goal is to synchronize the music to imagery that is also being generated from the depth camera data.
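
As a rough illustration of this kind of mapping outside Max, the Python sketch below exponentially smooths each incoming depth frame to tame the jitter, then maps two summary statistics of the smoothed frame onto grain density and grain duration. The resolution, parameter ranges, and smoothing factor are invented for illustration and are not taken from the actual patch; the camera input is simulated with random frames.

import numpy as np

ALPHA = 0.2  # smoothing factor: lower = steadier, higher = more responsive

def smooth(prev, frame, alpha=ALPHA):
    """Exponential moving average over successive depth frames."""
    return alpha * frame + (1.0 - alpha) * prev

def map_to_grain_params(depth_frame):
    """Map summary statistics of a depth frame onto (hypothetical) grain controls."""
    mean_d = float(np.nanmean(depth_frame))   # average distance in metres
    spread = float(np.nanstd(depth_frame))    # how scattered the scene is
    density = np.interp(mean_d, [0.3, 3.0], [80.0, 5.0])     # grains/sec: near = dense
    grain_dur = np.interp(spread, [0.0, 1.0], [0.01, 0.25])  # grain length in seconds
    return density, grain_dur

if __name__ == "__main__":
    state = np.full((240, 320), 1.5)          # stand-in for the first depth frame
    for _ in range(300):                      # roughly 5 seconds at 60 fps
        frame = 1.5 + 0.3 * np.random.randn(240, 320)   # fake noisy camera frame
        state = smooth(state, frame)
        density, grain_dur = map_to_grain_params(state)
        # here the values would be sent on to the granular synth (e.g. over OSC)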

I would love to hear what you are doing.

Good luck!

Michael


-- 
Dr. Michael Rhoades
http://www.perceptionfactory.com
https://soundcloud.com/michael-rhoades-1
https://www.youtube.com/@mrhoades56

Lead HCI Artist/Researcher/Instructor
Institute for Digital Intermedia Art
Ball State University
https://idialab.org/

Date: 2025-08-06 13:29
From: "Dr. Richard Boulanger"
Subject: Re: [Csnd] Csound-based DATA Mapping and Sonification Models
Hello Michael,

This work is awesome and inspiring.  Thank you for reminding me about it and sharing these links.
The paper is awesome, and the music is wonderful.  In heavy rotation now here in my studio!

Your new project sounds fascinating as well.
- Maybe you can use the csound~ or csound6~ object in your Max Patch ;)

I will keep you posted as my new work develops.

Take care.

- Dr.B


Dr. Richard Boulanger

Professor

Electronic Production and Design

Berklee College of Music

Professional Writing & Technology Division




Date: 2025-08-07 13:04
From: Oeyvind Brandtsegg
Subject: Re: [Csnd] Csound-based DATA Mapping and Sonification Models
Hi, it is rather old (2013), but I have the VLBI Music, based on data from the radio antennas in the Very Long Baseline Interferometry network.
They are used to measure distant quasars in order to determine our own position in space, and serve as a reference for calibrating GPS and other location and mapping systems. I have a page on it here: https://www.researchcatalogue.net/view/55360/55361

Recently, I did another mapping project, looking at larvae from cold-water corals and sea anemones, in connection with the project SHORES (https://shores.pt).
I have a Csound-only version, and also a version where I combine Csound-synthesized sounds with the pipe organs (controlled over OSC) in Orgelpark, Amsterdam.
I will post about it, with code etc., but right now I have many moving parts in connection with the start of the semester ;-)
Generally, the technical process is almost always to use Python for reading and massaging the data, then writing Csound scores or sending real-time OSC to Csound.
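
As a minimal sketch of that Python-to-Csound workflow, the script below reads one numeric column from a CSV export of sensor data, writes a simple Csound score from it for offline rendering, and can alternatively stream the same values as real-time OSC messages. The file name, column name, frequency mapping, port, and OSC address are all hypothetical.

import csv
import time

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

def read_column(path, column):
    """Read one numeric column from a CSV export of the sensor data."""
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

def write_score(values, path, dur=0.25):
    """Non-real-time: one 'i' statement per sample, value mapped to pitch in Hz."""
    lo, hi = min(values), max(values)
    with open(path, "w") as f:
        for k, v in enumerate(values):
            norm = (v - lo) / (hi - lo or 1.0)
            freq = 200.0 + norm * 800.0          # map 0..1 onto 200..1000 Hz
            f.write(f"i 1 {k * dur:.3f} {dur} {freq:.2f}\n")

def stream_osc(values, rate_hz=10):
    """Real-time: send each value to Csound on the /sensor OSC address."""
    client = SimpleUDPClient("127.0.0.1", 7770)
    for v in values:
        client.send_message("/sensor", float(v))
        time.sleep(1.0 / rate_hz)

if __name__ == "__main__":
    data = read_column("sensors.csv", "water_temp")  # hypothetical file/column
    write_score(data, "sensors.sco")                 # offline rendering
    # stream_osc(data)                               # or live, over UDP port 7770

On the Csound side, an instrument would either render the generated .sco directly or pick up the /sensor messages with OSCinit/OSClisten.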

all best
Øyvind



Date: 2025-08-07 13:26
From: "Dr. Richard Boulanger"
Subject: Re: [Csnd] Csound-based DATA Mapping and Sonification Models
Thank you, Øyvind!

Your great Csound work is such an inspiration!

Dr. Richard Boulanger
Professor
Electronic Production and Design
Berklee College of Music
