The research project centers on the interplay between rhythm and brain function and the healing power of music for mental health, a life-long focus of Hart's. Eye Vapor became involved in the project to visualize this interplay, using TouchDesigner to produce real-time visualization and sonification of brain activity from live data streamed by an Emotiv headset.
Eye Vapor's TouchDesigner Performance Interface
Mickey Hart has long believed in the healing and regenerative power of music and asks: "What if in making music we were able to select and refine the rhythms, instruments and amplitude of that music to target and heal neural disorders?" "Breaking the rhythm code," as Hart describes this work, is his holy grail. He believes that knowing how rhythm affects the human brain will enable us to control and apply it medicinally, therapeutically and for diagnostic purposes - for instance, to reconnect synapses broken by Parkinson's and Alzheimer's diseases. In short, the aim of this research and the ensuing visualization is to have medical science recognize and embrace the healing powers of music - to bring "the power of rhythm to neuroscience," as Hart says.
The research findings were revealed this past September during a live event at the AARP convention in New Orleans. UCSF cognitive neuroscientist Dr. Adam Gazzaley displayed the legendary drummer’s "brain on drums" as Hart's brain waves were visualized and projected onto a screen while he led the audience in a 1,000-piece drum circle.
"It all comes down to the vibrations and rhythm of things and how they interact, and now we have the real science," Hart says of his collaboration with Dr. Gazzaley. "Before, it was anecdotal. But every musician knows it works. When you get off the stage and your consciousness is elevated, there's a whole different kind of priorities in your body. We've never really been able to see the brain on music, this is a handshake between science and art."
We spoke to Eye Vapor about their involvement in this project but before continuing, watch the video above where Hart and Gazzaley give a demonstration of the system at AARP.
Derivative: Jeff, you’ve been working with TouchDesigner for a long time and before that Houdini. Can you tell us how this all began?
Jeff Smith: I started very early in the life of TouchDesigner as a beta tester in 2001. I was introduced to the software by Greg (Hermanovic), whom I got to know through working with Houdini in the late 90's, and we realized we had the same love of real-time performance graphics. At that time I was working in the video game business, so TouchDesigner was a natural fit for doing real-time art. Eye Vapor is a partnership between myself and Russ Haines. John Fesenko (who also works at UCSF) and Matt Wachter have worked on quite a few projects with us and are part of the Eye Vapor team. Eye Vapor was officially formed in June 2011 but has been in operation informally since fall of 2008.
D: How did you become involved in this project and how did this project come about?
Eye Vapor: The project started out as a collaboration between Mickey Hart and Adam Gazzaley on how rhythms in music and rhythms in the brain are not only similar but that music rhythms can influence brain rhythms and be healing.
John Fesenko is employed in research at UCSF and brought the Eye Vapor team in to do some real time visualization and sonification of brain activity. As the project progressed it became clear to Adam that we were going to deliver a quality of visualization never seen before in the neurological research industry and thus he and Mickey added a segment into their AARP talk where Adam presented a scan of Mickey's brain with live data coming from the Emotiv headset and being visualized on the model of his brain.
D: Can you perhaps clarify what "sonification” means in the context of brain activity?
Eye Vapor: Technically, sonification refers to using sound as a way to perceive and evaluate data in ways that would not be possible, or as effective, using visuals or other representations. So at a basic level you would hear data as different frequencies and pitches. We are using the term a bit more loosely and expanding it to include music created by brain waves.
EEG has been used as a source signal for sonification for quite a while -- going back to the 60's -- but this seems to be the first time that sonification has been coupled with real-time visualization of brain activity in front of a live audience.
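To make the basic idea concrete, a minimal sonification might map each normalized data value to a pitch and render a short tone. The sketch below is our own illustration in Python with NumPy - the frequency range and note duration are arbitrary choices, not Eye Vapor's code - and uses an exponential mapping so equal data steps sound like equal musical intervals:

```python
import numpy as np

def sonify(values, sr=44100, dur=0.2, f_lo=220.0, f_hi=880.0):
    """Map each normalized data value (0..1) to a pitch and render a short sine tone."""
    tones = []
    for v in values:
        # exponential mapping: equal data steps -> equal musical intervals
        freq = f_lo * (f_hi / f_lo) ** v
        t = np.arange(int(sr * dur)) / sr
        tones.append(np.sin(2 * np.pi * freq * t))
    return np.concatenate(tones)

# three data points rendered as a rising two-octave sweep
audio = sonify([0.0, 0.5, 1.0])
```

In a real pipeline the rendered buffer would be streamed to an audio device rather than returned as an array.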
D: Well what we can see in these videos is quite remarkable! What was the process - how did you accomplish this and how was TouchDesigner used?
Eye Vapor: One thing to keep in mind is that we look at TouchDesigner as a potential research tool as well as a means to create cool-looking demos. How this all takes shape is still wide open, but it's definitely on the radar. TouchDesigner was used here to receive the raw data from the Emotiv headset, process the data by dividing it into the various brain frequency bands (theta, gamma, etc.), and then drive a live visualization consisting of 70 channels of data.
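The band-splitting step can be sketched outside TouchDesigner as well. The NumPy snippet below is an illustrative stand-in for the band-pass filters used in the project - it masks the FFT of one raw channel rather than filtering in the time domain, and the band edges and 128 Hz sample rate are our own assumptions, not taken from Eye Vapor's patch:

```python
import numpy as np

# conventional EEG band edges in Hz (illustrative)
BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 12.0),
         "beta": (12.0, 30.0), "gamma": (30.0, 45.0)}

def split_bands(raw, fs=128.0):
    """Split one raw EEG channel into the classic bands by masking its spectrum."""
    spectrum = np.fft.rfft(raw)
    freqs = np.fft.rfftfreq(len(raw), d=1.0 / fs)
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        # zero everything outside the band, then transform back
        out[name] = np.fft.irfft(spectrum * mask, n=len(raw))
    return out
```

A proper implementation would use causal band-pass filters so each band can be updated sample-by-sample as live data arrives.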
A scan of Mickey Hart's brain was used as the basis for the visualization. Within TouchDesigner we split up the various regions of the brain corresponding to the sensor locations. This was done using CHOPs to find points that were close to the sensors and a Point SOP to drop the colors back into the correct region. The brain is rendered using a custom GLSL shader; the live visualization coloring is done in a separate render and composited. There are three methods we are using for sonification: the first drives a standard MIDI player for the piano sound, the second drives Ableton Live with the data, and the third sends OSC data to SuperCollider for a sonification created by Mark Ballora of Penn State.
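The region-splitting step described above - finding the mesh points closest to each sensor and dropping that sensor's color back onto them - amounts to a nearest-neighbour lookup. A minimal NumPy sketch of the idea (our own illustration, not the actual TouchDesigner network):

```python
import numpy as np

def color_by_nearest_sensor(verts, sensors, sensor_vals):
    """verts: (N, 3) brain-mesh points; sensors: (M, 3) electrode positions;
    sensor_vals: (M,) live activity per electrode.
    Returns a (N,) per-vertex value taken from each vertex's nearest sensor."""
    # distance from every vertex to every sensor, then pick the closest
    d = np.linalg.norm(verts[:, None, :] - sensors[None, :, :], axis=2)
    nearest = np.argmin(d, axis=1)
    return sensor_vals[nearest]
```

In the real setup the returned per-vertex values would feed the shader's color lookup each frame as the headset data updates.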
In the above video, piano notes are triggered by the activity of specific EEG frequency bands reaching a given threshold. Notes can be played chromatically or in any scale.
In the above video, sounds are triggered by the activity of specific EEG frequency bands reaching a given threshold. Sounds are then modulated using the raw data as a source, and the rhythm and sonic placement of each sound is controlled via binaural oscillations.
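The threshold-and-scale logic described for these videos can be sketched as a small function: when a band's activity crosses its threshold a note fires, and the amount above threshold picks a degree of the chosen scale. This Python sketch is our own illustration of the idea - the scaling factor and MIDI base note are arbitrary - not the project's code:

```python
# C major scale degrees as semitone offsets from the root
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]

def band_to_note(level, threshold, base=60, scale=C_MAJOR):
    """Fire a MIDI note when a band's activity crosses its threshold;
    the excess above threshold selects a scale degree (capped at two octaves)."""
    if level < threshold:
        return None  # band is quiet: no note this frame
    degree = min(int((level - threshold) * 10), len(scale) * 2 - 1)
    octave, step = divmod(degree, len(scale))
    return base + 12 * octave + scale[step]
```

Passing a chromatic scale (`list(range(12))`) instead of `C_MAJOR` gives the chromatic mode mentioned above.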
D: Was there any prior experience you drew upon to accomplish what you did?
Eye Vapor: Well, yes, lots of prior experience allowed us to do this.
First was that John Fesenko works in the neurology department at UCSF and has had extensive experience with brain research and software development. John created a set of band pass filters in TouchDesigner to process the raw brain waves. He also wrote a plugin for TouchDesigner that took raw data from the Emotiv headset.
From there, Matt Wachter and I have experience with a variety of clients that allowed us to create a nice-looking and responsive visualization. In particular we've done a lot of work with live audio visualization and performance art.
This is just the start of a long-term project with UCSF, Mickey, Adam and my team to design and create live brain activity visualization for research, consumer and entertainment markets. NVIDIA is quite interested in the project and wants to feature it at a supercomputing conference in March, as well as help us optimize performance with some custom CUDA coding. Also, Mickey Hart plans to feature live visualization and sonification in his upcoming shows.
In the above video audio is being created by modulating the pitch of synthesizers in Ableton using the activity level of specific EEG frequency bands.
D: John, you have all this data, numbers... what degree of scientific accuracy are you able to attain right now?
John Fesenko: Right now our visualization can only be considered scientifically accurate insofar as we used a brain model acquired through an actual scan of Mickey's brain, and represented real time EEG activity at the level of the individual electrodes on the scalp surface. Good for a demo, but not so useful for advancing science. The ultimate goal here is to use the electrode signals to study the actual activity in the brain in real time, or close enough to real time to allow for feedback studies.
This process, called EEG source localization (a form of highly complex triangulation used to determine brain-surface activity), takes many minutes or even hours, so this information is currently only available offline. This is where NVIDIA comes in. With the help of CUDA-enabled processing we believe we can approach real-time monitoring of brain activity - not just as it appears at the scalp surface, but within specific neural networks - opening a window onto brain function in real time, something that has been impossible thus far.
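As a rough illustration of what source localization involves, one textbook approach is the regularized minimum-norm estimate, which inverts the underdetermined mapping from many candidate brain sources to a few scalp electrodes. The NumPy sketch below shows that formula only - it is not UCSF's pipeline, and a realistic solver over a full head model is precisely the part that calls for GPU acceleration:

```python
import numpy as np

def minimum_norm_estimate(leadfield, scalp, lam=1e-2):
    """Tikhonov-regularized minimum-norm inverse:
        s = L^T (L L^T + lam * I)^(-1) x
    maps M scalp readings back onto N candidate source amplitudes."""
    L = leadfield                                  # (M, N) forward model
    G = L @ L.T + lam * np.eye(L.shape[0])         # regularized Gram matrix
    return L.T @ np.linalg.solve(G, scalp)         # (N,) source estimate
```

With a small regularization term, re-projecting the estimate through the forward model reproduces the scalp readings almost exactly; larger `lam` trades that fidelity for robustness to sensor noise.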
D: As someone involved in scientific work, what would you like to learn, and perhaps achieve, using visual tools like TouchDesigner?
John Fesenko: As these types of real time feedback studies become more feasible, scientists will require a visualization tool that can render complex datasets as they change during an experiment, as well as provide an aesthetically pleasing scene for the experimental subject to interact with. Right now visualization applications designed for scientific use do not provide the real time interactivity needed for feedback studies. TouchDesigner, however, is flexible enough to provide both functions of real-time scientific visualization and user interaction and it will be interesting to see if its capabilities can be leveraged for such use in an integrated package.
We'd like to thank Jeff, John and Matt for talking with us and congratulate the team on their exciting work. For those interested in learning more about this fascinating project, we recommend a wonderful interview with Hart and Dr. Gazzaley at the Huffington Post and the following press release written by John Fesenko for Eye Vapor.
Dr. Adam Gazzaley, a neuroscientist at UCSF, has spearheaded a new project which will, for the first time, utilize immersive digital environments coupled with motion capture and streaming physiological data (EEG, skin conductivity, etc.) to enhance the study of attention and memory during normal aging and dementia. The use of total immersion as a research tool for advancing cognitive neuroscience is a very exciting development, and will likely provide opportunities to answer questions about brain activity and behavior in a controlled setting that can approach the complexity of how patients interact with their day to day environments.
As a proof of concept for the project, Dr. Gazzaley's team has created a demo that incorporates wireless EEG as well as sonification and visualization of the streaming EEG data in real time. With an eye to the future possibilities that an immersive space provides for the inclusion of not only scientifically accurate but aesthetically pleasing and engaging feedback, the quality of the demo goes beyond the usual representations of EEG activity to include visual elements hitherto associated primarily with video game production, as well as several sonification strategies that explore the often competing goals of realistic representation and listener engagement.
Both visualization and sonification are provided in real time using streaming input - a clear requirement for any future feedback studies - accomplished in collaboration with Jeff Smith, the founder of Eye Vapor, a real-time visual effects company in LA. Furthermore, though the crux of the research would take place on site within the new neurosciences building at UCSF, the demo was designed to run on consumer-grade hardware, opening up the enticing possibility of in-home use by research subjects, especially those with compromised mobility. The demo was unveiled at the recent AARP convention in New Orleans as part of a joint discussion and performance piece with Mickey Hart of the Grateful Dead, who has taken an interest in the project as a means to study the positive effects of rhythm and music on brain function.