
Beyond Sonification—lucid dreamer uses an EEG reader to turn dreams into images and soundscapes

Amateur neuroscience project transforms subconscious illusions into sight and sound

Colin Harrington is a conventional research scientist with unconventional dreams. His nightly visions are so intense that he wakes in the morning exhausted. Too often, his accounts of these strange bedtime experiences were dismissed with a shrug of "everyone has weird dreams." Seeking answers, Harrington went to a sleep clinic where, with his head haloed by an EEG reader, two technicians watched him lucid dream.

Project Beyond Sonification is Harrington's attempt to explain lucid dreams visually. In a lucid dream, the dreamer actively experiences and remembers complex, multisensory hallucinations, and can even exert some degree of control over them. Lucid dreaming is accompanied by increased activation in parts of the brain that sleep normally suppresses, meaning Harrington's brain during a dream is roughly as active as if he had stayed awake all night watching television.

The project lets Harrington create real-time cognitive and affective audiovisual compositions of his lucid dreams using a consumer EEG. While it is not yet possible to show someone your dreams, Harrington, acting as both project lead and primary subject, has worked out a way to turn his EEG readings into sounds and images representative of his dreaming state of mind.

Showcased at the most recent Maker Faire Bay Area, "Beyond Sonification" is a captivating exercise in amateur neuroscience. Beneath a tent-like dome, spectators watched Harrington sleep with an EEG reader on his head. During REM sleep, his mental state is translated into music and images projected onto the dome, with mapped responses triggered by events within the dream. Using the graphical programming language Max/MSP/Jitter, Harrington employs algorithmic models that detect emotion with up to 92.3% accuracy.
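Harrington's pipeline is built in Max/MSP/Jitter, but the core idea of reducing a consumer EEG's affect readings to a coarse mood label can be sketched in Python. The parameter names, weighting, and example values below are assumptions made for illustration, not the project's actual model.

```python
# Illustrative sketch only: reduces consumer-EEG affect readings to a coarse
# mood label. Parameter names, weights, and values are assumptions for this
# example; the actual project uses algorithmic models built in Max/MSP/Jitter.

def classify_mood(readings):
    """readings: dict of affect scores in [0, 1], e.g. from an EEG SDK."""
    excitement = readings.get("excitement", 0.0)
    engagement = readings.get("engagement", 0.0)
    meditation = readings.get("meditation", 0.0)
    anxiety    = readings.get("anxiety", 0.0)

    # Combine pairs of readings into coarse moods, mirroring the article's
    # subjective mapping: excitement + engagement -> happiness,
    # meditation + anxiety -> sadness, anxiety alone -> turbulence.
    scores = {
        "happy":   (excitement + engagement) / 2,
        "sad":     (meditation + anxiety) / 2,
        "anxious": anxiety,
    }
    return max(scores, key=scores.get)

# One frame of made-up readings from a dream segment:
frame = {"excitement": 0.7, "engagement": 0.8, "meditation": 0.2, "anxiety": 0.3}
print(classify_mood(frame))  # -> "happy"
```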
  
The project's music engine takes the emotional parameters the EEG reader detects (excitement, meditation, anxiety, engagement, relaxation, and focus) and turns them into notes in a musical composition. Harrington has built a number of programs on top of the reader's software that work with thought to create musical compositions and other expressionist pieces. Translating combinations of emotions takes some subjective leaps: excitement plus engagement is read as happiness, meditation plus anxiety as sadness, though the underlying mood readings themselves are clear. For happiness, the music engine stays in major scales; sadness translates into minor scales and descending lines; and a more turbulent mental state such as anxiety produces dissonant notes and an increased tempo.
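As a rough illustration of that mapping, here is a minimal Python sketch that turns a mood label into musical parameters and a short melody. The scales, tempos, MIDI note numbers, and dissonance rule are assumptions for the example, not the project's actual Max/MSP music engine.

```python
import random

# Illustrative mood-to-music mapping (MIDI note numbers). Scale choices,
# tempos, and the dissonance rule are assumptions for this sketch.
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]
C_MINOR = [60, 62, 63, 65, 67, 68, 70, 72]

def musical_parameters(mood):
    if mood == "happy":
        # Major scale, rising lines, moderate tempo.
        return {"scale": C_MAJOR, "direction": +1, "tempo_bpm": 110, "dissonance": 0.0}
    if mood == "sad":
        # Minor scale, descending lines, slower tempo.
        return {"scale": C_MINOR, "direction": -1, "tempo_bpm": 70, "dissonance": 0.0}
    # Anxious / turbulent: faster tempo and occasional out-of-scale notes.
    return {"scale": C_MINOR, "direction": +1, "tempo_bpm": 140, "dissonance": 0.3}

def next_note(prev_index, params):
    """Step through the scale in the mood's direction, clashing occasionally."""
    scale = params["scale"]
    index = max(0, min(len(scale) - 1, prev_index + params["direction"]))
    note = scale[index]
    if random.random() < params["dissonance"]:
        note += 1  # nudge a semitone off the scale for a dissonant clash
    return index, note

params = musical_parameters("sad")
idx, melody = 4, []
for _ in range(8):
    idx, note = next_note(idx, params)
    melody.append(note)
print(melody, params["tempo_bpm"], "bpm")
```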

Colin Petty uses an EEG to make music with his thoughts and emotions

“I absolutely feel that it is much easier to explain my sleep disorder to others now that I have a way to visualize it,” Harrington said of the project. Spectators at the Maker Faire could try a modified iteration of the project that visualized waking brain activity in a dark room using the EEG reader. Harrington said the Faire was an amazing opportunity to gather sample sets from a large group of people without conducting a formal study.

Sources: Make, Scientific American, Beyond Sonification
