Sensory substitution devices (SSDs) let users recognize an image without seeing it by transforming the visual information into audio or touch signals. Yet few people in the blind community actually use such devices, because they are cumbersome and unpleasant to use.
That may change with EyeMusic, which transmits shape and color information through a composition of musical tones called "soundscapes." EyeMusic was developed by senior investigator Prof. Amir Amedi and colleagues at the Hebrew University. It scans an image and uses musical pitch to represent the vertical location of pixels: the higher a pixel sits in the image, the higher the pitch of the musical note associated with it. Timing indicates horizontal location. Notes played closer to the opening cue represent the left side of the image, while notes played later in the sequence represent the right side.
Additionally, color information, which most SSDs cannot convey, is encoded by assigning a different musical instrument to each color: white (vocals), blue (trumpet), red (reggae organ), green (synthesized reed), yellow (violin); black is represented by silence.
Using the EyeMusic SSD, both blind and blindfolded sighted participants were able to correctly identify a variety of basic shapes and colors after as little as 2-3 hours of training.
While this new study shows that EyeMusic can enable the visually impaired to extract shape and color information from auditory soundscapes of objects, the researchers believe the device also holds great promise for the field of visual rehabilitation in general. The added color information can help with object recognition and scene segmentation, while the pleasant soundscapes make prolonged use more feasible.
"There is evidence suggesting that the brain is organized as a task-machine and not as a sensory machine. This strengthens the view that SSDs can be useful for visual rehabilitation, and therefore we suggest that the time may be ripe for turning part of the SSD spotlight back on practical visual rehabilitation," Prof. Amedi adds. "In the future, it would be intriguing to test whether the use of naturalistic sounds, like music and human voice, can facilitate learning and brain processing relying on the developed neural networks for music and human voice processing."
Additionally, the researchers hope the EyeMusic can become a tool for future neuroscience research. "It would be intriguing to explore the plastic changes associated with learning to decode color information for auditory timbre in the congenitally blind, who never experience color in their life. The utilization of the EyeMusic and its added color information in the field of neuroscience could facilitate exploring several questions in the blind with the potential to expand our understanding of brain organization in general," concludes Prof. Amedi.