sonification of data?

Neil Harbisson


When I see “data visualization,” I always think about numbers and charts. We make data more understandable for people to read and easier for them to get the information we want to provide. Over the weekend I found a TED talk, and it made me wonder whether we could have “data audiblization.”

The talk is about using extra senses, combining technology and human behavior to extend how we see, feel, hear, and even speak. For Neil Harbisson, hearing is not the problem; seeing is. He can see everything but colors. Neil lives in a black-and-white world, yet he is an artist and painter. Instead of using his eyes to distinguish and learn colors, he uses his ears.

For him, red is F, yellow is G, green is A…
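
Just to make that mapping concrete for myself, here is a tiny sketch of how one could bucket a hue into a note. Only red = F, yellow = G, and green = A come from the talk; the other notes and the exact hue boundaries are my own guesses.

```python
# Hypothetical sketch: bucket a hue angle (0-360 on the color wheel) into a note.
# Only red -> F, yellow -> G, green -> A come from the talk; the rest is invented.

def hue_to_note(hue_degrees: float) -> str:
    bands = [
        (30, "F"),   # reds
        (90, "G"),   # yellows
        (150, "A"),  # greens
        (210, "B"),  # cyans (guess)
        (270, "C"),  # blues (guess)
        (330, "D"),  # violets (guess)
        (360, "F"),  # wrapping back to red
    ]
    hue = hue_degrees % 360
    for upper, note in bands:
        if hue < upper:
            return note
    return "F"

print(hue_to_note(0))    # red    -> F
print(hue_to_note(60))   # yellow -> G
print(hue_to_note(120))  # green  -> A
```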

Neil found out he could not see color when he was 11. At first he refused to wear colors, but he realized it is hard to reject colors in everyday life. He went to art school, and colors became the mysterious, invisible elements he wanted to pursue.

In 2003, he got an electronic eye. It is attached to his head: an antenna loops over his head, and at its end is a little camera. The little camera is what looks at and recognizes colors. A chip installed in the back of Neil’s head converts the frequency of each color into sound, and he hears the colors through bone conduction.
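
I don’t know exactly how the chip does this conversion, but one simple way to picture it is to take the frequency of the light and keep dropping it by octaves (halving it) until it falls into the range our ears can hear. A rough sketch of that idea, with illustrative numbers only:

```python
# Rough sketch: convert a light wavelength to a frequency, then transpose it
# down by octaves until it is audible. Numbers are illustrative, not Neil's device.

SPEED_OF_LIGHT = 3.0e8  # meters per second

def color_to_audible_hz(wavelength_nm: float, max_audible_hz: float = 20_000) -> float:
    freq_hz = SPEED_OF_LIGHT / (wavelength_nm * 1e-9)  # frequency of the light
    while freq_hz > max_audible_hz:
        freq_hz /= 2  # drop one octave
    return freq_hz

print(round(color_to_audible_hz(650)))  # a red around 650 nm
print(round(color_to_audible_hz(530)))  # a green around 530 nm
```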

When Neil goes to an art gallery, he hears the paintings as if he were at a concert. When he goes to a supermarket, he feels like he is in a nightclub.

His perception of beauty is different from others’: he hears faces. Someone may look beautiful but sound terrible. To him, faces have sound portraits. Prince Charles “sounds” similar to Nicole Kidman when we compare their eyes. Two people you would probably never relate now have some sort of connection.

It is not just that colors become sounds; sounds can also be translated into colors in Neil’s mind. He can paint people’s voices. When music gets translated into colors, it becomes easier to compare different artists: it is easier to distinguish similar colors visually than to distinguish similar rhythms by ear.

Mozart’s pieces use a lot of G, which is yellow. Justin Bieber’s songs have a lot of E and G, which are pink and yellow. Artists share many similarities when they compose.

Justin Bieber's "Baby"
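
As a toy version of this kind of comparison, imagine tallying the notes in a melody and swapping each note for its color, so two songs can be set side by side as palettes. The melody below and most of the note-to-color table are made up; only F = red, G = yellow, A = green, and E = pink come from the talk.

```python
# Toy sketch: count the notes in a melody and report them as colors.
# Only F=red, G=yellow, A=green, E=pink come from the talk; the rest is invented.
from collections import Counter

NOTE_COLORS = {"C": "blue", "D": "violet", "E": "pink",
               "F": "red", "G": "yellow", "A": "green", "B": "cyan"}

def song_palette(notes: list[str]) -> dict[str, int]:
    counts = Counter(notes)
    return {NOTE_COLORS[note]: n for note, n in counts.items()}

melody = ["G", "E", "G", "G", "E", "A", "G"]  # an invented example melody
print(song_palette(melody))  # mostly yellow, with some pink and green
```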

Neil has improved his device to catch more colors than human eyes can. Now, to him, a good day or a bad day comes down to different sounds.

The development of technology and our daily behaviors have expanded the data we have about the humanities, from traditional statistical numbers to social network hashtags. Data interpretation has also shifted from simple charts to fancy animated line graphs. Before this video, I never thought about translating human faces into sounds. Now I’m thinking that Professor Manovich could probably add “sound” as a component to his Selfiecity project. Meanwhile, now that sounds can translate back into colors, I look at Justin Bieber’s songs very differently: they are pink.

2 thoughts on “sonification of data?”

  1. JULIANA SON

    I didn’t watch the TED talk, but I did listen to the podcast with Neil Harbisson on NPR on a scenic drive to Albany. Seeing the fall foliage during the drive made me wonder what sounds Neil would hear.
    You bring up a very interesting point about the visual vs other senses. The field of visual studies, as you may already be aware, recognizes/criticizes the Western tradition of treating the sense of seeing as the top of the hierarchy of sensory experience (the “noblest” of senses). So what would a nonvisual or multisensory history/retelling/reimagining look like? Neil’s electronic eye is an interesting way of answering this.
    In the podcast (not sure if it was covered in the TED talk), Neil also talks about feeling part human, part machine. (I think he actually called himself a cyborg.) There is definitely something sci-fi-y about Neil’s electronic eye. I’m still trying to figure out what Steve Jones would say about this.

  2. Mary Catherine Kinniburgh

    Thank you so much for sharing this. Particularly with the question of differently-abled bodies, and the types of bodies that DH privileges (and perhaps tech at large, too), de-privileging the “visual” in “visualization” and opening up that word seems to be a particularly useful and important exercise.

    As Juliana mentions, there are cultural reasons we privilege sight, and our ideas of the “nonvisual” definitely deserve expansion in the digital world. It’s also an interesting question why we privilege sight when so many people wear glasses–a disability, by definition, so commonplace that we don’t even consider it one anymore. And, as I’m sure is a reality for many of us, the days spent working at the computer in academia also gradually tire the eyes–I need glasses now, after years of perfect vision.

    I mention these somewhat anecdotal pieces of evidence to highlight the important social work that Neil’s electronic eye performs, by situating pieces of technology in direct relationship to the body. We are, of course, always in this relationship (as I type right now, my hands on the keys), but like sight, we’ve experienced it to the extent that it becomes, well, invisible.

    I’m also noticing how hard it is to talk about bodies and technology without these ideas of “visibility and invisibility,” which is perhaps a topic for later: what these ideas obscure or reveal…to pause for now: thanks for this read!
