I’m collaborating with the Philosophical Psychology Lab at Gordon College, working with philosopher Brian Glenney on experiments with the vOICe, also known as the Seeing With Sound device. The tool uses a camera embedded in a pair of glasses or goggles and translates the camera’s feed into “readable” soundwaves through earphones—either assigning sounds to light and dark values or, in a modified version by lab member Zach Capalbo, to the basic color spectrum.
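For readers curious about what “translating light into sound” can mean in practice, here is a minimal sketch of one such mapping: scan an image column by column, let a pixel’s vertical position set the pitch of a sine component and its brightness set the loudness. The scan parameters and frequency range here are illustrative assumptions, not the vOICe’s actual settings.

```python
# Toy image-to-sound mapping in the spirit of sensory-substitution devices:
# each column of a grayscale image becomes one moment of sound; within a
# column, higher pixels map to higher pitches, brighter pixels to louder tones.
# Frequency range and scan duration below are assumed values for illustration.

import math

def column_to_components(column, f_low=500.0, f_high=5000.0):
    """Map one image column (brightness values 0.0-1.0, top row first)
    to a list of (frequency_hz, amplitude) pairs."""
    n = len(column)
    components = []
    for row, brightness in enumerate(column):
        if brightness <= 0.0:
            continue  # dark pixels stay silent
        # top of the image (row 0) gets the highest frequency
        frac = 1.0 - row / max(n - 1, 1)
        freq = f_low * (f_high / f_low) ** frac  # log-spaced pitch scale
        components.append((freq, brightness))
    return components

def render_column(column, duration=0.02, sample_rate=8000):
    """Synthesize one column as a short snippet of audio samples."""
    comps = column_to_components(column)
    samples = []
    for i in range(int(duration * sample_rate)):
        t = i / sample_rate
        samples.append(sum(a * math.sin(2 * math.pi * f * t) for f, a in comps))
    return samples
```

Scanning the columns in sequence supplies the time axis, so a bright diagonal stripe, for example, would come out as a rising or falling sweep of pitch.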
The device is designed with practical use in mind: as an adaptive tool for those who are blind. Glenney and his team are interested in its capacity to blur our normally discrete senses—activating the visual cortex with soundwaves, for instance. And we both became interested in the device as a tool for augmented reality: what kinds of experiences does it make possible that are otherwise absent from our typical sensory hierarchy?
After some initial training in translating the sounds, I learned to perform a basic light-dark search (as in the photo of me in this website’s masthead). Then I participated in a two-person search game. Not since childhood had I played this kind of hide-and-seek in the dark.
We each had a red-lit heart attached to our chest by a backpack strap. We could turn the lights off and on, listening for the location of our opponent and lunging for the heart-light once in proximity.
This was just one of the experiments that ultimately led to our exhibit of drawings and video, “I Never Asked to Be Made Human,” now on display at Gordon College. More documentation and project discussion to come.