First public test…
Code!
Here’s the code for the Arduino and Processing sketches. The Arduino sketch runs on the microcontroller sewn into the mask, and the Processing sketch runs on the computer, handling the input from the video camera. The Arduino code relies on two libraries: the first is the publicly available TimerOne library; the second is a new library written during Science Hack Day SF 2011 called SoftwareTone, which allows square wave tones to be generated on any (and all) Arduino digital pins in software. The SoftwareTone library will be uploaded to github shortly.
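Since SoftwareTone isn’t on github yet, here is a minimal sketch of the underlying idea: a single TimerOne interrupt that toggles several digital pins in software to produce square wave tones. The pin numbers, tick rate, and frequency are assumptions for illustration, not the actual SoftwareTone internals.

#include <TimerOne.h>

// Hypothetical subset of the 12 speaker pins, for illustration only.
const int NUM_SPEAKERS = 3;
const int speakerPins[NUM_SPEAKERS] = {3, 4, 5};

// Which speakers should currently be sounding.
volatile bool speakerOn[NUM_SPEAKERS] = {true, false, false};

// Current output level of each pin.
volatile bool pinState[NUM_SPEAKERS] = {false, false, false};

// With a 100 us timer tick, 25 ticks per half-period gives a square wave
// of roughly 200 Hz.
const unsigned int HALF_PERIOD_TICKS = 25;
volatile unsigned int tickCount[NUM_SPEAKERS] = {0, 0, 0};

// Timer interrupt: toggle each enabled pin whenever its half-period elapses.
void toneTick() {
  for (int i = 0; i < NUM_SPEAKERS; i++) {
    if (!speakerOn[i]) {
      digitalWrite(speakerPins[i], LOW);
      tickCount[i] = 0;
      continue;
    }
    if (++tickCount[i] >= HALF_PERIOD_TICKS) {
      tickCount[i] = 0;
      pinState[i] = !pinState[i];
      digitalWrite(speakerPins[i], pinState[i] ? HIGH : LOW);
    }
  }
}

void setup() {
  for (int i = 0; i < NUM_SPEAKERS; i++) {
    pinMode(speakerPins[i], OUTPUT);
  }
  Timer1.initialize(100);            // fire the interrupt every 100 microseconds
  Timer1.attachInterrupt(toneTick);
}

void loop() {
  // Tone generation happens entirely inside the timer interrupt.
}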
Process in pictures
Syneseizure! Unmasking reality!
Our consciousness does not contain reality – there is too much information in reality to handle; our brains would fry. Therefore, our various sensory cortexes have really complex data compression and processing algorithms to present us with the most useful information to help us kill antelopes, run away from lions and find attractive mates. This means we ignore a lot of reality. Perhaps we can see a different aspect of reality by listening to our eyeballs or looking through our nose.
Synesthesia is a condition in which one sensation (sight, hearing, etc.) gets mixed up with another. This can cause situations in which someone “smells” sounds or “sees” touches.
For this hack, we designed and built a full-head mask that allows the wearer to feel images in real time. The mask is arrayed with 12 speakers that contact the skin of the face. An image is captured with a webcam and converted into a 12-pixel black-and-white representation, and the computer then signals the Arduino that controls the speakers. If a pixel is white, the corresponding speaker is turned on; if the pixel is black, the corresponding speaker is turned off. This allows the wearer to feel (via the vibrations of the speakers) an image on their face.
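As a rough illustration of the pixel-to-speaker mapping on the Arduino side, here is a minimal sketch that reads a 12-pixel frame from the computer over serial and switches each speaker pin accordingly. The wire format (twelve ASCII ‘0’/‘1’ characters followed by a newline) and the pin assignments are assumptions made up for this example; in the real mask each “on” speaker is driven with a SoftwareTone square wave rather than the steady HIGH level used here.

// One digital pin per pixel/speaker (hypothetical assignment).
const int NUM_PIXELS = 12;
const int speakerPins[NUM_PIXELS] = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_PIXELS; i++) {
    pinMode(speakerPins[i], OUTPUT);
  }
}

void loop() {
  // Wait until a complete 12-character frame plus terminator has arrived.
  if (Serial.available() >= NUM_PIXELS + 1) {
    for (int i = 0; i < NUM_PIXELS; i++) {
      char c = Serial.read();
      // White pixel ('1') -> speaker on, black pixel ('0') -> speaker off.
      digitalWrite(speakerPins[i], c == '1' ? HIGH : LOW);
    }
    // Consume the trailing newline.
    if (Serial.peek() == '\n') {
      Serial.read();
    }
  }
}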










