Multi-sensory gaming provides an accessible “pong” experience. Unlike traditional pong, sonified pong invites players to track the ball with auditory cues. Play with your eyes closed for an extra challenge!
This project provided proof of concept for a Tufts University senior capstone project to develop an accessible auditory aid for independent swimming for the visually impaired.
While working with Matthew Shifrin (my fantastic intern at the Institute for Human Centered Design), who has been blind from birth, I learned about the primarily text-based accessible games available to blind individuals. My sonified pong endeavors push beyond text-based modes of communication with the player, enabling more interactive play.
Then, on the right-panned auditory stream, I created a square wave to represent the movement of the ball in 2D space. Like the paddle, the X position of the ball is mapped to the square wave's frequency. For the Y position of the ball, I created a sine wave modulator so that as the ball gets closer to the bottom, the modulation gets more intense.
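The ball mapping described above can be sketched as a simple parameter function. Note that the frame dimensions, frequency range, and linear mapping here are illustrative assumptions for the sake of a sketch, not the values or curves used in the actual project:

```python
def ball_to_synth_params(x, y, width=800.0, height=600.0,
                         f_min=200.0, f_max=1200.0):
    """Map a ball position to square-wave synthesis parameters.

    x drives the square wave's frequency (as with the paddle mapping);
    y drives the depth of a sine-wave modulator, growing more intense
    as the ball nears the bottom of the frame (screen coordinates,
    so y increases downward).

    The ranges (f_min/f_max, frame size) are illustrative guesses.
    """
    # Linear map: left edge -> f_min, right edge -> f_max.
    freq = f_min + (x / width) * (f_max - f_min)
    # Modulation depth 0..1: 0 at the top, 1 at the bottom.
    mod_depth = y / height
    return freq, mod_depth
```

These two parameters would then feed the square-wave oscillator and its sine modulator on each frame of the game loop.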
To signal the ball contacting a boundary or the paddle, I created an auditory cue: a ping-pong-delayed tone. Again, the frequency of this tone is mapped to the Y position of the collision to provide context: a bounce off the paddle is low frequency, off a wall is higher, and off the ceiling is highest.
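The collision cue's pitch mapping could look like the following sketch; the specific frequencies and the linear interpolation are assumptions for illustration only:

```python
def collision_cue_freq(y, height=600.0, f_low=220.0, f_high=880.0):
    """Map the Y position of a collision to the cue tone's frequency.

    In screen coordinates (y increases downward), a paddle bounce at
    the bottom of the frame gives the lowest pitch, a ceiling bounce
    at y == 0 gives the highest, and wall bounces fall in between
    depending on where the ball struck. f_low/f_high are illustrative.
    """
    # Invert y so that bottom -> f_low and top -> f_high.
    t = 1.0 - (y / height)
    return f_low + t * (f_high - f_low)
```

The returned frequency would drive the tone that is then fed through the ping-pong delay.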
Here is a video of Matthew Shifrin demoing the gameplay.
I chose the auditory textures to provide signifiers and contextual awareness of the game state. Matthew can understand the ball's location within the frame by listening to the frequency changes, modulation, and bounce cues, while simultaneously tracking his paddle's movement through frequency to block the ball at the right time. Along with hard pans, the distinct sound textures of the two waveforms help differentiate the audio streams.
I was amazed at the playability for Matthew, other blind gamers, and anyone seeking a game experience not dominated by visuals. I wonder whether this proof of concept could be applied to wayfinding within a constrained environment: the "ball" could be an actual person in an environment, receiving an auditory stream that heightens spatial perception. Matthew described a perfect problem space for this application: he enjoys swimming, but cannot swim independently, as he must bring a friend to literally whack him on the head before he crashes into the walls.
This year I am sponsoring and leading a group of Tufts University senior capstone students to tackle this problem. We are currently prototyping and user-testing an underwater ultrasonic sensor system with various auditory and haptic feedback streams to facilitate independent swimming for the visually impaired community.