Motivated by observations during a field study with the blind community within the Autour project, I became intrigued by the possibility of rendering an acoustic description of physical art forms, as an analogue to the experience sighted people have as they walk down the street or visit a museum. With this in mind, the Audible Sculptures project started, and I had the opportunity to collaborate with Florian Grond on it.
As a proof of concept, we built small replicas of several existing sculptures, some of them found in Mont Royal Park (Montreal, Canada). We mounted each replica on a rotating knob that allowed participants to explore it from different viewpoints by turning the knob, analogous to walking around the sculpture.
The exploration consisted of touching the sculpture and snapping fingers (or clicking the tongue) from each position. The sculpture responded to every sound the participant produced. Each audible response was created by convolving the input sound with a sonification of the sculpture derived from the triangular mesh of its 3D model, taking into account each triangle's size, its vertical position, and its surface normal relative to the listener's perspective. The interaction between the participant and the sculpture is illustrated in the following video.
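To make the idea concrete, here is a minimal sketch of how such a mesh-based sonification could work. This is not the project's actual implementation: the mapping choices (triangle area to amplitude, centroid height to pitch, normal alignment with the listener to gain) and all names are illustrative assumptions, and a tetrahedron stands in for a real sculpture mesh.

```python
import numpy as np

SR = 44100  # sample rate in Hz

def triangle_params(v0, v1, v2):
    """Area, centroid height, and unit normal of one mesh triangle."""
    n = np.cross(v1 - v0, v2 - v0)
    area = 0.5 * np.linalg.norm(n)
    normal = n / (np.linalg.norm(n) + 1e-12)
    height = (v0[2] + v1[2] + v2[2]) / 3.0
    return area, height, normal

def sculpture_ir(vertices, faces, listener_dir, dur=0.5):
    """Build an impulse response: each triangle contributes a short
    sinusoidal grain. Amplitude follows triangle area, pitch follows
    centroid height, gain follows alignment with the listener."""
    ir = np.zeros(int(SR * dur))
    t = np.arange(len(ir)) / SR
    for f in faces:
        area, height, normal = triangle_params(*[vertices[i] for i in f])
        facing = max(0.0, float(np.dot(normal, listener_dir)))  # 0 if facing away
        freq = 200.0 + 800.0 * height  # higher triangles -> higher pitch
        ir += area * facing * np.sin(2 * np.pi * freq * t) * np.exp(-8.0 * t)
    peak = np.max(np.abs(ir))
    return ir / peak if peak > 0 else ir

# A tetrahedron as a stand-in for a sculpture mesh
vertices = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]

listener = np.array([0.0, -1.0, 0.0])     # listener standing in front
snap = np.zeros(SR // 10); snap[0] = 1.0  # idealized finger snap (an impulse)
response = np.convolve(snap, sculpture_ir(vertices, faces, listener))
```

Rotating the knob would amount to changing `listener_dir` (or rotating the mesh), so triangles facing the participant dominate the response from each viewpoint.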
This project was made possible by a Strategic Innovation Fund award from the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT), granted to Jeremy Cooperstock (main advisor), Jean Piche, Gary Scavone, Zack Settel, and Adriana Olmos. The project was developed at CIRMMT and the Shared Reality Lab, led by Jeremy Cooperstock.
Visit the project blog and experience some sculptures through sound.