Research: The Sonification Handbook

Full handbook: https://sonification.de/handbook/download/TheSonificationHandbook-HermannHuntNeuhoff-2011.pdf

Chapter 1 - Introduction

https://sonification.de/handbook/download/TheSonificationHandbook-chapter1.pdf

Key takeaways

Definition of Sonification and Auditory Display

“Sonification is a core component of an auditory display: the technique of rendering sound in response to data and interactions”

My project is an act of sonification: the “data” is the shape that the user manipulates, and the resulting sound is the “rendering”. The final product can be described as an Auditory Display.

Citation: Hermann, T., Hunt, A., Neuhoff, J. G. (2011). Introduction. In Hermann, T., Hunt, A., Neuhoff, J. G., editors, The Sonification Handbook, chapter 1, pages 1–6. Logos Publishing House, Berlin, Germany.

Chapter 15 - Parameter Mapping Sonification

https://sonification.de/handbook/download/TheSonificationHandbook-chapter15.pdf
PMSon = Parameter Mapping Sonification

Key takeaways

(15.1 Introduction) PMSon is the standard for Multivariate Data Display

PMSon is the core technique for Multivariate Data Display. My project involves linking visual properties (colour, roundness, size) to audio parameters (pitch, waveform, volume, sustain). PMSon is the established method for this, especially when there are multiple dimensions of data.

(15.4) Mapping Topology

The conceptual link between shape properties and Tone.js parameters must be explicitly defined. This translation (the transfer function) is the core of the application and the foundation of the shape -> sound relationship.
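A minimal sketch of what such a transfer function could look like, assuming each shape property has been normalised to [0, 1] (the property names, ranges, and chosen mappings here are illustrative assumptions, not settled design decisions):

```javascript
// Illustrative shape -> sound transfer function (PMSon).
// Assumes hue, roundness, and size are pre-normalised to [0, 1].
function mapShapeToSound({ hue, roundness, size }) {
  return {
    pitchMidi: 60 + Math.round(hue * 24),          // hue -> two-octave pitch range from C4 (MIDI 60)
    waveform: roundness > 0.5 ? "sine" : "square", // rounder shape -> smoother waveform
    volumeDb: -30 + size * 30,                     // bigger shape -> louder (-30 dB .. 0 dB)
  };
}
```

The returned values could then be applied to a Tone.js synth (e.g. its oscillator type and volume settings), though the exact wiring is left open here.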

Mapping options to explore

one-to-one

“Can only be mappings to parameters of the signal domain since the parameters in the perceptual domain are generally not independent”

“One-to-one” mappings can only exist on a technical level; the perceived impact of such a mapping will often seem “one-to-many”.

one-to-many (divergent)

“Accounts for the fact that idiophonic objects usually change their sound characteristics in several aspects at the same time when varying”

The most realistic option, and the one that will feel most faithful to the synesthetic experience.

many-to-one (convergent)

“Can indirectly occur through the perceptual interdependence of sound synthesis parameters”
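To make the divergent (one-to-many) option concrete, a sketch in which a single data dimension, size, drives several sound characteristics at once, mimicking how real sounding objects change in several aspects simultaneously (the specific parameters and ranges below are assumptions for illustration):

```javascript
// Divergent (one-to-many) mapping sketch: one input, several sound parameters.
// size is assumed to be normalised to [0, 1].
function divergentSizeMapping(size) {
  return {
    volumeDb: -24 + size * 24,    // bigger -> louder
    sustainSec: 0.2 + size * 1.8, // bigger -> longer sustain
    filterHz: 2000 - size * 1500, // bigger -> darker (lower low-pass cutoff)
  };
}
```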

(15.5) Signal and Sound

Effective mapping requires accounting for the discrepancy between technical signal units (Hz) and perceptual units (pitch).

15.5.2 Perceptual Domain

“The psychophysical limits of just-noticeable differences (JND), masking, thresholds of hearing in the frequency, amplitude, and time domains, necessitates the constant interaction between ‘thinking’ and ‘listening’”

It’s important to take human perception into account: building a shape -> sound mapping function on a technical level would be simple, but designing it to be intuitively perceived by a user is much more challenging.
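The gap between the signal domain and the perceptual domain can be bridged with the standard 12-tone equal temperament conversions between frequency (Hz) and MIDI note numbers, where equal steps sound like equal pitch intervals (A4 = 440 Hz = MIDI 69):

```javascript
// Signal domain (Hz) <-> perceptual pitch scale (MIDI note number), 12-TET.
function hzToMidi(hz) {
  return 69 + 12 * Math.log2(hz / 440);
}
function midiToHz(midi) {
  return 440 * Math.pow(2, (midi - 69) / 12);
}
```

Mapping a shape property linearly to MIDI and then converting to Hz gives perceptually even pitch steps, whereas mapping linearly to Hz bunches the steps together at the high end.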

Summary

“Parameter Mapping Sonification is widely used and is perhaps the most established technique for sonifying such data. Conceptually, acoustic attributes of events are obtained by a mapping from data attribute values. The rendering and playback of all data items yields the sonification.”

  • Bridging the gap between signal units and perceptual units is essential. Understanding the differences, and how to translate between the two, will be key for this project.
  • Timbre is the perceived character of a sound. There is no single perceptual metric for timbre, as it is subjective and part of the individual human experience.

Citation: Grond, F. and Berger, J. (2011). Parameter mapping sonification. In Hermann, T., Hunt, A., Neuhoff, J. G., editors, The Sonification Handbook, chapter 15, pages 363–397. Logos Publishing House, Berlin, Germany.