Three-Dimensional Psychoacoustic Auditory Display (3DAD)

In this research project we develop systematically designed sound (“psychoacoustic sonification”) to communicate large amounts of data to users. Psychoacoustic sonification can complement or replace visualizations and helps us learn more about the perceptual and cognitive dimensions of hearing.

 

Psychoacoustic Sonification Research

Auditory Displays present information by means of speech, information-poor sound elements, and information-rich sonification. Sonification can present complex information in an abstract but intelligible way. Psychoacoustic sonification means that known principles of human auditory perception are leveraged to maximize the precision of data, the orthogonality of multidimensional data, and the intelligibility of complex data. Psychoacoustic Auditory Displays combine psychoacoustic sonification with additional sound elements.

 

Psychoacoustic Sonification

Psychoacoustics is the field that translates physical sound input (acoustics) into perceptual, auditory output (psychology). The benefit of including psychoacoustic considerations in sonification design is twofold. First, psychoacoustic sonification maximizes intelligibility.

 

Common problems of sonification designs: the representation of data dimensions is nonlinear, not orthogonal, discontinuous and partly uninterpretable; it has a low resolution, no obvious coordinate origin, and only one direction per axis (positive OR negative). Consequently, the exact location of each data point may be ambiguous. From (Ziemer et al. 2020).

Benefit of psychoacoustic sonification: the representation of data dimensions is linear, orthogonal and continuous; it has a high resolution, an obvious coordinate origin, and two directions per axis (positive AND negative). Consequently, the location of each data point is unambiguous. From (Ziemer et al. 2020).

 

Second, because psychoacoustic sonification can be controlled interactively in real time, it can serve as a tool for research in the fields of interactive psychoacoustics and embodied cognition.

Application Areas for Psychoacoustic Sonification

Application areas for psychoacoustic sonification include

  • audio guidance for drone navigation and image-guided surgery

  • data monitoring, like stock markets or physiologic patient data in the intensive care unit

  • human-computer and human-machine interaction in general

  • audio games, trainers and simulators and augmented reality environments

  • research in the field of interactive psychoacoustics with dynamic sounds, sound embodiment and auditory cognition

 

Data Visualizations

Data visualizations are all around us. Visualizations help us to understand data and data changes, and to orient ourselves in and navigate through (data) space. We use pictures and videos, maps, depictions and symbols, plots, charts, diagrams and text every day. We have learned to read them and to interpret visualizations intuitively in each specific context. Unfortunately, this is not yet the case with sonification.


We see visualizations, like pie charts, plots, maps, GUIs, lists and tables every day. But have you ever heard sonifications, like auditory menus, auditory icons, earcons, auditory graphs and auditory maps?

 

Examples of Visual Displays

In some situations, there is too much to see. Such situations can cause “visual overload”: we may get overwhelmed, overstrained or exhausted and overlook, misinterpret or confuse information. Here, psychoacoustic sonification can reduce the cognitive demands.

 

Drivers cannot concentrate on all displays, the sat nav, street signs, traffic and the environment at the same time. Here, psychoacoustic sonification can deliver some of this information. For example, it can act as an acoustic speedometer, tachometer and fuel gauge.

Depending on the maneuver, pilots may need to consult a small subset of the dozens of displays that are distributed over the cockpit. Here, psychoacoustic sonification can inform the pilot about pitch, roll and yaw.

In image-guided surgery, monitors show the three anatomical planes plus an augmented pseudo-3D model of the anatomy. None of these visualizations shows the patient from the surgeon’s viewpoint. Furthermore, looking at the screen implies an unergonomic posture and takes the visual attention off the patient. Here, psychoacoustic sonification could guide the surgeon towards the tumor and past critical structures.

 

In other situations, visual displays cannot be consulted because of their impractical location, or because they would occlude important parts of the visual field or redirect the visual focus away from important visual cues.

 

While aligning a shelf above your head, it is almost impossible to read the display of a bubble level: there is no direct sight line, and the display is occluded by the shelf. One solution is an auditory Spirit Level App on your smartphone: sound waves need no direct sight lines.

Even semi-transparent depictions can occlude important parts of the camera view in drone navigation. Psychoacoustic sonification, however, could deliver all necessary navigation cues: sound neither occludes the camera view nor takes your visual attention away from it. From (Ziemer et al. 2020).

 

Examples of Psychoacoustic Sonification

Psychoacoustic sonification can communicate direct relationships, like a function f(x), or multi-dimensional relationships, like a function f(x, y, z).

Simple Psychoacoustic Sonification

Simple psychoacoustic sonification can be understood quite intuitively, like the auditory spirit level and auditory graphs.

Torpedo Level with a bubble indicator (bottom) and Spirit Level App (top): A bubble in the center indicates horizontal alignment. A bubble on the left means: tilt to the right. The further the bubble is off-center, the further you have to tilt.

Auditory Spirit Level: Steady pitch indicates horizontal alignment. A rising pitch means: tilt to the right. The faster the pitch changes, the further you have to tilt.
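
As a rough illustration of this mapping, the following Python/NumPy sketch renders one block of such a tone. The base frequency and the glide rate per degree of tilt are illustrative assumptions, not the parameters of the actual Spirit Level App.

    # Minimal sketch of an auditory spirit level (illustrative, not the app's code):
    # the tilt angle sets the speed and direction of a pitch glide;
    # zero tilt yields a steady tone.
    import numpy as np

    SAMPLE_RATE = 44100              # samples per second
    BASE_FREQ = 440.0                # steady pitch heard when the surface is level (Hz)
    OCTAVES_PER_DEG_PER_SEC = 0.2    # assumed glide speed per degree of tilt

    def spirit_level_tone(tilt_deg, duration=1.0):
        """Render one audio block for the current tilt angle (in degrees)."""
        t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
        glide = OCTAVES_PER_DEG_PER_SEC * tilt_deg      # octaves per second
        freq = BASE_FREQ * 2.0 ** (glide * t)           # rising for one tilt direction, falling for the other
        phase = 2 * np.pi * np.cumsum(freq) / SAMPLE_RATE
        return 0.5 * np.sin(phase)

    # Example: 3 degrees off level -> clearly rising pitch; 0 degrees -> steady tone.
    block = spirit_level_tone(3.0)

Called block by block with the current sensor reading, this yields a steady tone when the shelf is level and an increasingly fast pitch glide the further it is tilted.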

 

 

A visual graph: Even without further explanation, axis labels or magnitudes, you get an idea of the plotted relationship and can describe what you see.

Auditory Graph: The absolute value is mapped to pitch. Positive values sound smooth, negative values sound rough. This psychoacoustic sonification gives you about as much information as the visual version.

 

Both sonifications represent one continuous dimension with a high resolution. The spirit level has a clear zero, i.e., an obvious origin of the coordinate system. In contrast, the auditory graph is relative: its origin can only be recognized when passing it, due to the switch from smooth to rough.
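
The auditory graph mapping described above (magnitude to pitch, sign to smooth vs. rough timbre) can be sketched in a similar way. Rendering roughness as fast amplitude modulation, as well as the frequency range and modulation rate below, are illustrative assumptions, not the published design.

    # Minimal sketch of an auditory graph (illustrative, not the published design):
    # |f(x)| is mapped to pitch, negative values are made rough by fast amplitude modulation.
    import numpy as np

    SAMPLE_RATE = 44100
    MIN_FREQ, MAX_FREQ = 200.0, 2000.0   # assumed pitch range for magnitudes scaled to 0..1
    ROUGHNESS_RATE = 40.0                # assumed modulation rate in the "rough" region (Hz)

    def auditory_graph(values, seg_dur=0.1):
        """Sonify a sampled function f(x): one short tone segment per sample."""
        v = np.asarray(values, dtype=float)
        scale = max(float(np.max(np.abs(v))), 1e-9)
        segments = []
        for value in v:
            t = np.arange(int(seg_dur * SAMPLE_RATE)) / SAMPLE_RATE
            rel = abs(value) / scale                          # magnitude scaled to 0..1
            freq = MIN_FREQ * (MAX_FREQ / MIN_FREQ) ** rel    # log-frequency interpolation
            tone = np.sin(2 * np.pi * freq * t)
            if value < 0:
                # Negative values sound rough: fast amplitude modulation.
                tone *= 0.5 * (1 + np.sin(2 * np.pi * ROUGHNESS_RATE * t))
            segments.append(0.5 * tone)
        return np.concatenate(segments)

    # Example: a parabola crossing zero; the switch from smooth to rough marks the origin.
    x = np.linspace(-1, 1, 50)
    signal = auditory_graph(x**2 - 0.25)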

 

 

Complex Psychoacoustic Sonification

Complex psychoacoustic sonification can represent large data vectors or matrices. Its benefit is the large amount of data that is presented to the user simultaneously, unambiguously and linearly. The downside is that its meaning has to be learned, which takes time and effort in the form of explanations and training.

 

Visual explanation of the psychoacoustic sonification of three dimensions f(x, y, z). Users need no more than half an hour of explanation and demonstration to understand the five sound attributes that describe a three-dimensional space. Most people can learn it; they simply have never tried and do not know how easy it is. From (Ziemer et al. 2020).

Psychoacoustic sonification of a trajectory in three-dimensional space. Without explanations and a learning phase, the sound is hard to interpret. However, trained users can imagine the three-dimensional path just from listening to the sound.
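
To give a flavor of the principle, the sketch below assigns each spatial axis its own sound attribute. The attributes chosen here (pitch glide for x, loudness fluctuation for y, brightness for z) are hypothetical stand-ins; the actual five-attribute mapping is the one described in (Ziemer et al. 2020).

    # Hypothetical sketch of multi-dimensional sonification: one sound attribute per axis.
    # NOT the five-attribute mapping from (Ziemer et al. 2020); the attribute choices are stand-ins.
    import numpy as np

    SAMPLE_RATE = 44100
    BASE_FREQ = 440.0

    def sonify_position(x, y, z, dur=0.5):
        """Render one audio block for a position with coordinates in [-1, 1]."""
        t = np.arange(int(dur * SAMPLE_RATE)) / SAMPLE_RATE

        # x: direction and speed of a pitch glide (x = 0 -> steady pitch).
        freq = BASE_FREQ * 2.0 ** (x * t)
        phase = 2 * np.pi * np.cumsum(freq) / SAMPLE_RATE

        # z: brightness, rendered as the level of an added octave partial.
        brightness = 0.5 * (z + 1)                       # 0 (dull) .. 1 (bright)
        tone = np.sin(phase) + brightness * np.sin(2 * phase)

        # y: depth of a slow loudness fluctuation (y = 0 -> steady loudness).
        am = 1 - 0.5 * abs(y) * (1 + np.sin(2 * np.pi * 4.0 * t))
        return 0.3 * am * tone

    # Example: sonify a short trajectory approaching the origin along the space diagonal.
    blocks = [sonify_position(c, c, c) for c in np.linspace(-1, 0, 10)]
    signal = np.concatenate(blocks)

Because these attributes change largely independently of one another, a trained listener can in principle track all three coordinates at once, which is the core idea behind the published design.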

 

Application areas of complex, multi-dimensional psychoacoustic sonification include navigation in (image-)guided surgery and surgical training, as well as data monitoring in pulse oximetry.

 

Image-guided surgery and neuronavigation are established practices. Psychoacoustic sonification enables clinicians to carry out complicated interventions by ear. From (Ziemer et al. 2020).

Psychoacoustic sonification can inform anesthesiologists about the patient’s blood oxygen saturation in great detail. Further physiological data, like blood pressure, respiration rate and temperature, could be added. Details can be found in (Schwarz & Ziemer 2018).

 

 

Further application areas of complex psychoacoustic sonification include audio games and flight simulators.

 

Psychoacoustic sonification can be integrated into computer games to complement visualizations, to enable blind gaming and to train your hearing abilities. From the CURAT website.

Psychoacoustic sonification can be integrated into flight simulators and actual airplanes to help pilots carry out maneuvers. When equipped with suitable sensors, manned and unmanned vehicles, vessels and aircraft can be controlled by means of sound.

 

 

More information on Psychoacoustic Sonification

Further information on psychoacoustic sonification can be found online:

  • Master the sonification by playing our Sonification Game.

  • News and publications on psychoacoustic sonification in medicine can be found on Researchgate.

  • We tweet news from the Bremen Spatial Cognition Center on Twitter.

  • More sound examples and demo videos can be found on YouTube.

  • You can contact Tim Ziemer via e-mail and find his list of publications on Google Scholar.

  • By subscribing to our YouTube channel you can watch the latest videos on psychoacoustic sonification.

Group Members:

The psychoacoustic sonification research group has three members:

  • Tim Ziemer (Principal Investigator, Bremen Spatial Cognition Center, University of Bremen)

  • Holger Schultheis (Researcher, Bremen Spatial Cognition Center, University of Bremen)

  • Kilian Krüger (Student Assistant)

 

 

and many partners

 

Honors and Awards:

 

Publications on Psychoacoustic Sonification:

Journal Papers

Ziemer, Tim, Nuchprayoon, Nuttawut & Schultheis, Holger, “Psychoacoustic Sonification as User Interface for Human-Machine Interaction”, accepted for International Journal of Informatics Society 11(3) 2020, doi: 10.13140/RG.2.2.14342.11848 (preprint).

Ziemer, Tim & Schultheis, Holger, “Psychoacoustic Auditory Display for Navigation. An Auditory Assistance System for Spatial Orientation Tasks”, in: Journal on Multi-Modal User Interfaces 13(3), pp. 205–218, 2019, doi: 10.1007/s12193-018-0282-2.

Ziemer, Tim, Schultheis, Holger, Black, David & Kikinis, Ron, “Psychoacoustical Interactive Sonification for Short-Range Navigation”, in: Acta Acustica United With Acustica 104(6), pp. 1075–1093, 2018, doi: 10.3813/AAA.919273.

 

Conference Papers

Ziemer, Tim, Höring, Thomas, Meirose, Lukas & Schultheis, Holger, “Monophonic Sonification for Spatial Navigation”, International Workshop on Informatics (IWIN) 2019, Hamburg, Sep 2019, URL: http://www.infsoc.org/conference/iwin2019/download/IWIN2019-Proceedings.pdf#page=95. [Keynote Speech]

Ismailogullari, Abdullah & Ziemer, Tim, “Soundscape Clock: Soundscape Compositions That Display the Time of the Day”, in: 25th International Conference on Auditory Display (ICAD2019), Newcastle upon Tyne, Jun 2019, doi: 10.21785/icad2019.034.

Schwarz, Sebastian & Ziemer, Tim, “A Psychoacoustic Sound Design for Pulse Oximetry”, in: 25th International Conference on Auditory Display (ICAD2019), Newcastle upon Tyne, Jun 2019, doi: 10.21785/icad2019.024.

Ziemer, Tim & Schultheis, Holger, “Psychoacoustical Signal Processing for Three-Dimensional Sonification”, in: 25th International Conference on Auditory Display (ICAD2019), Newcastle upon Tyne, Jun 2019, doi: 10.21785/icad2019.018.

Ziemer, Tim & Schultheis, Holger, “A Psychoacoustic Auditory Display for Navigation”, in: 24th International Conference on Auditory Display (ICAD2018), Houghton (MI), Jun 2018, pp. 136–144, doi: 10.21785/icad2018.007. [Hyundai Motors Best Paper Award]

Ziemer, Tim & Schultheis, Holger, “Perceptual Auditory Display for Two-Dimensional Short-Range Navigation”, in: Fortschritte der Akustik – DAGA 2018, Munich, Mar 2018, pp. 1094–1096, URL: https://www.dega-akustik.de/publikationen/online-proceedings/.

Ziemer, Tim, Black, David & Schultheis, Holger, “Psychoacoustic Sonification Design for Navigation in Surgical Interventions”, in: Proceedings of Meetings on Acoustics (POMA) 30(1), 2017, Paper-Number: 050005, doi: 10.1121/2.0000557.

 

Conference and Journal Abstracts

Ziemer, Tim & Schultheis, Holger, “Psychoakustische Sonifikation”, Jahrestagung der Gesellschaft für Musikforschung 2019, Detmold/Paderborn: Sep 2019, pp. 78–79, URL: https://www.muwi-detmold-paderborn.de/fileadmin/muwi/GfM/Programmheft_GfM2019_Web.pdf#page=80. [Invited Talk]

Ziemer, Tim & Schultheis, Holger, “Perceptual Auditory Display for Two-Dimensional Short-Range Navigation”, in: Programmheft der 44. Tagung für Akustik (DAGA), Munich: Mar 2018, p. 306, URL: http://2018.daga-tagung.de/fileadmin/2018.daga-tagung.de/programm/Feb_16_Programmheft.pdf.

Schultheis, Holger & Ziemer, Tim, “An auditory display for representing two-dimensional space”, in: Spatial Cognition 2018, Tuebingen, Sep 2018, https://www.researchgate.net/publication/340050716_An_auditory_display_for_representing_two-dimensional_space_Abstract. [Best Poster Award]

Black, David, Ziemer, Tim, Rieder, Christian, Hahn, Horst & Kikinis, Ron, “Auditory Display for Supporting Image-Guided Medical Instrument Navigation in Tunnel-like Scenarios”, in: Proceedings of the 3rd Conference on Image-Guided Interventions & Fokus Neuroradiologie (IGIC), Magdeburg: Nov 2017.

Ziemer, Tim & Black, David, “Psychoacoustic sonification for tracked medical instrument guidance”, in: The Journal of the Acoustical Society of America (JASA) 141(5), 2017, p. 3694, doi: 10.1121/1.4988051.

 

The project is funded by the Central Research Development Fund (CRDF). I am not responsible for the content of links to external sources.

 
