Soundscape visualization: a new approach based on automatic annotation and Samocharts
Regular paper
IRIT
Tuesday 2 June 2015, 15:20 - 15:40
0.6 Madrid (49)
Abstract:
The visualization of sounds facilitates their identification and
classification. However, on audio recording websites, access to a sound is
usually based on its metadata, i.e., sources and recording conditions. As sonic
environments, or soundscapes, are mostly composed of multiple sources, their
compact description remains an open issue, which makes choosing an item in a
sound corpus difficult. The time-component
matrix chart (TM-chart) has recently been proposed as a tool to describe and
compare sonic environments. However, TM-charts rely on manual, subjective
annotation, which makes their creation time-consuming. In this paper, we
present a new method for urban soundscape corpus
visualization. In the context of the CIESS project, we propose Samochart: an
extension of the TM-chart that is based on sound detection algorithms. We
describe three original algorithms that allow the detection of alarms,
footsteps, and motors. Samocharts can be computed from the results of these
algorithms. This process is applied to a concrete case study: 20 urban
recordings of 5 minutes each, captured in different situations (places and
times). An application case shows that Samocharts allow the identification of
these different situations. Finally, the whole method provides a low-cost tool for soundscape
visualization that can easily be applied to the management and use of a sound
corpus.