Seminar by Masataka Goto on Active Listening Interfaces

01.12.2008


Masataka Goto, senior research scientist at AIST, will give a research seminar on December 9th, 2008, at 11am in the França Auditorium, titled "Active Music Listening Interfaces Based on Music-Understanding Technologies".

Abstract:

People who can actively interact with music have traditionally been considered musicians, in the sense that "actively" implies the creation of music. Ordinary people, on the other hand, have been mere listeners who could interact with music only passively. When the recording of music to audio storage media became a reality, however, some people started interacting with music in more active ways, for example, by specifying the playback order of songs or adjusting frequency characteristics with tone controls. Recent advances in computer and music-understanding technologies will further change how people interact with music.

In this seminar, I will introduce our research aimed at building "Active Music Listening Interfaces" [M. Goto, Proc. of IEEE ICASSP 2007] to demonstrate the importance of music-understanding technologies and the benefit they offer to ordinary people (end users). Active music listening is a way of listening to music through active interactions. Given polyphonic sound mixtures taken from available music recordings, our interfaces enrich end users' music listening experiences by applying our automatic music-understanding technologies based on signal processing. In this research, "active" does not mean the creation of new music, but any active experience that is part of enjoying music.

For example, our active music listening interface with a chorus-search function, "SmartMusicKIOSK" [M. Goto, IEEE Trans. ASLP, Vol.14, No.5, 2006], enables a user to skim rapidly through a musical piece by easily skipping sections of no interest while viewing a visual representation of the music structure. During the playback of a song, "LyricSynchronizer" [H. Fujihara et al., Proc. of IEEE ISM 2006], with its lyrics synchronization function, displays scrolling lyrics and highlights the phrase currently sung; a user can easily follow the current playback position and click on a word in the lyrics to listen to it. By suppressing drum sounds and adding other drum sounds, "Drumix" [K. Yoshii et al., IPSJ Journal, Vol.48, No.3, 2007], with its drum-sound recognition function, enables a user to change the volume and timbre of drum sounds and rearrange their rhythmic patterns during playback.

These interfaces can also be regarded as "Augmented Music-Understanding Interfaces" that facilitate deeper understanding of music by end users.


Dr. Goto's home page
