Files in This Item:
1687-4722-2012-6.pdf (1.19 MB, Adobe PDF)
Title: A multi-modal tempo and beat tracking system based on audio-visual information from live guitar performances
Authors: Itohara, Tatsuhiko
Otsuka, Takuma
Mizumoto, Takeshi
Lim, Angelica
Ogata, Tetsuya
Okuno, Hiroshi G.
Author's alias: 奥乃, 博 (Okuno, Hiroshi)
Issue Date: 20-Jan-2012
Publisher: SpringerOpen
Journal title: EURASIP Journal on Audio, Speech, and Music Processing
Volume: 2012
Article number: 6
Abstract: The aim of this paper is to improve beat-tracking for live guitar performances. Beat-tracking estimates musical quantities such as tempo and beat phase, and is critical for achieving synchronized ensemble performances, for example musical robot accompaniment. Beat-tracking of a live guitar performance has to deal with three challenges: tempo fluctuation, beat pattern complexity and environmental noise. To cope with these problems, we devise an audiovisual integration method for beat-tracking. The auditory beat features are estimated in terms of tactus (phase) and tempo (period) by Spectro-Temporal Pattern Matching (STPM), which is robust against stationary noise. The visual beat features are estimated by tracking the position of the hand relative to the guitar using optical flow, mean shift and the Hough transform. Both estimated features are integrated using a particle filter that aggregates the multimodal information based on a beat location model and a hand-trajectory model. Experimental results confirm that our beat-tracking improves the F-measure by 8.9 points on average over the Murata beat-tracking method, which uses STPM and rule-based beat detection. The results also show that the system is capable of real-time processing with a reduced number of particles while preserving the estimation accuracy. We demonstrate an ensemble with the humanoid HRP-2 that plays the theremin with a human guitarist.
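The particle-filter integration described in the abstract can be illustrated with a toy sketch: particles hypothesize a (period, phase) pair, and each incoming beat event from two noisy observation streams (standing in for the audio STPM onsets and the visual hand-motion cues) reweights and resamples them. This is a minimal sketch on synthetic data with hypothetical noise levels; it does not implement the paper's actual beat location and hand-trajectory models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ground truth (hypothetical): 120 BPM -> 0.5 s period, 0.1 s phase.
true_period, true_phase = 0.5, 0.1
beats = true_phase + true_period * np.arange(20)
audio_obs = beats + rng.normal(0, 0.02, beats.size)   # audio onsets (less noisy)
visual_obs = beats + rng.normal(0, 0.05, beats.size)  # hand-motion cues (noisier)

# Particles over (period, phase); uniform prior.
N = 1000
particles = np.column_stack([
    rng.uniform(0.3, 0.9, N),  # period hypotheses (s)
    rng.uniform(0.0, 0.9, N),  # phase hypotheses (s)
])

def beat_likelihood(t, period, phase, sigma):
    # Gaussian likelihood of event time t under each particle's beat grid,
    # using the distance to the nearest predicted beat.
    resid = (t - phase) % period
    d = np.minimum(resid, period - resid)
    return np.exp(-0.5 * (d / sigma) ** 2)

for t_audio, t_visual in zip(audio_obs, visual_obs):
    # Predict: small random walk on (period, phase) to model tempo drift.
    particles += rng.normal(0, [0.005, 0.01], particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], 0.3, 1.0)
    # Update: fuse both modalities (conditional independence assumed).
    w = (beat_likelihood(t_audio, particles[:, 0], particles[:, 1], 0.03)
         * beat_likelihood(t_visual, particles[:, 0], particles[:, 1], 0.08))
    w += 1e-12           # guard against total weight collapse
    w /= w.sum()
    # Resample proportionally to weight.
    particles = particles[rng.choice(N, N, p=w)]

est_period = float(particles[:, 0].mean())
est_phase = float(np.median(particles[:, 1] % particles[:, 0]))
print(f"estimated period ~ {est_period:.2f} s, phase ~ {est_phase:.2f} s")
```

The weaker visual likelihood (larger sigma) mirrors the idea that the hand-trajectory cue is coarser than the audio onset cue, yet still helps disambiguate period hypotheses; the real system additionally models the full swing of the strumming hand rather than treating it as a point event.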
Rights: © 2012 Itohara et al; licensee Springer.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
URI: http://hdl.handle.net/2433/187381
DOI(Published Version): 10.1186/1687-4722-2012-6
Appears in Collections: Journal Articles
