Publication detail
MOTLÍČEK, P., BURGET, L., ČERNOCKÝ, J.
Original title
Non-parametric Speaker Turn Segmentation of Meeting Data
Type
conference proceedings paper indexed in WoS or Scopus
Language
English
Original abstract
An extension of the conventional speaker segmentation framework is presented for a scenario in which a number of microphones record the activity of speakers present at a meeting (one microphone per speaker). Although each microphone can receive speech from both the participant wearing the microphone (local speech) and other participants (cross-talk), the recorded audio can be broadly classified into three categories: local speech, cross-talk, and silence. This paper proposes a technique that takes cross-correlations, the values of their maxima, and energy differences as features to identify and segment speaker turns. In particular, we have used classical cross-correlation functions, time smoothing, and in part temporal constraints to sharpen and disambiguate timing differences between microphone channels that may be dominated by noise and reverberation. Experimental results show that the proposed technique can be successfully used for speaker segmentation of data collected from a number of different setups.
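The following is a minimal sketch of the kind of per-frame features the abstract describes (energy differences and normalized cross-correlation maxima between a speaker's personal microphone and the other channels, thresholded into local speech, cross-talk, and silence). It is not the authors' implementation; frame sizes, thresholds, and the decision rule are illustrative assumptions only.

import numpy as np

def label_frames(local_ch, other_chs, frame_len=400, hop=160):
    """Classify frames of one personal-microphone channel as
    'local speech', 'cross-talk', or 'silence' (illustrative rule)."""
    n_frames = max(0, 1 + (len(local_ch) - frame_len) // hop)
    labels = []
    for i in range(n_frames):
        s = i * hop
        loc = local_ch[s:s + frame_len]
        loc_energy = np.log(np.sum(loc ** 2) + 1e-12)

        xcorr_max = 0.0
        max_other_energy = -np.inf
        for ch in other_chs:
            oth = ch[s:s + frame_len]
            max_other_energy = max(max_other_energy,
                                   np.log(np.sum(oth ** 2) + 1e-12))
            # Normalized cross-correlation over all lags; its maximum indicates
            # how strongly the same source is seen on both channels.
            a = loc - loc.mean()
            b = oth - oth.mean()
            c = np.correlate(a, b, mode="full")
            denom = np.sqrt(np.sum(a ** 2) * np.sum(b ** 2)) + 1e-12
            xcorr_max = max(xcorr_max, float(np.max(np.abs(c)) / denom))

        energy_diff = loc_energy - max_other_energy
        # Assumed thresholds, purely for illustration:
        if loc_energy < -8.0:
            labels.append("silence")
        elif energy_diff > 0.0 and xcorr_max > 0.3:
            labels.append("local speech")
        else:
            labels.append("cross-talk")
    return labels

In the paper the frame-level decisions are further refined by time smoothing and temporal constraints; a simple stand-in would be a majority filter over neighbouring frames before merging frames into speaker-turn segments.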
Keywords
speech processing, feature extraction, speaker detection, meeting data
Authors
MOTLÍČEK, P., BURGET, L., ČERNOCKÝ, J.
RIV year
2005
Published
5 September 2005
Publisher
International Speech Communication Association
Place
Lisbon
ISSN
1018-4074
Periodical
European Conference EUROSPEECH
Volume
2005
Issue
9
Country
Swiss Confederation
Pages from
657
Pages to
660
Number of pages
4
URL
http://www.fit.vutbr.cz/~motlicek/publi/2005/eurospeech_2005.pdf
BibTeX
@inproceedings{BUT18288,
  author="Petr {Motlíček} and Lukáš {Burget} and Jan {Černocký}",
  title="Non-parametric Speaker Turn Segmentation of Meeting Data",
  booktitle="Interspeech'2005 - Eurospeech - 9th European Conference on Speech Communication and Technology",
  year="2005",
  journal="European Conference EUROSPEECH",
  volume="2005",
  number="9",
  pages="657--660",
  publisher="International Speech Communication Association",
  address="Lisabon",
  issn="1018-4074",
  url="http://www.fit.vutbr.cz/~motlicek/publi/2005/eurospeech_2005.pdf"
}