Estimation de probabilité non-paramétrique pour la reconnaissance markovienne de la parole
- F. Lefèvre
- 213 pages
- 17.01.2000 - document at http://www.lip6.fr/lip6/reports/2000/lip6.2000.006.ps.gz - 1,035 KB
- Contact: Fabrice.Lefevre (at) lia.univ-avignon.fr
Former theme:
Hidden Markov Models (HMMs) represent the most significant advances made in continuous speech recognition in recent years. HMMs model the temporal distortions of speech, while probability density functions (pdfs) model the spectral distortions. We propose to improve their performance by means of the K-nearest-neighbours (K-NN) pdf estimator. This estimator has a low estimation error (close to the optimal lower bound) and is discriminative.
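As a rough illustration of the K-NN pdf estimator named above (not the thesis's actual implementation), the classical estimate is p(x) ≈ k / (n · V_k(x)), where V_k(x) is the volume of the smallest ball around x that contains the k nearest training samples. A minimal sketch, with illustrative names and parameters:

```python
import math
import numpy as np

def knn_density(x, samples, k):
    """Classical K-NN density estimate: p(x) ~ k / (n * V_k(x))."""
    n, dim = samples.shape
    # sorted distances from x to every training sample
    dists = np.sort(np.linalg.norm(samples - x, axis=1))
    r = dists[k - 1]  # radius of the ball enclosing the k nearest neighbours
    # volume of a dim-dimensional ball of radius r
    v = (math.pi ** (dim / 2) / math.gamma(dim / 2 + 1)) * r ** dim
    return k / (n * v)
```

For samples drawn uniformly on [0, 1], the estimate near the centre approaches the true density of 1 as the sample size grows.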
The K-NN estimator is first evaluated as a recognition operator for short-term speech spectra. Its performance is compared with that of the state-of-the-art estimator based on Gaussian mixtures. The adaptations required to integrate it into an HMM-based system are then developed. An optimal training procedure is derived from a new version of the EM algorithm, whose convergence we prove under the Maximum Likelihood criterion.
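To illustrate K-NN used as a recognition operator on feature vectors (a generic sketch, not the system described in the thesis), a plain K-NN classifier can be written as a majority vote among the k nearest training samples, which is equivalent to comparing per-class K-NN density estimates over a shared neighbourhood; this is what makes the estimator discriminative:

```python
from collections import Counter
import numpy as np

def knn_classify(x, samples, labels, k):
    """Label x by majority vote among its k nearest training samples."""
    # distance from x to every training sample
    dists = np.linalg.norm(samples - x, axis=1)
    # indices of the k closest samples
    nearest = np.argsort(dists)[:k]
    # most frequent label among those neighbours
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]
```

With two well-separated clusters of feature vectors, a query point near either cluster centre is assigned that cluster's label.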
This study leads to the development of a K-NN HMM system, which is evaluated on the TIMIT database and compared with a Gaussian HMM system. Two approaches are then studied to incorporate knowledge into the system: introducing temporal information into the representation space, and adapting the references.
The performance of the K-NN HMM system is encouraging, although further work is needed to reach state-of-the-art results. Moreover, the K-NN estimator offers an alternative to the state-of-the-art estimator and should thus prove very helpful for revealing, by comparison, the real influence of the pdfs in HMM recognition systems.
- Keywords: continuous speech recognition, Hidden Markov Models, K-nearest neighbours, EM algorithm
- Publications manager: Valerie.Mangin (at) lip6.fr