Low-Resource Speech Recognition of 500-Word Vocabularies
Sabine Deligne, Ellen Eide, et al.
INTERSPEECH - Eurospeech 2001
A recent series of papers [1, 2, 3, 4] introduced Subspace Constrained Gaussian Mixture Models (SCGMMs) and showed that SCGMMs can very efficiently approximate Full Covariance Gaussian Mixture Models (FCGMMs): a significant reduction in the number of parameters is achieved with little loss in model accuracy. SCGMMs were arrived at through a sequence of generalizations of diagonal-covariance GMMs. As an artifact of this process, the initialization of SCGMM parameters in that work is complex, i.e., it relies on the best parameter settings of less general models. This paper overcomes this problem by showing how an FCGMM can be used to give a simple and direct initialization of an SCGMM. The initialization scheme is powerful enough that, as the number of parameters in an SCGMM approaches that of an FCGMM (i.e., for large SCGMMs), further training of the SCGMM is unnecessary.
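The abstract does not spell out the parameterization, but in the usual SCGMM setup each Gaussian's exponential-family parameters (the linear term P_g mu_g together with the precision P_g) are constrained to lie in a shared low-dimensional subspace. Assuming that setup, a direct FCGMM-based initialization could stack the trained full-covariance Gaussians' natural parameters and take a truncated SVD as the shared basis. The sketch below illustrates that idea; the function name, the SVD-based basis choice, and the plain Euclidean projection are assumptions for illustration, not necessarily the paper's exact scheme.

```python
import numpy as np

def fcgmm_to_scgmm_init(means, covariances, subspace_dim):
    """Illustrative sketch: initialize an SCGMM subspace from a trained FCGMM.

    Each full-covariance Gaussian g is rewritten in its natural parameters
        psi_g = [ P_g mu_g ; vech(P_g) ],   with P_g = inv(Sigma_g),
    the psi_g are stacked into a matrix, and a truncated SVD supplies a shared
    basis M of rank `subspace_dim`.  Per-Gaussian coordinates lambda_g are the
    least-squares projections of psi_g onto that basis.
    """
    d = means.shape[1]
    iu = np.triu_indices(d)               # upper-triangle indices for vech(P_g)

    natural_params = []
    for mu, sigma in zip(means, covariances):
        precision = np.linalg.inv(sigma)  # P_g = Sigma_g^{-1}
        linear = precision @ mu           # P_g mu_g
        natural_params.append(np.concatenate([linear, precision[iu]]))
    Psi = np.stack(natural_params)        # shape (num_gaussians, d*(d+3)/2)

    # Shared subspace basis: top right-singular vectors of the stacked
    # natural parameters.  (A Frobenius-exact projection would weight the
    # off-diagonal vech entries by sqrt(2); omitted to keep the sketch short.)
    _, _, vt = np.linalg.svd(Psi, full_matrices=False)
    M = vt[:subspace_dim].T               # shape (d*(d+3)/2, subspace_dim)

    # Per-Gaussian subspace coordinates; M has orthonormal columns, so the
    # least-squares projection reduces to a matrix product.
    Lambda = Psi @ M
    return M, Lambda

# Toy usage with a hypothetical 3-component FCGMM in 4 dimensions.
rng = np.random.default_rng(0)
means = rng.normal(size=(3, 4))
covs = np.stack([a @ a.T + np.eye(4) for a in rng.normal(size=(3, 4, 4))])
M, Lambda = fcgmm_to_scgmm_init(means, covs, subspace_dim=5)
```

Reconstructing psi_g as M @ Lambda[g] then gives a subspace-constrained approximation of each full-covariance Gaussian, which can serve as a starting point for further SCGMM training when the subspace dimension is small.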