Kernel methods match deep neural networks on TIMIT
Po-Sen Huang, Haim Avron, et al.
ICASSP 2014
We present novel bounds on the classification error that are based on the f-Divergence and can, at the same time, serve as practical training criteria. Virtually no prior studies investigate the link between the f-Divergence, the classification error, and practical training criteria. So far, only the Kullback-Leibler divergence has been examined in this context to formulate a bound on the classification error and to derive the cross-entropy criterion. We extend this concept to a larger class of f-Divergences. We also investigate whether the novel f-Divergence-based training criteria are suited for frame-wise training of deep neural networks on the Babel Vietnamese and Bengali speech recognition tasks. © 2014 IEEE.
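For context, a sketch of how the Kullback-Leibler special case recovers the frame-wise cross-entropy criterion mentioned in the abstract. This uses the standard f-Divergence definition; the symbols and the exact form are illustrative assumptions, not the paper's notation or its bounds.

```latex
% f-Divergence between a target posterior P and a model posterior Q_\theta
% (standard definition; illustrative notation, not the paper's):
\[
  D_f(P \,\|\, Q_\theta)
  \;=\; \sum_{s} Q_\theta(s \mid x)\,
        f\!\left(\frac{P(s \mid x)}{Q_\theta(s \mid x)}\right),
  \qquad f \text{ convex},\; f(1) = 0 .
\]
% Choosing f(t) = t \log t yields the Kullback-Leibler divergence
% \sum_s P \log (P / Q_\theta). With a one-hot frame target
% P(s \mid x_t) = \delta(s, s_t), the divergence summed over frames
% reduces to the familiar frame-wise cross-entropy criterion:
\[
  F_{\mathrm{CE}}(\theta) \;=\; -\sum_{t=1}^{T} \log Q_\theta(s_t \mid x_t).
\]
```

Other choices of the convex generator f give rise to the larger class of criteria the abstract refers to.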
Dmitry Malioutov, Aleksandr Aravkin
ICASSP 2014
Hagen Soltau, George Saon, et al.
ICASSP 2014
Xiaodong Cui, Vaibhava Goel, et al.
INTERSPEECH 2013