EPAComp: An Architectural Model for EPA Composition
Luís Henrique Neves Villaça, Sean Wolfgand Matsui Siqueira, et al.
SBSI 2023
This paper presents a new probability table memory compression method based on mixture models and its application to N-tuple recognizers and N-gram character language models. Joint probability tables are decomposed into lower-dimensional probability components and their mixtures. The maximum-likelihood parameters of the mixture models are trained by the Expectation-Maximization (EM) algorithm and quantized to one-byte integers. Probability elements that the mixture models do not estimate reliably are kept separately. Experimental results with on-line handwritten UNIPEN uppercase and lowercase characters show that the total memory size of an on-line scanning N-tuple recognizer is reduced from 12.3MB to 0.66MB, while the recognition rate drops from 91.64% to 91.13% for uppercase characters and from 88.44% to 87.31% for lowercase characters. The N-gram character language model was compressed from 73.6MB to 0.58MB with minimal reduction in performance.
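The abstract describes decomposing a joint probability table into a mixture of lower-dimensional components trained with EM, then quantizing the parameters to one-byte integers. A minimal sketch of one such decomposition is given below; the PLSA-style rank-K factorization P(x, y) ≈ Σ_k w_k a_k(x) b_k(y), the function names, and the per-vector quantization scale are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def em_mixture_decompose(P, K=4, iters=100, seed=0):
    """Approximate a joint table P(x, y) as sum_k w_k * a_k(x) * b_k(y),
    with mixture weights w and component distributions a, b trained by EM."""
    rng = np.random.default_rng(seed)
    Nx, Ny = P.shape
    w = np.full(K, 1.0 / K)
    a = rng.random((K, Nx)); a /= a.sum(axis=1, keepdims=True)
    b = rng.random((K, Ny)); b /= b.sum(axis=1, keepdims=True)
    for _ in range(iters):
        # E-step: responsibility of component k for cell (x, y),
        # q[k, x, y] proportional to w_k * a_k(x) * b_k(y)
        q = w[:, None, None] * a[:, :, None] * b[:, None, :]
        q /= q.sum(axis=0, keepdims=True) + 1e-12
        # M-step: re-estimate parameters, weighting cells by the target table P
        r = q * P[None, :, :]
        w = r.sum(axis=(1, 2))
        a = r.sum(axis=2) / (w[:, None] + 1e-12)
        b = r.sum(axis=1) / (w[:, None] + 1e-12)
        w /= w.sum()
    return w, a, b

def quantize_u8(v):
    """Quantize a probability vector to one-byte integers plus a float scale."""
    scale = v.max() / 255.0 if v.max() > 0 else 1.0
    return np.round(v / scale).astype(np.uint8), scale
```

Storing the K component vectors and one-byte parameters in place of the full joint table is what yields the memory reduction; unreliable cells would be kept in a separate exception table, as the abstract notes.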
M. Abe, M. Hori
SAINT 2003
Xiaodan Song, Ching-Yung Lin, et al.
CVPRW 2004
Jia Cui, Yonggang Deng, et al.
ASRU 2009