Jihun Yun, Peng Zheng, et al.
ICML 2019
Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, using scale alone to improve performance means that resource consumption also grows. Such resources include data, time, storage, and energy, all of which are naturally limited and unevenly distributed. This motivates research into efficient methods that require fewer resources to achieve similar results. This survey synthesizes and relates current methods and findings in efficient NLP. We aim both to provide guidance for conducting NLP under limited resources and to point toward promising research directions for developing more efficient methods.
Conrad Albrecht, Jannik Schneider, et al.
CVPR 2025
Sashi Novitasari, Takashi Fukuda, et al.
INTERSPEECH 2025
Hagen Soltau, Lidia Mangu, et al.
ASRU 2011