Jey Han Lau, Alexander Clark, et al.
Cognitive Science
Language models are typically applied at the sentence level, without access to the broader document context. We present a neural language model that incorporates document context in the form of a topic model-like architecture, thus providing a succinct representation of the broader document context outside of the current sentence. Experiments over a range of datasets demonstrate that our model outperforms a pure sentence-based model in terms of language model perplexity, and leads to topics that are potentially more coherent than those produced by a standard LDA topic model. Our model also has the ability to generate related sentences for a topic, providing another way to interpret topics.
Mingbo Ma, Liang Huang, et al.
ACL 2017
Khoi Nguyen Tran, Jey Han Lau, et al.
EDM 2018
Mo Yu, Wenpeng Yin, et al.
ACL 2017