Dinesh Raghu, Nikhil Gupta, et al.
Transactions of the Association for Computational Linguistics
The Knowledge Base (KB) used for real-world applications, such as booking a movie or restaurant reservation, keeps changing over time. End-to-end neural networks trained for these task-oriented dialogs are expected to be immune to any changes in the KB. However, existing approaches break down when asked to handle such changes. We propose an encoder-decoder architecture (BOSSNET) with a novel Bag-of-Sequences (BOSS) memory, which facilitates the disentangled learning of the response's language model and its knowledge incorporation. Consequently, the KB can be modified with new knowledge without a drop in performance. We find that BOSSNET outperforms state-of-the-art models, with considerable improvements (>10%) on bAbI OOV test sets and other human-human datasets. We also systematically modify existing datasets to measure disentanglement and show BOSSNET to be robust to KB modifications.
Dinesh Raghu, Surag Nair, et al.
IJCAI 2018
Guy Barash, Mauricio Castillo-Effen, et al.
AI Magazine
Paola Cascante-Bonilla, Xuwang Yin, et al.
NAACL 2019