Action Word Prediction for Neural Source Code Summarization
Sakib Haque, Aakash Bansal, et al.
SANER 2021
Text style transfer rephrases a text from a source style (e.g., informal) to a target style (e.g., formal) while preserving its original meaning. Although existing work has achieved success when a parallel corpus of the two styles is available, style transfer is significantly more challenging without a parallel training corpus. In this paper, we address this challenge with a reinforcement-learning-based generator-evaluator architecture. Our generator employs an attention-based encoder-decoder to transfer a sentence from the source style to the target style. Our evaluator combines an adversarially trained style discriminator with semantic and syntactic constraints, scoring the generated sentence for style, meaning preservation, and fluency. Experimental results on two style transfer tasks (sentiment transfer and formality transfer) show that our model outperforms state-of-the-art approaches. Furthermore, a human evaluation demonstrates the effectiveness of the proposed method on subjective measures of generated text quality.
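For a concrete picture of the generator-evaluator idea described in this abstract, the sketch below (assuming PyTorch) shows a seq2seq generator trained with a REINFORCE-style update, where the reward combines style, meaning-preservation, and fluency signals. All module names and the toy reward terms are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the authors' code): a seq2seq generator updated with
# REINFORCE, rewarded by a combined style / meaning / fluency score.
import torch
import torch.nn as nn

VOCAB, EMB, HID, MAXLEN = 1000, 64, 128, 20

class Seq2SeqGenerator(nn.Module):
    """GRU encoder-decoder; samples a target-style sentence token by token."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.enc = nn.GRU(EMB, HID, batch_first=True)
        self.dec = nn.GRUCell(EMB, HID)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, src):                                  # src: (B, T) token ids
        _, h = self.enc(self.emb(src))                       # h: (1, B, HID)
        h = h.squeeze(0)
        tok = torch.zeros(src.size(0), dtype=torch.long)     # <bos> assumed to be id 0
        logps, tokens = [], []
        for _ in range(MAXLEN):
            h = self.dec(self.emb(tok), h)
            dist = torch.distributions.Categorical(logits=self.out(h))
            tok = dist.sample()                              # sample next token
            logps.append(dist.log_prob(tok))
            tokens.append(tok)
        return torch.stack(tokens, 1), torch.stack(logps, 1)

# Evaluator components: placeholders standing in for the paper's trained
# style discriminator and semantic/syntactic constraints.
style_discriminator = nn.Sequential(nn.EmbeddingBag(VOCAB, EMB), nn.Linear(EMB, 1))

def reward(src, gen):
    """Toy combination of style, meaning-preservation, and fluency scores."""
    style = torch.sigmoid(style_discriminator(gen)).squeeze(-1)                   # target-style prob
    meaning = (src.unsqueeze(2) == gen.unsqueeze(1)).any(-1).float().mean(-1)     # crude token overlap
    fluency = 1.0 - (gen == 0).float().mean(-1)                                   # penalize degenerate output
    return style + meaning + fluency

gen_model = Seq2SeqGenerator()
opt = torch.optim.Adam(gen_model.parameters(), lr=1e-3)

src = torch.randint(1, VOCAB, (8, 12))             # a batch of source-style sentences
tokens, logps = gen_model(src)
R = reward(src, tokens).detach()                   # evaluator score; generator treats it as fixed
loss = -(logps.sum(1) * (R - R.mean())).mean()     # REINFORCE with a mean baseline
opt.zero_grad(); loss.backward(); opt.step()
```

In the paper's setup the evaluator components are themselves trained (the discriminator adversarially), whereas this sketch only illustrates how their scores would drive the generator's policy-gradient update.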
Zhen Zhang, Yijian Xiang, et al.
NeurIPS 2019
Kai Shen, Lingfei Wu, et al.
IJCAI 2020
Paola Cascante-Bonilla, Xuwang Yin, et al.
NAACL 2019