In this work, we propose contextual language models that incorporate dialog-level discourse information into language modeling. Previous work on contextual language models treats preceding utterances as a flat sequence of inputs, without considering dialog interactions. We design recurrent neural network (RNN) based contextual language models that explicitly track the interactions between speakers in a dialog. Experimental results on the Switchboard Dialog Act Corpus show that the proposed model outperforms a conventional single-turn RNN language model by 3.3% on perplexity. The proposed models also demonstrate advantageous performance over other competitive contextual language models.
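To make the core idea concrete, below is a minimal sketch in PyTorch of one way an RNN language model could track speaker interactions across turns. This is not the authors' implementation: the class name, the per-speaker context projections (`same_speaker`, `other_speaker`), and the choice to carry the previous turn's final hidden state as context are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class SpeakerAwareContextLM(nn.Module):
    """Hypothetical sketch: an RNN LM whose initial hidden state is
    conditioned on the previous turn's final state, routed through
    per-speaker projections so that same-speaker continuations and
    speaker switches are modeled by separate transforms."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Separate context transforms for "same speaker continues" vs.
        # "speaker changes" -- an assumed stand-in for interaction tracking.
        self.same_speaker = nn.Linear(hidden_dim, hidden_dim)
        self.other_speaker = nn.Linear(hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, turn_tokens, prev_context=None, speaker_changed=True):
        # turn_tokens: (batch, seq_len) token ids for the current utterance
        emb = self.embed(turn_tokens)
        if prev_context is not None:
            proj = self.other_speaker if speaker_changed else self.same_speaker
            h0 = torch.tanh(proj(prev_context)).unsqueeze(0)  # (1, batch, hidden)
            c0 = torch.zeros_like(h0)
            output, (hn, _) = self.rnn(emb, (h0, c0))
        else:
            output, (hn, _) = self.rnn(emb)
        logits = self.out(output)      # next-token logits at each position
        return logits, hn.squeeze(0)   # final state carried to the next turn

# Usage: process two consecutive turns, passing context across the boundary.
model = SpeakerAwareContextLM(vocab_size=10000)
turn_a = torch.randint(0, 10000, (1, 12))  # speaker A's utterance
turn_b = torch.randint(0, 10000, (1, 8))   # speaker B's reply
_, ctx = model(turn_a)                      # first turn, no prior context
logits, _ = model(turn_b, prev_context=ctx, speaker_changed=True)
```

The design choice illustrated here is the contrast with a single-turn RNN LM, which would reset its hidden state at every utterance; carrying and transforming the state across the turn boundary is what lets the model condition on dialog-level context.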