Annual Conference of the International Speech Communication Association (Interspeech)

Combining CNN and BLSTM to Extract Textual and Acoustic Features for Recognizing Stances in Mandarin Ideological Debate Competition



Abstract

Recognizing stances in ideological debates is a relatively new and challenging problem in opinion mining. While previous work focused mainly on the text modality, in this paper we try to recognize stances from both the text and acoustic modalities, where how to derive more representative textual and acoustic features remains an open research problem. Inspired by the promising performance of neural network models in natural language understanding and speech processing, we propose a unified framework named C-BLSTM that combines a convolutional neural network (CNN) and a bidirectional long short-term memory (BLSTM) recurrent neural network (RNN) for feature extraction. In C-BLSTM, the CNN is utilized to extract higher-level local features of text (n-grams) and speech (emphasis, intonation), while the BLSTM is used to extract bottleneck features for context-sensitive feature compression and target-related feature representation. A maximum entropy model is then used to recognize stances from the bimodal textual-acoustic bottleneck features. Experiments on four debate datasets show that C-BLSTM outperforms all challenging baseline methods; specifically, acoustic intonation and emphasis features further improve the F1-measure by 6% compared to textual features alone.
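The pipeline the abstract describes (CNN over the input sequence for local features, a BLSTM whose final states serve as a compressed bottleneck representation, and a maximum entropy, i.e. multinomial logistic, classifier on top) can be sketched as a forward pass in plain numpy. This is a minimal illustrative sketch, not the authors' implementation: all layer sizes, weight shapes, and the random initialization are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, W, b):
    # x: (T, d) sequence of d-dim embeddings; W: (k, d, f) filter bank.
    # Valid 1-D convolution over time with ReLU, yielding (T-k+1, f)
    # local (n-gram-like) feature vectors.
    k, d, f = W.shape
    T = x.shape[0] - k + 1
    out = np.empty((T, f))
    for t in range(T):
        out[t] = np.tensordot(x[t:t + k], W, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_final_state(x, Wx, Wh, b):
    # Single-direction LSTM; returns only the final hidden state,
    # which acts as a context-sensitive summary of the sequence.
    h_dim = Wh.shape[0]
    h, c = np.zeros(h_dim), np.zeros(h_dim)
    for t in range(x.shape[0]):
        z = x[t] @ Wx + h @ Wh + b            # (4*h_dim,) gate pre-activations
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
    return h

def blstm_bottleneck(x, fwd, bwd):
    # Bottleneck feature: concatenated final states of a forward pass
    # and a backward pass over the CNN feature sequence.
    return np.concatenate([lstm_final_state(x, *fwd),
                           lstm_final_state(x[::-1], *bwd)])

def maxent(features, W, b):
    # Maximum entropy (softmax) classifier over stance labels.
    logits = features @ W + b
    e = np.exp(logits - logits.max())
    return e / e.sum()

def lstm_params(in_dim, h_dim):
    return (rng.normal(size=(in_dim, 4 * h_dim)) * 0.1,
            rng.normal(size=(h_dim, 4 * h_dim)) * 0.1,
            np.zeros(4 * h_dim))

# Toy run: 10 word embeddings of dim 8, filter width 3, 6 filters,
# BLSTM hidden size 5 per direction, 2 stance labels (pro/con).
d, k, f, h, labels = 8, 3, 6, 5, 2
x = rng.normal(size=(10, d))
feats = conv1d_relu(x, rng.normal(size=(k, d, f)) * 0.1, np.zeros(f))
bottleneck = blstm_bottleneck(feats, lstm_params(f, h), lstm_params(f, h))
probs = maxent(bottleneck, rng.normal(size=(2 * h, labels)) * 0.1, np.zeros(labels))
```

In the paper's setting the same skeleton is applied to both modalities (word embeddings on the text side, frame-level emphasis/intonation features on the acoustic side) before the bottleneck features are combined; training details are beyond this sketch.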

