SIGKDD Explorations

Generating Better Search Engine Text Advertisements with Deep Reinforcement Learning


Abstract

Deep Reinforcement Learning has been applied in a number of fields to directly optimize non-differentiable reward functions, including in sequence-to-sequence settings via Self-Critical Sequence Training (SCST). Previously, SCST has primarily been used to bring conditional language models closer to the distribution of their training set, as in traditional neural machine translation and abstractive summarization. We frame the generation of search engine text ads as a sequence-to-sequence problem and consider two related goals: generating ads similar to those a human would write, and generating ads with high click-through rates. We jointly train a model to minimize cross-entropy on an existing corpus of Landing Page/Text Ad pairs using typical sequence-to-sequence training techniques, while also using SCST to optimize the expected click-through rate (CTR) as predicted by an existing oracle model. Through joint training we achieve a 6.7% increase in expected CTR without a meaningful drop in ROUGE score. Human experiments demonstrate that SCST training produces significantly more attractive ads without reducing grammatical quality.
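The joint objective the abstract describes — standard cross-entropy on the corpus combined with an SCST policy-gradient term whose reward is the oracle-predicted CTR — can be sketched as below. This is a minimal illustration, not the paper's implementation: the mixing weight `alpha`, the function name, and the scalar (per-sequence) formulation are all assumptions.

```python
def scst_joint_loss(ce_loss, logp_sample, r_sample, r_greedy, alpha=0.5):
    """Sketch of a joint cross-entropy + SCST loss for one sequence.

    ce_loss:      cross-entropy of the ground-truth ad under the model
    logp_sample:  log-probability of a sampled ad under the model
    r_sample:     reward (oracle-predicted CTR) of the sampled ad
    r_greedy:     reward of the greedily decoded ad (self-critical baseline)
    alpha:        illustrative mixing weight between the two objectives
    """
    # SCST is REINFORCE with the greedy decode's reward as baseline:
    # sequences that beat the baseline have their log-prob pushed up,
    # sequences that fall short have it pushed down.
    advantage = r_sample - r_greedy
    scst_loss = -advantage * logp_sample
    return alpha * ce_loss + (1.0 - alpha) * scst_loss
```

In a real training loop these quantities come from a differentiable decoder, so minimizing this loss backpropagates through both `ce_loss` and `logp_sample`; when the sampled ad's predicted CTR equals the greedy baseline's, the SCST term vanishes and only the cross-entropy term remains.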
