Knowledge and Information Systems

Controlled permutations for testing adaptive learning models



Abstract

We study the evaluation of supervised learning models that adapt to a data distribution changing over time (concept drift). The standard testing procedure, which simulates the online arrival of data (test-then-train), may not be sufficient to generalize about performance: a single test shows how well a model adapts to one fixed configuration of changes, while the ultimate goal is to assess adaptation to changes that happen unexpectedly. We propose a methodology for obtaining datasets for multiple tests by permuting the order of the original data. A random permutation is not suitable, as it makes the data distribution uniform over time and destroys the adaptive learning task. We therefore propose three controlled permutation techniques that acquire new datasets by introducing restricted variations in the order of examples. Control mechanisms with theoretical guarantees of preserving distributions ensure that the new sets represent close variations of the original learning task. Complementary tests on such sets make it possible to analyze how sensitive the performance is to variations in how changes happen, and in this way enrich the assessment of adaptive supervised learning models.
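The abstract does not spell out the three permutation techniques, but the core idea of a restricted reordering can be sketched. The block-swap scheme below is an illustrative assumption, not the paper's actual algorithm: the stream is cut into consecutive blocks and only neighbouring blocks may exchange places, so every example stays within a bounded distance of its original position and the local distribution over time is largely preserved, while the exact timing of changes varies between permutations.

```python
import random

def controlled_permutation(data, block_size=50, seed=0):
    """Split a time-ordered stream into consecutive blocks and swap some
    adjacent block pairs. Each block takes part in at most one swap, so an
    example is displaced by at most block_size positions."""
    rng = random.Random(seed)
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    i = 0
    while i < len(blocks) - 1:
        if rng.random() < 0.5:                       # swap with the right neighbour
            blocks[i], blocks[i + 1] = blocks[i + 1], blocks[i]
            i += 2                                   # neither block is swapped again
        else:
            i += 1
    return [x for block in blocks for x in block]

stream = list(range(1000))                           # stand-in for a time-ordered dataset
variant = controlled_permutation(stream, block_size=100, seed=1)
assert sorted(variant) == stream                     # same examples, new arrival order
assert max(abs(v - i) for i, v in enumerate(variant)) <= 100  # bounded displacement
```

Running the generator with several seeds yields multiple close variants of the original task, on which a test-then-train evaluation can be repeated to measure how sensitive an adaptive model's performance is to when exactly the changes occur.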
