...
WSEAS Transactions on Mathematics

Adopting some good practices to avoid overfitting in the use of Machine Learning

Abstract

In Machine Learning, different techniques, methods, and algorithms are applied in order to better approach the problem being solved. Adaptive learning, self-organization of information, generalization, fault tolerance, and real-time operation are among the properties most exploited in this field. These systems are dynamic and can learn from the data, adapting to the nature of the information. However, excessive adaptation to the training data can in many cases lead to poor generalization: training too long on the same data set causes the classification curves to over-detail the incidental variations of that set. Certain precautions can be taken to avoid this overfitting. One option is to apply regularization while keeping all the variables; this technique works well when there are many input parameters and each contributes "a little" to the prediction. We conclude that the number of input features relative to the number of training samples is really important for avoiding overfitting.
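The regularization idea described in the abstract can be illustrated with a minimal sketch (assumptions: plain Python with no ML libraries, a single input feature, no intercept term). An L2 penalty `lam` shrinks the fitted weight toward zero instead of discarding the variable, so the feature is kept but contributes less; the data values below are invented for illustration.

```python
def ridge_weight(xs, ys, lam):
    """Closed-form ridge (L2-regularized) solution for one feature,
    no intercept: w = sum(x*y) / (sum(x*x) + lam).
    With lam = 0 this reduces to ordinary least squares."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

# Tiny illustrative training set (invented values).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.1, 1.9, 3.2, 3.9]

w_plain = ridge_weight(xs, ys, lam=0.0)   # unregularized fit
w_ridge = ridge_weight(xs, ys, lam=10.0)  # penalized fit: smaller weight
print(w_plain, w_ridge)
```

The penalized weight is strictly smaller in magnitude than the unregularized one; with many input features the same shrinkage is applied to every weight, which is why the technique "keeps all the variables" while still damping the over-detailed fit.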
