Analytica chimica acta

Improved variable reduction in partial least squares modelling based on Predictive-Property-Ranked Variables and adaptation of partial least squares complexity



Abstract

The calibration performance of partial least squares for one response variable (PLS1) can be improved by elimination of uninformative variables. Many methods are based on so-called predictive variable properties, which are functions of various PLS-model parameters, and which may change during the variable reduction process. In these methods, variable reduction operates on the variables ranked in descending order of a given variable property. The methods start with full-spectrum modelling. Iteratively, until a specified number of remaining variables is reached, the variable with the smallest property value is eliminated; a new PLS model is calculated, followed by a renewed ranking of the variables. The Stepwise Variable Reduction methods using Predictive-Property-Ranked Variables are denoted as SVR-PPRV. In the existing SVR-PPRV methods the PLS model complexity is kept constant during the variable reduction process. In this study, three new SVR-PPRV methods are proposed, in which a possibility for decreasing the PLS model complexity during the variable reduction process is built in. We therefore denote these methods as PPRVR-CAM methods (Predictive-Property-Ranked Variable Reduction with Complexity Adapted Models). The selective and predictive abilities of the new methods are investigated and tested, using the absolute PLS regression coefficients as the predictive property. They were compared with two modifications of existing SVR-PPRV methods (with constant PLS model complexity) and with two reference methods: uninformative variable elimination followed by either a genetic algorithm for PLS (UVE-GA-PLS) or an interval PLS (UVE-iPLS). The performance of the methods is investigated on two near-infrared (NIR) data sets and one simulated set. The selective and predictive performances of the variable reduction methods are compared statistically using the Wilcoxon signed rank test. The three newly developed PPRVR-CAM methods were able to retain significantly smaller numbers of informative variables than the existing SVR-PPRV, UVE-GA-PLS and UVE-iPLS methods without loss of prediction ability. In contrast to UVE-GA-PLS and UVE-iPLS, there is no variability in the number of retained variables in each PPRV(R) method. Renewed variable ranking after deletion of a variable, followed by remodelling, combined with the possibility of decreasing the PLS model complexity, is beneficial. A preferred PPRVR-CAM method is proposed.
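The abstract describes the core SVR-PPRV loop precisely enough to sketch it: rank the retained variables by a predictive property (here the absolute PLS1 regression coefficients), delete the variable with the smallest value, refit the PLS model, re-rank, and repeat until a target number of variables remains. The sketch below, using scikit-learn's PLSRegression on simulated data, is only an illustration of that loop under assumed names (pprv_reduction, n_keep), not the authors' implementation; the complexity adaptation that distinguishes the PPRVR-CAM methods is noted in a comment rather than implemented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression


def pprv_reduction(X, y, n_keep, n_components):
    """Stepwise variable reduction with Predictive-Property-Ranked Variables.

    Ranks variables by the absolute value of their PLS1 regression
    coefficient, deletes the lowest-ranked variable, refits the model on
    the remaining variables and re-ranks, until n_keep variables remain.
    The model complexity (n_components) is held constant here; a
    PPRVR-CAM-style variant would additionally allow the complexity to
    decrease during the reduction.
    """
    keep = np.arange(X.shape[1])                  # indices of retained variables
    while keep.size > n_keep:
        pls = PLSRegression(n_components=min(n_components, keep.size))
        pls.fit(X[:, keep], y)
        prop = np.abs(np.ravel(pls.coef_))        # predictive property: |b_j|
        keep = np.delete(keep, np.argmin(prop))   # drop least predictive variable
    return keep


# Small simulated example: 60 samples, 150 variables, only the first 5 informative
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 150))
y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=60)
print(sorted(pprv_reduction(X, y, n_keep=10, n_components=3)))
```

A PPRVR-CAM-style extension would, after deletions, re-estimate the number of latent variables (for example by cross-validation) and let it decrease as uninformative variables are removed, which is the adaptation the abstract reports as beneficial.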
