...
Journal: Information Systems

A quantitative approach for the comparison of additive local explanation methods

Abstract

Local additive explanation methods are increasingly used to understand the predictions of complex Machine Learning (ML) models. The most widely used additive methods, SHAP and LIME, suffer from limitations that are rarely measured in the literature. This paper aims to measure these limitations on a wide range (304) of OpenML datasets using six quantitative metrics, and also evaluates emergent coalitional-based methods designed to tackle the weaknesses of the other methods. We illustrate and validate the results on a specific medical dataset, SA-Heart. Our findings reveal that LIME's and SHAP's approximations are particularly efficient in high dimensions and generate intelligible global explanations, but they suffer from a lack of precision in their local explanations and possibly unwanted behavior when the methods' parameters are changed. Coalitional-based methods are computationally expensive in high dimensions, but offer higher-quality local explanations. Finally, we present a roadmap summarizing our work by pointing out the most appropriate method depending on dataset dimensionality and the user's objectives. (c) 2023 Elsevier Ltd. All rights reserved.
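To make the "additive" and "coalitional" terms in the abstract concrete, the sketch below computes exact Shapley attributions by enumerating every feature coalition. This is an illustrative pure-Python example, not the paper's code: the toy linear model, feature values, and baseline are all assumptions. The 2^d coalition enumeration shows why coalitional methods become expensive in high dimensions, and the final assertion checks the additive property (attributions sum to the difference between the prediction and the baseline prediction).

```python
from itertools import combinations
from math import factorial

def exact_shapley(f, x, baseline):
    """Exact Shapley attributions for the prediction f(x) relative to a
    baseline input. Enumerates all coalitions of the other features, so the
    cost grows as 2^d with the number of features d."""
    d = len(x)
    phi = [0.0] * d
    for i in range(d):
        others = [j for j in range(d) if j != i]
        for size in range(d):
            # Shapley kernel weight for coalitions of this size.
            weight = factorial(size) * factorial(d - size - 1) / factorial(d)
            for S in combinations(others, size):
                # Build the coalition input: features in S take their real
                # values, all others stay at the baseline.
                z = list(baseline)
                for j in S:
                    z[j] = x[j]
                without_i = f(z)
                z[i] = x[i]
                with_i = f(z)
                phi[i] += weight * (with_i - without_i)
    return phi

# Hypothetical toy model: for a linear f, the Shapley value of feature j
# is exactly w[j] * (x[j] - baseline[j]).
w = [2.0, -1.0, 0.5]
f = lambda z: sum(wj * zj for wj, zj in zip(w, z))
x, base = [1.0, 2.0, 4.0], [0.0, 0.0, 0.0]
phi = exact_shapley(f, x, base)

# Additivity: the attributions sum to f(x) - f(baseline).
assert abs(sum(phi) - (f(x) - f(base))) < 1e-9
```

Approximate methods such as SHAP's sampling variants and LIME avoid this exponential enumeration, which is the efficiency/precision trade-off the paper quantifies.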
