Publication: An in-depth analysis of KernelSHAP and SamplingSHAP: assessing robustness, error, and efficiency
Date
2025
Publisher
Springer Science and Business Media Deutschland GmbH
Abstract
The growing importance of explainable artificial intelligence (XAI) has brought SHAP (SHapley Additive exPlanations) to the forefront as one of the most widely adopted model-agnostic explanation frameworks. Among its variants, KernelSHAP and SamplingSHAP are two prominent methods that rely on different sampling-based approximation techniques to estimate Shapley values. Despite their popularity, a systematic comparison between these two approaches, particularly with respect to their sampling mechanisms, remains limited. This study fills this gap by providing a detailed comparative analysis of KernelSHAP and SamplingSHAP, focusing on their robustness, error rates, and runtime efficiency under varying sampling strategies. The evaluation begins with an in-depth analysis of the KernelSHAP method, incorporating minor adjustments to its sampling mechanism: varying the number of samples, using different selection strategies, applying alternative sample weighting methods in the linear model, and adjusting the sampling intensity based on feature importance values. Following this focused analysis of KernelSHAP, a comprehensive comparison between KernelSHAP and SamplingSHAP is conducted, investigating how variations in sample size independently affect each method's robustness, error, and runtime. The analyses are applied across multiple datasets, providing insight into each method's operational efficiencies and potential limitations. Overall, this research offers a comparative perspective on the practical efficiencies and trade-offs of each explanatory approach, guiding the selection of suitable explanation mechanisms for the diverse analytical requirements of XAI.
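For readers who want a concrete sense of the comparison, the `shap` library exposes both estimators as `shap.KernelExplainer` and `shap.SamplingExplainer`, and both accept an `nsamples` budget when computing explanations. The sketch below varies that budget and records runtime plus the disagreement between the two estimates. It is illustrative only: the dataset, model, sample counts, and disagreement metric are assumptions for demonstration, not the paper's experimental setup.

```python
# Minimal sketch comparing KernelSHAP and SamplingSHAP under varying
# sampling budgets. Dataset, model, and budgets are illustrative
# assumptions, not the setup used in the paper.
import time

import numpy as np
import shap
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

X, y = load_breast_cancer(return_X_y=True)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Background data summarizing the feature distribution; both explainers
# use it to simulate "missing" features when forming coalitions.
background = shap.sample(X, 50, random_state=0)
instances = X[:5]  # a few test instances to explain

def predict_pos(data):
    """Probability of the positive class, as a 1-D array."""
    return model.predict_proba(data)[:, 1]

kernel_exp = shap.KernelExplainer(predict_pos, background)
sampling_exp = shap.SamplingExplainer(predict_pos, background)

for nsamples in (100, 500, 2000):  # sampling budget per explained instance
    t0 = time.perf_counter()
    phi_kernel = kernel_exp.shap_values(instances, nsamples=nsamples)
    t_kernel = time.perf_counter() - t0

    t0 = time.perf_counter()
    phi_sampling = sampling_exp.shap_values(instances, nsamples=nsamples)
    t_sampling = time.perf_counter() - t0

    # Crude proxy for estimation error: mean absolute disagreement
    # between the two approximations of the same Shapley values.
    gap = np.abs(np.asarray(phi_kernel) - np.asarray(phi_sampling)).mean()
    print(f"nsamples={nsamples:5d}  kernel={t_kernel:6.2f}s  "
          f"sampling={t_sampling:6.2f}s  mean |gap|={gap:.4f}")
```

Because exact Shapley values are intractable for arbitrary models, the mean absolute gap above serves only as a rough convergence proxy; a study such as this one would instead measure each method's error against a trusted reference estimate.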
