Loughborough University

Supplementary information files for "A meta-heuristic approach to estimate and explain classifier uncertainty"

Dataset posted on 2025-05-12, 10:03, authored by Andrew Houston and Georgina Cosma

Supplementary files for article "A meta-heuristic approach to estimate and explain classifier uncertainty"

Trust is a crucial factor affecting the adoption of machine learning (ML) models. Qualitative studies have revealed that end-users, particularly in the medical domain, need models that can express their uncertainty in decision-making, allowing users to know when to ignore the model’s recommendations. However, existing approaches for quantifying decision-making uncertainty are not model-agnostic, or they rely on complex mathematical derivations that are not easily understood by laypersons or end-users, making them less useful for explaining the model’s decision-making process. This work proposes a set of class-independent meta-heuristics that characterise the complexity of an instance in terms of factors relevant to both human and ML decision-making. The measures are integrated into a meta-learning framework that estimates the risk of misclassification. The proposed framework outperformed predicted probabilities and entropy-based methods at identifying instances at risk of being misclassified. Furthermore, the proposed approach produced uncertainty estimates that are more independent of model accuracy and calibration than those of existing approaches. The proposed measures and framework demonstrate promise for improving model development for more complex instances and provide a new means of model abstention and explanation.
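To illustrate the general idea described above, the sketch below contrasts an entropy-of-predicted-probabilities baseline with a simple meta-learner trained on held-out data to predict whether a base classifier will misclassify an instance. This is a hedged illustration only, not the article's code: the dataset, models, and raw-feature inputs are placeholder choices and do not implement the proposed class-independent meta-heuristics.

# Illustrative sketch (not the authors' implementation): entropy baseline vs.
# a meta-learner that estimates misclassification risk on held-out data.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_meta, X_eval, y_meta, y_eval = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Base classifier whose errors we want to anticipate (kept deliberately weak
# so that both held-out splits contain some misclassifications).
base = RandomForestClassifier(n_estimators=25, max_depth=5, random_state=0).fit(X_train, y_train)

def predictive_entropy(probs):
    # Shannon entropy of the predicted class distribution (baseline uncertainty score).
    p = np.clip(probs, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=1)

# Baseline: rank evaluation instances by predictive entropy.
entropy_eval = predictive_entropy(base.predict_proba(X_eval))

# Meta-learner: label a held-out split by whether the base model errs there,
# then fit a secondary model to estimate that risk for unseen instances.
# Here the meta-learner sees the raw features; the article instead uses
# instance-complexity meta-heuristics as inputs.
meta_labels = (base.predict(X_meta) != y_meta).astype(int)
meta = LogisticRegression(max_iter=1000).fit(X_meta, meta_labels)
risk_eval = meta.predict_proba(X_eval)[:, 1]

# Compare how well each score flags the base model's errors on the evaluation split.
errors_eval = (base.predict(X_eval) != y_eval).astype(int)
print("Entropy baseline AUC for flagging errors:", roc_auc_score(errors_eval, entropy_eval))
print("Meta-learner AUC for flagging errors:    ", roc_auc_score(errors_eval, risk_eval))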

© The Author(s), CC BY 4.0

Funding

DMRC

Loughborough University


School

  • Science

Categories

  • Computer Science
