This is an overview page with metadata for this scientific work. The full article is available from the publisher.
TRANSFORMING BLACK BOX MODELS INTO TRANSPARENT SYSTEMS THROUGH EXPLAINABLE AI METHODS
Citations: 0
Authors: 1
Year: 2025
Abstract
The rapid integration of AI into critical domains such as healthcare, banking, and autonomous vehicles has intensified the demand for interpretability and accountability in machine learning models. Black-box models such as deep neural networks and ensemble methods deliver strong predictive performance, yet because they do not reveal how they reach their decisions, users struggle to accept, trust, and collaborate with them. Explainable AI (XAI) offers a way to bridge this gap by reducing complex systems to more understandable and observable forms. This research examines a range of XAI approaches, including data visualization tools, inherently interpretable models, and model-agnostic methods such as SHAP and LIME. By exposing feature importance, causal links, and decision pathways, XAI supports fairer algorithmic decision-making, more trustworthy results, and easier debugging. The paper then discusses open challenges such as consistency, scalability, and the risk of oversimplification; striking a balance between clarity and fidelity is crucial. By transforming black-box models into transparent systems, XAI lays the groundwork for the ethical deployment of AI in high-stakes real-world settings and enables effective human-AI collaboration.
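For readers who want a concrete starting point, the sketch below illustrates the kind of model-agnostic explanation workflow the abstract refers to. It is a minimal illustration, not code from the paper: it assumes a scikit-learn random forest as the black-box model, the publicly available `shap` and `lime` packages, and the breast-cancer dataset purely as an example.

```python
# Minimal sketch of model-agnostic explanations with SHAP and LIME.
# Assumptions (not from the paper): scikit-learn, shap, and lime are installed,
# and a RandomForestClassifier stands in for the "black box" model.
import shap
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Train an opaque ensemble model on an example dataset.
data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# SHAP: additive feature-attribution values for tree ensembles,
# usable for both global and per-prediction feature importance.
shap_explainer = shap.TreeExplainer(model)
shap_values = shap_explainer.shap_values(X_test)

# LIME: fit a local surrogate model around a single prediction
# and report the features that drove that one decision.
lime_explainer = LimeTabularExplainer(
    X_train,
    feature_names=data.feature_names,
    class_names=data.target_names,
    mode="classification",
)
explanation = lime_explainer.explain_instance(
    X_test[0], model.predict_proba, num_features=5)
print(explanation.as_list())  # top features for this single instance
```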
Related works
Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization
2017 · 20,562 citations
Generative Adversarial Nets
2023 · 19,892 citations
Visualizing and Understanding Convolutional Networks
2014 · 15,298 citations
"Why Should I Trust You?"
2016 · 14,384 citations
On a Method to Measure Supervised Multiclass Model’s Interpretability: Application to Degradation Diagnosis (Short Paper)
2024 · 13,164 citations