From SHAP to EBM: Explain your Gradient Boosting Models in Python

Imagine you’ve just developed a credit rating model for a bank. The backtesting metrics look fantastic, and you get the green light to go live. But then, the first loan applications start rolling in, and suddenly your model’s decisions are under scrutiny.

Why was this application rejected? Why was that one approved? You must have answers—and fast.

This session has you covered.

We’ll dive into SHAP (SHapley Additive exPlanations) and EBM (Explainable Boosting Machine), two widely used methods for interpreting tree-based ensemble models like XGBoost. You’ll learn about their theory, strengths, and limitations, and see them in action with Python examples using the `shap` and `interpret` (InterpretML) libraries.

From understanding feature contributions to hands-on coding, this talk will equip you with practical tools to make complex models transparent, understandable, and ready for critical applications.

First delivered at Swiss Python Summit 2024, Zurich, Switzerland
Also held at Kaggle Days Milan 2024, Italy
Recorded session at: https://youtu.be/hnZjw77-1rE?si=9iz2KXMBoIDPQ-9a

Emanuele Fabbiani

Head of AI at xtream, Professor at Catholic University of Milan

Milan, Italy
