From Prediction to Intuition: Explainable AI with Counterfactuals and Genetic Search
Counterfactual explanations — answering the question “what would need to change for a different outcome?” — are among the most powerful tools in the Explainable AI toolbox. They bridge the gap between abstract model reasoning and actionable insights. In this talk, we go beyond conventional methods and explore how genetic algorithms can evolve counterfactuals that are both realistic and actionable, offering fresh ways to understand data and model behavior.
Drawing from real-world scenarios and code examples using the German Credit Risk dataset, we’ll demonstrate how to:
* Use genetic algorithms to search for minimal, plausible input changes that flip model predictions (a minimal sketch follows this list).
* Evaluate and constrain counterfactuals for realism and interpretability.
* Detect potential model flaws and dataset biases through systematic “what-if” analysis.
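To give a taste of the approach, here is a minimal, self-contained sketch of the idea: a small hand-rolled genetic loop (the session itself may use a library such as DEAP) that evolves a counterfactual for a toy credit-scoring function. The feature names, bounds, and the `predict_good_credit` function are illustrative placeholders, not the actual model or features from the German Credit Risk dataset.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical black-box classifier: returns the probability of "good credit".
# In practice this would be a trained model's predict_proba.
def predict_good_credit(x):
    # x = [duration_months, credit_amount, age] -- illustrative features only
    score = -0.03 * x[0] - 0.0002 * x[1] + 0.02 * x[2] + 1.0
    return 1.0 / (1.0 + np.exp(-score))

original = np.array([48.0, 9000.0, 30.0])   # instance predicted "bad credit"
lower    = np.array([6.0,   500.0, 18.0])   # plausibility bounds per feature
upper    = np.array([72.0, 20000.0, 75.0])
scale    = upper - lower

def fitness(candidate):
    """Lower is better: normalized distance from the original instance,
    plus a penalty if the prediction has not flipped to 'good credit'."""
    distance = np.sum(np.abs(candidate - original) / scale)
    penalty = 0.0 if predict_good_credit(candidate) > 0.5 else 10.0
    return distance + penalty

def mutate(candidate, sigma=0.05):
    """Gaussian mutation, clipped to the plausible feature ranges."""
    child = candidate + rng.normal(0.0, sigma * scale)
    return np.clip(child, lower, upper)

def crossover(a, b):
    """Uniform crossover between two candidate counterfactuals."""
    mask = rng.random(a.shape) < 0.5
    return np.where(mask, a, b)

# Simple generational GA: tournament selection, crossover, mutation.
pop = np.array([mutate(original, sigma=0.2) for _ in range(100)])
for generation in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    children = []
    for _ in range(len(pop)):
        i, j = rng.integers(len(pop), size=2)            # tournament of size 2
        parent1 = pop[i] if scores[i] < scores[j] else pop[j]
        i, j = rng.integers(len(pop), size=2)
        parent2 = pop[i] if scores[i] < scores[j] else pop[j]
        children.append(mutate(crossover(parent1, parent2)))
    pop = np.array(children)

best = min(pop, key=fitness)
print("original:      ", original, "->", round(predict_good_credit(original), 3))
print("counterfactual:", np.round(best, 1), "->", round(predict_good_credit(best), 3))
```

The fitness function encodes the two competing goals of a counterfactual search: stay close to the original instance (minimal change) while crossing the decision boundary (different outcome); the clipping step is one simple way to keep candidates within plausible feature ranges.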
Key Takeaways:
* Generate counterfactual explanations with genetic algorithms to enhance transparency and trust.
* Reveal model weaknesses and dataset flaws through structured “what-if” analysis.
* Integrate counterfactual techniques into real-world AI workflows with practical Python examples.
Audience:
This session is ideal for data scientists, ML practitioners, and AI educators who want practical, optimization-driven tools for explaining black-box models. Whether you’re designing responsible models, auditing decisions, or teaching interpretability, you’ll leave with strategies to evolve your explanations — literally.
Level: Intermediate
Keywords: Explainable AI, Responsible AI, Counterfactuals, Genetic Algorithms, Model Interpretability, Python, Optimization
Based on my book 'Hands-On Genetic Algorithms with Python', 2nd edition

Eyal Wirsansky
Senior data scientist, Artificial Intelligence mentor
Jacksonville, Florida, United States