STAI CDT PhD student Jessica Lally recently presented her work at her first academic conference, the 4th Conference on Causal Learning and Reasoning (CLeaR 2025), held in Lausanne, Switzerland, in May.
CLeaR is dedicated to advancing research in causality and causal learning, bringing together leading experts from across the world. Jessica presented a poster based on the paper ‘Counterfactual Influence in Markov Decision Processes’. She explains the research as follows:
“Counterfactual thinking is about exploring ‘what-if’ scenarios: what would have happened had we changed some decisions we made in the past. For example, a doctor might wonder how a patient’s recovery would have changed if they had prescribed a different treatment. We can apply the same idea to AI agents and consider how outcomes might have improved if the agent had taken different actions.
Our paper focuses on identifying which areas of the agent’s environment we can learn about from a given observation. By focusing on areas where we have reliable information, we can narrow the search for better counterfactual decision-making policies. This approach improves the safety and reliability of AI, especially in safety-critical real-world applications, like healthcare and transportation.”
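To make the ‘what-if’ idea concrete, here is a minimal illustrative sketch (a toy example for intuition, not the algorithm from the paper): in a tiny MDP whose transition randomness comes from exogenous noise, a counterfactual roll-out replays the same noise while changing the agent’s actions, so we can compare what happened with what would have happened.

```python
import random

# Toy MDP: the state is a position on a line; actions are -1 (left) or
# +1 (right); the next state is state + action + noise, where the noise
# is exogenous (drawn independently of the agent's decisions).

def rollout(actions, noise):
    """Replay a trajectory under a fixed action sequence and fixed noise."""
    state = 0
    for a, eps in zip(actions, noise):
        state = state + a + eps
    return state

random.seed(0)
horizon = 5
noise = [random.choice([0, 1]) for _ in range(horizon)]  # the 'world' we observed

factual_actions = [+1, +1, -1, +1, -1]          # what the agent actually did
counterfactual_actions = [+1, +1, +1, +1, +1]   # what if it had always gone right?

factual = rollout(factual_actions, noise)
# The counterfactual keeps the same exogenous noise (the same 'world')
# and changes only the decisions.
counterfactual = rollout(counterfactual_actions, noise)
print(factual, counterfactual)
```

Because the noise is held fixed, the gap between the two outcomes is attributable purely to the changed decisions; real counterfactual analysis in MDPs must additionally reason about which parts of the environment the observed trajectory actually gives us reliable information about, which is the focus of the paper.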
Reflecting on the experience, Jessica said:
“Although I was a bit nervous about presenting, everyone was very welcoming and asked really thoughtful questions.”
Alongside her poster presentation, Jessica has recently co-authored another paper, ‘Causal Temporal Reasoning for Markov Decision Processes’, published in Research Directions: Cyber-Physical Systems (Volume 3, e3). As she describes:
“This paper explores how causal and counterfactual reasoning can be used to verify AI systems. By introducing a new kind of logic focused on cause and effect, we can assess how changing system settings (such as different AI policies) can improve the system’s safety and correctness, based on its observed behaviour.”
For Jessica, attending conferences is one of the most exciting and rewarding aspects of her PhD journey. She says she is “really enjoying [her] PhD so far,” especially appreciating “the opportunity to explore topics I care about and make meaningful contributions to keeping AI systems safe.”
She also credits the STAI CDT community for enriching her research experience:
“Being part of STAI CDT has been extremely rewarding as it has allowed me to connect with other PhD students working in causality and related fields, which I have found invaluable for discussions and collaboration.”
It’s exciting to hear about Jessica’s recent contributions to the research community. You can read the papers below:
Kazemi, M., Lally, J., Tishchenko, E., Chockler, H. & Paoletti, N. (2025). Counterfactual Influence in Markov Decision Processes. In Proceedings of the 4th Conference on Causal Learning and Reasoning (CLeaR 2025).
Kazemi, M., Lally, J. & Paoletti, N. (2025). Causal Temporal Reasoning for Markov Decision Processes. Research Directions: Cyber-Physical Systems, Volume 3, e3.