Dr. Heidy Khlaaf is a Principal Research Scientist at the AI Now Institute, specialising in the safety of AI within autonomous weapons systems. She has expertise in leading system safety audits and has helped establish the field of AI Safety Engineering. Khlaaf's work focuses on the evaluation, specification, and verification of complex or autonomous software implementations in safety-critical systems.
Khlaaf received her PhD in Computer Science (Formal Verification) from University College London in 2017, advised by Nir Piterman. Her research focused on the temporal verification, termination, and non-termination of infinite-state software systems. She also holds a Bachelor of Science in Computer Science and Philosophy, with a minor in Mathematics, from Florida State University, graduating with honours and highest distinction.
Dr. Khlaaf currently works at the AI Now Institute, contributing to the assessment and safety of AI in autonomous weapons systems. Prior to this, she was the Engineering Director of ML Assurance at the cybersecurity firm Trail of Bits and led the safety evaluation of Codex at OpenAI.
Khlaaf has published several notable works on the evaluation, specification, and verification of AI and safety-critical systems.
She has also been featured in various media outlets, including TIME, NPR, Politico, Vox, and WIRED, providing commentary and insights on AI-related topics.
Khlaaf received the prestigious NSF GRFP award and the Best Paper Award at CAV 2015 for her work on the automated verification of infinite-state systems. She has presented at numerous conferences and events, including SRECon, Papers We Love, and F# eXchange.
Heidy Khlaaf is a Principal Research Scientist at the AI Now Institute, where she focuses on the assessment and safety of AI within autonomous weapons systems. She previously worked as the Engineering Director of ML Assurance at the cybersecurity firm Trail of Bits.
Khlaaf received a Bachelor of Science from Florida State University with dual degrees in Computer Science and Philosophy and a minor in Mathematics, graduating with honours and highest distinction. She went on to complete a Computer Science PhD (Formal Verification) at University College London in 2017, where her work focused on the temporal verification, termination, and non-termination of infinite-state software systems.
Khlaaf has extensive expertise in leading system safety audits, from UAVs to large nuclear power plants, contributing to the construction of safety cases for safety-critical software. She has helped establish the field of AI Safety Engineering and led the safety evaluation of Codex at OpenAI, developing a framework that measures a model's performance outcomes against a cross-functional risk assessment. This framework is now a de facto methodology used across AI labs.
At Trail of Bits, Khlaaf led the cyber evaluations as part of the launch of the UK AI Safety Institute and unveiled the LeftoverLocals vulnerability. Her unique expertise at the intersection of Systems Software Engineering and Machine Learning has allowed her to lead and contribute to the development of standards and auditing frameworks for safety-related applications and their development, including policy and regulatory frameworks for US and UK regulators.
Khlaaf is currently part of the Network of Experts for the UNSG's AI Advisory Body and an ISO SC 42 (Artificial Intelligence) Committee Member via the British Standards Institute. She has been featured in several prominent media outlets, including TIME, NPR, Politico, Vox, and WIRED, and has received recognition for her research, such as the prestigious NSF GRFP award and a best paper award at CAV 2015.
Youtube Title: "Standards We Love" by Heidy Khlaaf [PWLConf 2018]
Youtube Link: link
Youtube Channel Name: PapersWeLove
Youtube Channel Link: https://www.youtube.com/@PapersWeLove
Youtube Title: SREcon19 Europe/Middle East/Africa - Applicable and Achievable Formal Verification
Youtube Link: link
Youtube Channel Name: USENIX
Youtube Channel Link: https://www.youtube.com/@UsenixOrg
Youtube Title: What we've learned from the women behind the AI revolution | Equity Podcast
Youtube Link: link
Youtube Channel Name: TechCrunch
Youtube Channel Link: https://www.youtube.com/@TechCrunch
Youtube Title: A Case for Correctly Rounded Math Libraries
Youtube Link: link
Youtube Channel Name: PapersWeLove
Youtube Channel Link: https://www.youtube.com/@PapersWeLove
Youtube Title: Creative computing for high performance architecture
Youtube Link: link
Youtube Channel Name: The Institution of Structural Engineers
Youtube Channel Link: https://www.youtube.com/@theinstitutionofstructural2470
Youtube Title: Breaking the Wall of Challenging Climate Change Communication
Youtube Link: link
Youtube Channel Name: Falling Walls Foundation
Youtube Channel Link: https://www.youtube.com/@FallingWallsFoundation
Youtube Title: Why do we need a coding standard in Software Development for Safety-Critical Environments
Youtube Link: link
Youtube Channel Name: PRQA
Youtube Channel Link: https://www.youtube.com/@prqatools
Youtube Title: Papers We Love too - Probabilistic Accuracy Bounds
Youtube Link: link
Youtube Channel Name: Fastly
Youtube Channel Link: https://www.youtube.com/@Fastly-company
Youtube Title: Michael Pigott on Toward a Generic Fault Tolerance Technique [PWL NYC]
Youtube Link: link
Youtube Channel Name: PapersWeLove
Youtube Channel Link: https://www.youtube.com/@PapersWeLove
Youtube Title: Douglas Creager, Pastry (Papers We Love BOS, July 2018)
Youtube Link: link
Youtube Channel Name: PapersWeLove
Youtube Channel Link: https://www.youtube.com/@PapersWeLove
Youtube Title: Health Care Workforce and Health Equity Inclusion Virtual Forum
Youtube Link: link
Youtube Channel Name: CMSHHSgov
Youtube Channel Link: https://www.youtube.com/@CMSHHSgov
Youtube Title: Papers We Love SF - Johnathan Chiu and Bruce Spang
Youtube Link: link
Youtube Channel Name: PapersWeLove
Youtube Channel Link: https://www.youtube.com/@PapersWeLove
Youtube Title: Trust, Media and Global Leadership | Strategies for Transformative Global Leadership
Youtube Link: link
Youtube Channel Name: World Academy of Art and Science
Youtube Channel Link: https://www.youtube.com/@worldacademyofart
Youtube Title: AI Weekly Update Overview - July 15th, 2021
Youtube Link: link
Youtube Channel Name: Connor Shorten
Youtube Channel Link: https://www.youtube.com/@connorshorten6311
Youtube Title: Evaluating Large Language Models Trained on Code
Youtube Link: link
Youtube Channel Name: Connor Shorten
Youtube Channel Link: https://www.youtube.com/@connorshorten6311