Soft law for unintentional empathy: addressing the governance gap in emotion-recognition AI technologies
Sep 1, 2025
Andrew McStay
Vian Bakir

Abstract
Despite regulatory efforts, there is a significant governance gap in managing emotion-recognition AI technologies and those that emulate empathy. This paper asks: should international soft law mechanisms, such as ethical standards, complement hard law in addressing governance gaps in emotion-recognition and empathy-emulating AI technologies? To argue that soft law can provide detailed guidance — particularly for research ethics committees and related boards advising on these technologies — the paper first explores how legal definitions of emotion recognition, especially in the EU AI Act, rest on reductive and physiognomic conceptions of emotion. It then details that systems may be designed to empathise with their users intentionally, but also that empathy may be unintentional, or effectively incidental to how these systems work. Approaches that are non-reductive and avoid the labelling of emotion as conceived in the EU AI Act raise novel governance questions.
Type
Publication
Journal of Responsible Technology, 23, Article 100126

Authors
Professor of Technology and Society at Bangor University and Director of the Emotional AI Lab. My research focuses on emotional and empathic AI, AI governance, and the risks of AI-enabled manipulation and scams. Author of Automating Empathy (OUP, 2023) and Chair of IEEE 7014.1-2026.