When Emotion AI Meets Strategic Users

  • Yifan Yu
  • Wendao Xue
  • Lin Jia*
  • Yong Tan*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

When organizations adopt artificial intelligence (AI) to recognize individuals’ negative emotions and allocate limited resources accordingly, strategic users are incentivized to game the system by misrepresenting their emotions. The value of AI in automating such emotion-driven allocation may be undermined by gaming behavior, algorithmic noise in emotion detection, and the spillover effect of negative emotions. We develop a game-theoretical model to understand emotion AI adoption, particularly in customer care, and analyze the design of the associated allocation policies. We find that adopting emotion AI is valuable if the spillover effect of negative emotions is negligible compared with resource misallocation loss, regardless of algorithmic noise and gaming behavior. We also quantify the welfare impacts of emotion AI on the users, organization, and society. Notably, a stronger AI is not always socially desirable, and regulation of emotion-driven allocation is needed. Finally, we characterize conditions under which leveraging the AI system is preferred to hiring human employees in emotion-driven allocation. We also explore the alternative application of using emotion AI to monitor strategic employees and compare it with hiring a human manager for monitoring. Intriguingly, algorithmic noise may increase the profit of AI monitoring. Our work provides implications for designing, adopting, and regulating emotion AI.

Original language: English
Pages (from-to): 627-645
Number of pages: 19
Journal: Management Science
Volume: 72
Issue number: 1
DOIs
Publication status: Published - Aug 2025

Keywords

  • emotion AI
  • muddled information
  • signaling game
  • strategic users
