Institut für Informatik
Publication Information

[Download PDF]
Matthias Schmidmaier, Jonathan Rupp, Cedrik Harrich, Sven Mayer
Using Nonverbal Cues in Empathic Multi-Modal LLM-Driven Chatbots for Mental Health Support
Proc. ACM Hum.-Comput. Interact., Association for Computing Machinery, 2025-09-01
  Despite their popularity in providing digital mental health support, mobile conversational agents primarily rely on verbal input, which limits their ability to respond to emotional expressions. We therefore envision using the sensory equipment of today's devices to increase the nonverbal, empathic capabilities of chatbots. We initially validated that multi-modal LLMs (MLLM) can infer emotional expressions from facial expressions with high accuracy. In a user study (N=200), we then investigated the effects of such multi-modal input on response generation and perceived system empathy in emotional support scenarios. We found significant effects on cognitive and affective dimensions of linguistic expression in system responses, yet no significant increases in perceived empathy. Our research demonstrates the general potential of using nonverbal context to adapt LLM response behavior, providing input for future research on augmented interaction in empathic MLLM-based systems.
Last modified on 05.02.2007 by Richard Atterer (rev 1481)