Exploring Perceptions of Robot Facial Expressions in Service Interactions

Emma Kirjavainen, Celia Nieto Agraz, Ashley Colley, Heiko Mueller, Susanne Boll, Jonna Häkkilä

Research output: Contribution in book/report/conference proceedings › Conference article › Scientific › Peer-reviewed

Abstract

The problem of emotional alignment in human-robot interaction remains underexplored in real-time conversational contexts. In
this paper, we investigate the role of facial expressions in voice-based interactions using the Ameca humanoid robot, comparing
conditions where facial expressions are appropriate or inappropriate. In a user study (N = 28), participants interacted with the robot
in a simulated ticket-purchasing scenario. Our findings reveal that appropriate facial expressions significantly enhanced pragmatic
aspects of the interaction compared to inappropriate expressions, which, although sometimes confusing, could still provoke deeper
engagement. Participants noted that the interaction flow felt unnatural as facial expressions could only be rendered after the lip-synced
speech, rather than simultaneously. These results highlight the importance of emotional congruence in improving the naturalness of
human-robot interaction. This research informs the design and development of emotionally intelligent robots that can better align
their non-verbal cues with conversational contexts.
Original language: Not known
Title: Exploring Perceptions of Robot Facial Expressions in Service Interactions
Publisher: ACM
DOI - permanent links
Status: Published - 7 Oct 2025
Publication type (Ministry of Education): A4 Peer-reviewed article in conference proceedings