Vidya Somashekarappa: Look on my thesis, ye mighty – Gaze interaction and social robotics


Dissertation for the Ph.D. in Computational Linguistics at the Faculty of Humanities, Department of Philosophy, Linguistics and Theory of Science. The defence can also be followed via Zoom. Welcome!

Dissertation
Date
25 Apr 2024
Time
13:15 - 18:00
Location
Room J330, Humanisten, Renströmsgatan 6

Organizer
Department of Philosophy, Linguistics and Theory of Science

Respondent:
Vidya Thimmanahalli Somashekarappa, Department of Philosophy, Linguistics and Theory of Science

Thesis title:
"Look on my thesis, ye mighty" – Gaze interaction and social robotics

Examining committee:
Professor Danielle Matthews, University of Sheffield
Associate Professor Aleksandrs Berdicevskis, University of Gothenburg
Assistant Professor Emilia Barakova, Eindhoven University of Technology

Substitute if a committee member is unable to attend:
Docent Ellen Breitholtz, University of Gothenburg

Opponent:
Maîtresse de conférences (Senior Lecturer) Dominique Knutsen, Université de Lille

Chair:
Docent Eva-Marie Karin Bloom Ström

Abstract

Gaze, a significant non-verbal social signal, conveys attentional cues and provides insight into others' intentions and future actions. The thesis examines the intricate aspects of gaze in human-human dyadic interaction, aiming to extract insights that can enhance multimodal human-agent dialogue. By annotating various types of gaze behavior alongside speech, the thesis explores the meaning of temporal patterns in gaze cues and their correlations.

Leveraging a multimodal corpus of dyadic taste-testing interactions, the thesis further investigates the relationship between laughter, pragmatic functions, and accompanying gaze patterns. The findings reveal that laughter serves different pragmatic functions in association with distinct gaze patterns, underscoring the importance of laughter and gaze in multimodal meaning construction and coordination, which is relevant for designing human-like conversational agents.

The thesis also proposes a novel approach to gaze estimation using a neural network architecture that considers dynamic patterns of real-world gaze behavior in natural interaction. The framework aims to facilitate responsive and intuitive interaction by enabling robots/avatars to communicate with humans through natural multimodal dialogue. It performs unified gaze detection, gaze-object prediction, and object landmark heatmap generation.
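As a rough illustrative sketch only (not the thesis implementation; the PyTorch framing, layer sizes, and object/heatmap dimensions are all assumptions), such a unified multi-task architecture might share one backbone and branch into three task heads:

import torch
import torch.nn as nn

class MultiTaskGazeNet(nn.Module):
    """Hypothetical multi-task gaze network: one shared backbone,
    three heads for gaze detection, gaze-object prediction, and
    object landmark heatmap generation. Sizes are illustrative."""

    def __init__(self, num_objects=10, heatmap_size=64):
        super().__init__()
        # Shared convolutional backbone over the scene image.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        feat_dim = 64 * 8 * 8
        # Head 1: is gaze directed, and in which direction (yaw, pitch)?
        self.gaze_head = nn.Linear(feat_dim, 3)
        # Head 2: which object in the scene is being looked at?
        self.object_head = nn.Linear(feat_dim, num_objects)
        # Head 3: spatial heatmap over object landmarks.
        self.heatmap_head = nn.Linear(feat_dim, heatmap_size * heatmap_size)
        self.heatmap_size = heatmap_size

    def forward(self, image):
        feats = self.backbone(image).flatten(1)
        gaze = self.gaze_head(feats)
        obj_logits = self.object_head(feats)
        heatmap = self.heatmap_head(feats).view(
            -1, 1, self.heatmap_size, self.heatmap_size)
        return gaze, obj_logits, heatmap

# Example usage on a random batch of two 128x128 frames.
model = MultiTaskGazeNet()
gaze, obj_logits, heatmap = model(torch.randn(2, 3, 128, 128))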

Evaluation on annotated datasets demonstrates superior performance compared to previous methods, with promising implications for implementing contextualized gaze-tracking behavior in robotic interaction. Finally, the thesis investigates the impact of different robot gaze patterns on Human-Robot Interaction (HRI).

The results suggest that manipulating robot gaze based on human-human interaction patterns positively influences user perceptions, enhancing anthropomorphism and engagement.