Detecting Bids for Eye Contact Using a Wearable Camera

We propose a system for detecting bids for eye contact directed from a child to an adult who is wearing a point-of-view camera. The camera captures an egocentric view of the child-adult interaction from the adult's perspective. We detect and analyze the child's face in the egocentric video to automatically identify moments in which the child is trying to make eye contact with the adult. We present a learning-based method that couples a pose-dependent appearance model with a temporal Conditional Random Field (CRF). We report encouraging results from an experimental evaluation on a newly collected dataset of 12 children. Our method outperforms state-of-the-art approaches and enables measuring gaze behavior in naturalistic social interactions.
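To illustrate the temporal-CRF idea at a high level: given noisy per-frame scores for "eye contact" (e.g., from an appearance model applied to the detected face), a two-state chain CRF with a pairwise switching penalty can be decoded with Viterbi to produce temporally coherent eye-contact segments. This is only a minimal sketch of the general technique, not the paper's actual model; the scores and the `switch_penalty` value are illustrative assumptions.

```python
# Minimal sketch (not the paper's model): smooth noisy per-frame
# eye-contact scores with a two-state chain CRF (0 = no contact,
# 1 = eye contact), decoded exactly with the Viterbi algorithm.

def viterbi_smooth(frame_scores, switch_penalty=1.5):
    """frame_scores: per-frame log-odds of eye contact (higher = more likely).
    Returns the most likely binary label sequence under a chain CRF whose
    pairwise term penalizes label switches by `switch_penalty`."""
    n = len(frame_scores)
    # Unary log-potentials: state 1 earns the frame score, state 0 earns 0.
    unary = [(0.0, s) for s in frame_scores]
    prev = list(unary[0])
    back = []  # backpointers for frames 1..n-1
    for t in range(1, n):
        ptrs, cur = [], []
        for state in (0, 1):
            stay = prev[state]                       # same label as before
            switch = prev[1 - state] - switch_penalty  # pay to change label
            if stay >= switch:
                best, ptr = stay, state
            else:
                best, ptr = switch, 1 - state
            cur.append(best + unary[t][state])
            ptrs.append(ptr)
        prev = cur
        back.append(ptrs)
    # Backtrack the highest-scoring path.
    state = 0 if prev[0] >= prev[1] else 1
    path = [state]
    for ptrs in reversed(back):
        state = ptrs[state]
        path.append(state)
    return path[::-1]

# An isolated spurious detection is smoothed away, while a sustained
# run of high scores is kept as one eye-contact segment.
scores = [-2.0, -2.0, 1.0, -2.0, 3.0, 3.0, 3.0, -2.0, -2.0]
print(viterbi_smooth(scores))  # → [0, 0, 0, 0, 1, 1, 1, 0, 0]
```

The pairwise penalty trades off responsiveness against stability: a larger `switch_penalty` suppresses brief false positives but can also delay the onset of a detected eye-contact segment.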

Zhefan Ye, Yin Li, Yun Liu, Chanel Bridges, Agata Rozga, and James M. Rehg.
Detecting Bids for Eye Contact Using a Wearable Camera.
11th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2015). Best Student Paper Award.

Paper  |  Presentation (coming soon)  |  BibTeX (coming soon)


Multimodal Dyadic Behavior Dataset (MMDB)

Coming Soon

The authors would like to thank OMRON Corp. for providing the OKAO Vision software. This work was supported in part by NSF Expedition Award number 1029679, the Intel Science and Technology Center in Pervasive Computing, and the Simons Foundation Autism Research Initiative.


The documents contained in these directories are included by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without explicit permission of the copyright holder.