Eye contact in video conference via fusion of time-of-flight depth sensor and stereo

Jiejie Zhu, Ruigang Yang, Xueqing Xiang

Research output: Contribution to journal › Article › peer-review



In a video conference, a user can either look at the remote image of the other participant, captured by a video camera, or look at the camera that is capturing her or him, but not both at the same time. The lack of eye contact caused by this misalignment substantially reduces the effectiveness of communication and leaves an unpleasant feeling of disconnectedness. We propose an approach that restores eye contact while the user looks at the remote image of the other participant, using a novel system composed of a Time-of-Flight depth sensor and a traditional stereo camera pair. The key to the system's success is faithfully recovering the scene's depth; in this 2.5D space, controlling the user's eye gaze becomes relatively easy. To evaluate the performance of the system, we conducted two user studies: one with subjects trained to recognize eye gaze displayed in images, and a blind evaluation with subjects who had no prior knowledge about eye gaze. Both evaluations show that the system brings desktop participants closer to each other.
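The abstract describes fusing a Time-of-Flight depth map with stereo-derived depth to recover scene depth. As a minimal illustrative sketch (not the paper's exact formulation), one common way to fuse two aligned depth maps is per-pixel confidence weighting; the weighting scheme and confidence values below are hypothetical:

```python
import numpy as np

def fuse_depth(tof_depth, tof_conf, stereo_depth, stereo_conf):
    """Confidence-weighted fusion of two aligned H x W depth maps.

    tof_conf / stereo_conf are per-pixel weights in [0, 1] (hypothetical
    confidences, e.g. from sensor noise models or matching cost). Pixels
    where both confidences are zero fall back to the ToF reading.
    """
    w_tof = np.asarray(tof_conf, dtype=float)
    w_st = np.asarray(stereo_conf, dtype=float)
    total = w_tof + w_st
    fused = np.where(
        total > 0,
        (w_tof * tof_depth + w_st * stereo_depth) / np.maximum(total, 1e-9),
        tof_depth,  # fallback where neither source is confident
    )
    return fused

# Toy 2x2 example: trust stereo at textured pixels, ToF elsewhere.
tof = np.array([[1.0, 2.0], [3.0, 4.0]])
stereo = np.array([[1.2, 2.0], [2.8, 4.0]])
tof_c = np.array([[1.0, 1.0], [0.0, 0.5]])
st_c = np.array([[0.0, 1.0], [1.0, 0.5]])
print(fuse_depth(tof, tof_c, stereo, st_c))
```

With such a fused depth map, the 2.5D scene can be re-rendered from a virtual viewpoint aligned with the remote user's gaze, which is the step the abstract refers to as controlling the user's eye gaze.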

Original language: English
Article number: 5
Pages (from-to): 1-10
Number of pages: 10
Journal: 3D Research
Issue number: 3
State: Published - 2011


Keywords

  • Eye-gaze
  • Stereo
  • Time-of-Flight

ASJC Scopus subject areas

  • Software
  • Electrical and Electronic Engineering

