# OpenGaze: Open Source Toolkit for Camera-Based Gaze Estimation and Interaction
Appearance-based gaze estimation methods that require only an off-the-shelf camera have improved significantly and promise a wide range of new applications in gaze-based interaction and attentive user interfaces. However, these methods are not yet widely used in the human-computer interaction (HCI) community.

To democratize their use in HCI, we present OpenGaze, the first software toolkit specifically developed for gaze interface designers. OpenGaze is open source and aims to implement state-of-the-art methods for camera-based gaze estimation and interaction.

<img src="https://github.molgen.mpg.de/perceptual/opengaze/blob/master/imgs/logo_mpiinf.png" height="80"/><img src="https://github.molgen.mpg.de/perceptual/opengaze/blob/master/imgs/logo_pui.png" height="80"><img src="https://github.molgen.mpg.de/perceptual/opengaze/blob/master/imgs/logo_osaka-u.png" height="80">

## Functionality

The toolkit is capable of performing the following gaze-related tasks:
* **Gaze Estimation**

  Show the estimated gaze point on the screen, given the screen-camera relationship.

  [![Demo](https://img.youtube.com/vi/R1vb7mV3y_M/0.jpg)](https://youtu.be/R1vb7mV3y_M "Gaze estimation demo")

  <p> </p>
* **Gaze Visualization**

  Show gaze directions originating from the centers of the faces in the input image.

  [![Demo](https://img.youtube.com/vi/8yMTvvr0rRU/0.jpg)](https://youtu.be/8yMTvvr0rRU "Gaze visualization demo")

  <p> </p>
* **Personal Calibration**

  Perform personal calibration and remap the estimated gaze to targets on the screen.

  [![Demo](https://img.youtube.com/vi/ntBv1wcNGAo/0.jpg)](https://youtu.be/ntBv1wcNGAo "Personal calibration demo")

  <p> </p>
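OpenGaze itself is implemented in C++; the following standalone Python sketch is purely illustrative and is not part of the OpenGaze API (all function names here are our own). It shows the two geometric ideas behind the tasks above: intersecting a 3D gaze ray with the screen plane given the screen-camera relationship, and fitting a simple per-user polynomial calibration that remaps raw estimates to on-screen targets.

```python
import numpy as np

def gaze_ray_to_screen(origin, direction, R, t):
    """Intersect a 3D gaze ray (in camera coordinates) with the screen plane.

    R, t: rotation and translation mapping screen coordinates to camera
    coordinates (the screen-camera relationship)."""
    n = R[:, 2]                       # screen-plane normal in camera coords
    denom = direction @ n
    if abs(denom) < 1e-9:
        raise ValueError("gaze ray is parallel to the screen plane")
    s = ((t - origin) @ n) / denom    # distance along the ray to the plane
    p_cam = origin + s * direction    # intersection in camera coordinates
    p_scr = R.T @ (p_cam - t)         # back into screen coordinates
    return p_scr[:2]                  # (x, y) on the screen plane

def fit_calibration(estimated, targets):
    """Fit a 2nd-order polynomial mapping from raw gaze estimates to
    on-screen calibration targets via least squares."""
    x, y = estimated[:, 0], estimated[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x * x, y * y])
    W, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return W

def apply_calibration(W, point):
    """Remap one raw estimate through the fitted calibration."""
    x, y = point
    return np.array([1.0, x, y, x * y, x * x, y * y]) @ W
```

A gaze ray looking straight down the camera axis at a fronto-parallel screen, for example, lands at the screen origin; the calibration step then corrects the systematic per-user offset of such raw estimates using a handful of known target points.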
## Installation
[Unix Installation](https://github.molgen.mpg.de/perceptual/opengaze/wiki/Unix-Installation)
## Use
[Command line arguments](https://github.molgen.mpg.de/perceptual/opengaze/wiki/Command-line-arguments)
## Citation
If you use any of the resources provided on this page in your publications, please cite the following paper:

**Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications**<br/>
Xucong Zhang, Yusuke Sugano, Andreas Bulling<br/>
Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), 2019<br/>

BibTeX, PDF
## License
The license agreement can be found in Copyright.txt.

You must also respect the Boost, OpenFace, and OpenCV licenses.

Furthermore, you have to respect the licenses of the datasets used for [model training](https://github.molgen.mpg.de/perceptual/opengaze/wiki/Model-training).