# OpenGaze: Open Source Toolkit for Camera-Based Gaze Estimation and Interaction

Appearance-based gaze estimation methods that require only an off-the-shelf camera have improved significantly and promise a wide range of new applications in gaze-based interaction and attentive user interfaces. However, these methods are not yet widely used in the human-computer interaction (HCI) community. To democratize their use in HCI, we present OpenGaze, the first software toolkit developed specifically for gaze interface designers. OpenGaze is open source and aims to implement state-of-the-art methods for camera-based gaze estimation and interaction.

## Functionality

The toolkit can perform the following gaze-related tasks:

* **Gaze Estimation** Estimate and show a user's gaze on a screen in real time. [![Demo](https://img.youtube.com/vi/aenp4ZWjBZo/0.jpg)](https://youtu.be/aenp4ZWjBZo "Gaze Estimation")
* **Gaze Visualization** Plot gaze direction in images. [![Demo](https://img.youtube.com/vi/9Lujg3beiYI/0.jpg)](https://youtu.be/9Lujg3beiYI "Gaze Visualization")
* **Personal Calibration** Perform personal calibration and remap the gaze target on a screen. [![Demo](https://img.youtube.com/vi/BjhZcRw4N-w/0.jpg)](https://youtu.be/BjhZcRw4N-w "Personal Calibration")
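To illustrate the idea behind personal calibration, here is a minimal, toolkit-independent sketch (this is **not** OpenGaze's actual code): the user fixates a few known on-screen targets, and an affine correction is fit by least squares to remap raw gaze estimates onto the screen.

```python
import numpy as np

def fit_calibration(raw_gaze, targets):
    """Fit an affine correction (2x2 matrix plus offset) via least squares.

    raw_gaze: (n, 2) raw on-screen gaze estimates in pixels
    targets:  (n, 2) true positions of the calibration targets
    """
    raw = np.asarray(raw_gaze, dtype=float)
    tgt = np.asarray(targets, dtype=float)
    # Augment each gaze point with a bias term: [x, y, 1].
    A = np.hstack([raw, np.ones((raw.shape[0], 1))])
    # Solve A @ M ~= tgt for the (3, 2) parameter matrix M.
    M, *_ = np.linalg.lstsq(A, tgt, rcond=None)
    return M

def apply_calibration(M, raw_gaze):
    """Remap raw gaze estimates with a fitted calibration matrix."""
    raw = np.asarray(raw_gaze, dtype=float)
    A = np.hstack([raw, np.ones((raw.shape[0], 1))])
    return A @ M

# Hypothetical example: raw estimates are offset by (30, -20) px from the
# five calibration targets; the fitted map removes the bias.
targets = np.array([[100, 100], [900, 100], [500, 400], [100, 700], [900, 700]])
raw = targets + np.array([30.0, -20.0])
M = fit_calibration(raw, targets)
corrected = apply_calibration(M, raw)
```

Real calibration pipelines may use more expressive mappings (e.g. polynomial regression) and must handle noisy fixations, but the remapping principle is the same.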
## Installation

[Unix Installation](https://github.molgen.mpg.de/perceptual/opengaze/wiki/Unix-Installation)

## Use

[Command line arguments](https://github.molgen.mpg.de/perceptual/opengaze/wiki/Command-line-arguments)

## Citation

**If you use any of the resources provided on this page in any of your publications, please cite the following paper:**

```
Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications
Xucong Zhang, Yusuke Sugano, Andreas Bulling
Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), 2019
```

```
@inproceedings{zhang19_chi,
  title = {Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications},
  author = {Zhang, Xucong and Sugano, Yusuke and Bulling, Andreas},
  booktitle = {Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI)},
  year = {2019}
}
```