From 1d030e7ccf09e597f236f7d501512f2227a72d21 Mon Sep 17 00:00:00 2001
From: Andreas Bulling
Date: Wed, 17 Apr 2019 11:16:23 +0200
Subject: [PATCH] Update README.md

---
 README.md | 12 +++++-------
 1 file changed, 5 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index 727af7a..0d58cc3 100644
--- a/README.md
+++ b/README.md
@@ -4,8 +4,6 @@ Appearance-based gaze estimation methods that only require an off-the-shelf came
 To democratize their use in HCI, we present OpenGaze, the first software toolkit that is specifically developed for gaze interface designers.
 OpenGaze is open source and aims to implement state-of-the-art methods for camera-based gaze estimation and interaction.
 
-
-
 ## Functionality
 The toolkit is capable of performing the following gaze-related tasks:
 
@@ -29,10 +27,10 @@ Perform personal calibration and remap the gaze target on a screen.
 
 
 ## Installation
-[Unix Installation](https://github.molgen.mpg.de/perceptual/opengaze/wiki/Unix-Installation)
+[Unix Installation](https://gitlab.hcics.simtech.uni-stuttgart.de/public-projects/opengaze/wikis/Unix-installation)
 
 ## Use
-[Command line arguments](https://github.molgen.mpg.de/perceptual/opengaze/wiki/Command-line-arguments)
+[Command line arguments](https://gitlab.hcics.simtech.uni-stuttgart.de/public-projects/opengaze/wikis/Command-line-arguments)
 
 ## Citation
 **If you use any of the resources provided on this page in any of your publications, please cite the following paper:**
@@ -42,7 +40,7 @@ Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applicati
 Xucong Zhang, Yusuke Sugano, Andreas Bulling
 Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), 2019
 ```
-[arXiv](https://arxiv.org/abs/1901.10906)
+[Project page](https://www.perceptualui.org/publications/zhang19_chi/)
 
 @inproceedings{zhang19_chi,
 title = {Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications},
@@ -55,11 +53,11 @@ abstract = {Appearance-based gaze estimation methods that only require an off-th
 
 ## License
-The license agreement can be found in Copyright.txt
+The license agreement can be found in [LICENSE](https://gitlab.hcics.simtech.uni-stuttgart.de/public-projects/opengaze/LICENSE).
 
 You have to respect boost, OpenFace and OpenCV licenses.
 
-Furthermore, you have to respect the licenses of the datasets used for [model training](:https://github.molgen.mpg.de/perceptual/opengaze/wiki/Model-training).
+Furthermore, you have to respect the licenses of the datasets used for [model training](https://gitlab.hcics.simtech.uni-stuttgart.de/public-projects/opengaze/wikis/Model-training).
 
 ## Code layout
 * caffe-layers: Our customized layers for the Caffe library.