Official code for "HAGI: Head-Assisted Gaze Imputation for Mobile Eye Trackers", published at UIST'25, and its extended version "HAGI++: Head-Assisted Gaze Imputation and Generation".

HAGI++: Head-Assisted Gaze Imputation and Generation

[arXiv]

HAGI: Head-Assisted Gaze Imputation for Mobile Eye Trackers

Chuhan Jiao,   Zhiming Hu,   Andreas Bulling
ACM UIST 2025, Busan, Republic of Korea
[Project] [Paper]


HAGI++ Overview Teaser

Demo Preview

Updates

  • Dec. 11, 2025: Initial release.

Getting started

  1. Clone the repository.
    git clone https://git.hcics.simtech.uni-stuttgart.de/public-projects/HAGI.git
    
  2. Install general dependencies.
    cd HAGI
    conda env create -f environment.yaml -n hagi
    conda activate hagi
    
  3. Download the preprocessed datasets from this link (password: `GyBQ1r7'{). After downloading, extract the folders and place them inside the datasets directory (see the sketch below).
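
As a quick sanity check after extracting, the short snippet below simply lists what ended up under datasets/. The folder names you should see depend on the downloaded archive (e.g. a Nymeria subset); this is an illustrative sketch, not part of the pipeline.

  # List the extracted dataset folders; names depend on the downloaded archive.
  from pathlib import Path

  for folder in sorted(Path("datasets").iterdir()):
      print(folder.name)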

Training

  1. HAGI Training.
  python train.py --loss_type partcial_mix --fourier --xattn
  2. HAGI++ Training, using head rotation and translation.
  python train.py --config base_hagi++.yaml --loss_type partcial_noise --fourier --dt

To use head rotation only (this still achieves competitive performance):

  python train.py --config base_hagi++.yaml --loss_type partcial_noise --head_type Rotation --fourier --dt

For gaze generation (100% missing values), the models above also work, but training a model without observation masks can achieve better performance:

  python train.py --config base_hagi++.yaml --loss_type partcial_noise --fourier --dt --missing_ratio 1 --generation
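
To make the role of the observation mask concrete, here is a minimal NumPy sketch (not the repository's actual masking code, just an illustration): observed gaze samples are marked with 1, artificially missing ones with 0, and --missing_ratio 1 corresponds to an all-zero mask, i.e. pure generation conditioned on head motion only.

  import numpy as np

  def observation_mask(seq_len, missing_ratio, rng=np.random.default_rng(0)):
      # 1 = gaze sample observed, 0 = treated as missing during training.
      mask = np.ones(seq_len)
      n_missing = int(round(missing_ratio * seq_len))
      missing_idx = rng.choice(seq_len, size=n_missing, replace=False)
      mask[missing_idx] = 0
      return mask

  print(observation_mask(10, 0.5))  # imputation: half of the gaze samples are hidden
  print(observation_mask(10, 1.0))  # generation: no gaze is observed at all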

To train the 3-point HAGI++ model:

  python train_wrist.py --config base_hagi++.yaml --loss_type partcial_noise --fourier --generation

Inference

We provide several pretrained checkpoints. Checkpoints trained with head movement data only are under save/head, and those trained on the Nymeria subset that contains wrist motion are under save/head_wrists. Please change the config file, --modelfolder, and dataset name accordingly.

  python evaluate.py --config base_hagi++.yaml --modelfolder head/hagi++_imputation --dataset nymeria

Run HAGI/HAGI++ on your own data

We provide an example function in preprocess_aria.py for processing data captured with Meta Project Aria glasses (Gen 1) to obtain gaze and head movements. Running it requires the projectaria_tools package; the function was tested with projectaria-tools 1.5.5.
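
For orientation, below is a minimal sketch of how gaze and head pose can be read from Project Aria MPS outputs with projectaria_tools. The file paths and attribute access are assumptions based on the standard MPS export layout and may differ for your recording; refer to preprocess_aria.py for the exact pipeline we use.

  import numpy as np
  from projectaria_tools.core import mps

  # Eye gaze exported by MPS: yaw/pitch angles (radians) per timestamp.
  eye_gazes = mps.read_eyegaze("mps_output/eye_gaze/general_eye_gaze.csv")
  gaze = np.array([[g.yaw, g.pitch] for g in eye_gazes])

  # Closed-loop SLAM trajectory: 6-DoF head (device) pose per timestamp.
  trajectory = mps.read_closed_loop_trajectory("mps_output/slam/closed_loop_trajectory.csv")
  head = np.stack([p.transform_world_device.to_matrix() for p in trajectory])  # (N, 4, 4)

  print(gaze.shape, head.shape)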

Citation

If you find our code useful or use it in your own projects, please cite our papers:

@inproceedings{jiao25_uist,
  title = {HAGI: Head-Assisted Gaze Imputation for Mobile Eye Trackers},
  author = {Jiao, Chuhan and Hu, Zhiming and Bulling, Andreas},
  year = {2025},
  pages = {1--14},
  booktitle = {Proc. ACM Symposium on User Interface Software and Technology (UIST)},
  doi = {10.1145/3746059.3747749}
}
@article{jiao2025hagi++,
  title={HAGI++: Head-Assisted Gaze Imputation and Generation},
  author={Jiao, Chuhan and Hu, Zhiming and Bulling, Andreas},
  journal={arXiv preprint arXiv:2511.02468},
  year={2025}
}

Acknowledgements

We acknowledge that HAGI and HAGI++ are built upon CSDI. We have also incorporated and adapted components from MotionDiffuse, EgoAllo, and SP-EyeGAN. We thank the developers of these projects for making their code available to the community.