Welcome to GazeHub@Phi-ai Lab.
This work helps you quickly build your own gaze estimation system with standard tools and data, and compare it with existing methods.
On this page, you can find:
- Code for state-of-the-art gaze estimation methods.
- Processed gaze estimation datasets.
- Practical tools for gaze estimation.
We invite you to read our survey "Appearance-based Gaze Estimation With Deep Learning: A Review and Benchmark". If you use the code or data provided on this website, please cite our survey:
@article{Cheng2021Survey,
  title={Appearance-based Gaze Estimation With Deep Learning: A Review and Benchmark},
  author={Yihua Cheng and Haofei Wang and Yiwei Bao and Feng Lu},
  journal={arXiv preprint arXiv:2104.12668},
  year={2021}
}
If you have any questions, please contact Prof. Feng Lu (lufeng@buaa.edu.cn) or Yihua Cheng (yihua_c@buaa.edu.cn).
Benchmarks
Our survey builds a comprehensive benchmark. Please refer to our paper for more details.
The benchmark covers many kinds of gaze estimation methods. Note that different methods produce different outputs (2D gaze positions or 3D gaze directions). Therefore, we provide post-processing code (2D->3D and 3D->2D) to enable fair comparison among different methods. We also publish standard tools for data rectification and gaze origin conversion.
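To illustrate the kind of conversion involved, here is a minimal sketch of mapping between a 3D gaze direction vector and (pitch, yaw) angles, a representation commonly used when comparing 3D gaze methods. The sign and axis conventions below are an assumption for illustration, not necessarily the ones used by the GazeHub tools; please consult the released code for the exact definitions.

```python
import numpy as np

def gaze3d_to_pitchyaw(gaze):
    """Convert a 3D gaze direction vector (x, y, z) to (pitch, yaw) in radians.

    Assumed convention (for illustration only): the subject looks roughly
    along -z; pitch is positive when looking up, yaw is positive when
    looking to the subject's left.
    """
    gaze = np.asarray(gaze, dtype=float)
    gaze = gaze / np.linalg.norm(gaze)        # normalize to a unit vector
    pitch = np.arcsin(-gaze[1])               # vertical angle
    yaw = np.arctan2(-gaze[0], -gaze[2])      # horizontal angle
    return np.array([pitch, yaw])

def pitchyaw_to_gaze3d(pitchyaw):
    """Inverse conversion: (pitch, yaw) back to a unit 3D direction vector."""
    pitch, yaw = pitchyaw
    x = -np.cos(pitch) * np.sin(yaw)
    y = -np.sin(pitch)
    z = -np.cos(pitch) * np.cos(yaw)
    return np.array([x, y, z])
```

With both representations in hand, methods that output angles and methods that output vectors can be scored with the same angular-error metric.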
You are also welcome to report results of new methods. We will check them and add them to our benchmark.
Benchmark of 3D Gaze Estimation
Benchmark of 2D Gaze Estimation
License
This work is licensed under CC BY-NC-SA 4.0 with additional terms and conditions. Please refer to the complete license file.
Link to Phi-ai Lab.
We also invite you to visit our lab.
Reference
- MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation
- EYEDIAP Database: Data Description and Gaze Tracking Evaluation Benchmarks
- Learning-by-Synthesis for Appearance-based 3D Gaze Estimation
- Gaze360: Physically Unconstrained Gaze Estimation in the Wild
- ETH-XGaze: A Large Scale Dataset for Gaze Estimation under Extreme Head Pose and Gaze Variation
- Appearance-Based Gaze Estimation in the Wild
- Appearance-Based Gaze Estimation Using Dilated-Convolutions
- RT-GENE: Real-Time Eye Gaze Estimation in Natural Environments
- It’s written all over your face: Full-face appearance-based gaze estimation
- A Coarse-to-fine Adaptive Network for Appearance-based Gaze Estimation
- Eye Tracking for Everyone
- Adaptive Feature Fusion Network for Gaze Tracking in Mobile Tablets
- On-Device Few-Shot Personalization for Real-Time Gaze Estimation
- A Generalized and Robust Method Towards Practical Gaze Estimation on Smart Phone