- Participant data is available under `/BS/3D_Gaze_Tracking/archive00/participants/`.
- Only the data for the following 14 participants is used in the study:
  `'p10', 'p16', 'p13', 'p24', 'p5', 'p14', 'p26', 'p12', 'p20', 'p7', 'p15', 'p11', 'p21', 'p25'`
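
  For reference, a minimal sketch (the variable names are illustrative, not from the repository) of how these participants' data directories could be enumerated:

  ```python
  import os

  # IDs of the 14 participants used in the study (from the list above).
  PARTICIPANTS = ['p10', 'p16', 'p13', 'p24', 'p5', 'p14', 'p26',
                  'p12', 'p20', 'p7', 'p15', 'p11', 'p21', 'p25']

  DATA_ROOT = '/BS/3D_Gaze_Tracking/archive00/participants/'

  # Map each participant ID to its data directory.
  participant_dirs = {p: os.path.join(DATA_ROOT, p) for p in PARTICIPANTS}
  ```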
- The eye camera's intrinsic parameters are available under `/BS/3D_Gaze_Tracking/archive00/eye_camera_images/`.
- The scene camera's intrinsic parameters are available under `/BS/3D_Gaze_Tracking/archive00/scene_camera/`.
- `geom.py` contains the pinhole camera model along with a few geometry-related helper methods.
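
  The actual camera class lives in `geom.py`; the following is only a generic sketch of the pinhole projection it implements (the intrinsic matrix values and names below are illustrative):

  ```python
  import numpy as np

  def project(K, point_3d):
      """Project a 3D point given in camera coordinates onto the image plane
      with the pinhole model: p ~ K [X, Y, Z]^T, then divide by depth."""
      p = K.dot(np.asarray(point_3d, dtype=float))
      return p[:2] / p[2]

  # Example intrinsic matrix (focal lengths fx, fy and principal point cx, cy).
  K = np.array([[700.0,   0.0, 320.0],
                [  0.0, 700.0, 240.0],
                [  0.0,   0.0,   1.0]])

  print(project(K, [0.1, -0.05, 1.0]))  # pixel coordinates of the projected point
  ```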
- `minimize.py` contains code for least-squares minimization using NumPy.
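
  As a generic illustration of least-squares minimization with NumPy (not the exact routine in `minimize.py`; the system below is made up):

  ```python
  import numpy as np

  # Fit the overdetermined linear system A x ≈ b in the least-squares sense.
  A = np.array([[1.0, 1.0],
                [1.0, 2.0],
                [1.0, 3.0]])
  b = np.array([1.1, 1.9, 3.2])

  x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
  print(x)  # estimated parameters (intercept, slope)
  ```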
- `vector.py` contains a pure-Python version of VPython's vector object, used for vector processing.
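
  A minimal flavour of what such a class looks like (the real `vector.py` mirrors VPython's vector API and is more complete):

  ```python
  import math

  class Vector(object):
      """Tiny 3D vector, in the spirit of VPython's vector object."""
      def __init__(self, x=0.0, y=0.0, z=0.0):
          self.x, self.y, self.z = float(x), float(y), float(z)

      def __add__(self, other):
          return Vector(self.x + other.x, self.y + other.y, self.z + other.z)

      def dot(self, other):
          return self.x * other.x + self.y * other.y + self.z * other.z

      def mag(self):
          return math.sqrt(self.dot(self))

  print(Vector(1, 2, 2).mag())  # 3.0
  ```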
- `sim.py` contains the main `GazeSimulation` object used for simulation and visualization. This object also contains methods for handling real-world data.
- `parallax_analysis.py` mainly uses the `GazeSimulation` object to perform different experiments on simulation and real-world data. (`parallax_2D3D_3Cdepths.py` is a single experiment separated out from this module.)
- Scripts for real-world data processing, including marker tracking, marker movement detection, and frame extraction, are inside the `recording` package.
- `recording/tracker.py` uses a slightly modified version of the `aruco_test` script to silently track markers in the scene, log the ArUco output, and compute the 2D and 3D position of the marker's center from that output. (The modified `aruco_test` is included in the `recording` package.)
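
  For illustration only: the repository drives the modified `aruco_test` binary and parses its log, but a similar 2D/3D center computation can be sketched with OpenCV's own `aruco` module (the camera intrinsics, marker length, and file name below are placeholders):

  ```python
  import cv2
  import numpy as np

  frame = cv2.imread('scene_frame.png')            # placeholder scene image
  gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

  # Placeholder scene-camera intrinsics; the real calibration lives under
  # /BS/3D_Gaze_Tracking/archive00/scene_camera/.
  K = np.array([[700.0, 0.0, 640.0],
                [0.0, 700.0, 360.0],
                [0.0, 0.0, 1.0]])
  dist = np.zeros(5)

  dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_ARUCO_ORIGINAL)
  corners, ids, _rejected = cv2.aruco.detectMarkers(gray, dictionary)

  if ids is not None:
      # 2D center: mean of the four detected corner points.
      center_2d = corners[0].reshape(4, 2).mean(axis=0)

      # 3D center: translation of the estimated marker pose
      # (0.05 is the marker side length; it sets the unit of the 3D output).
      pose = cv2.aruco.estimatePoseSingleMarkers(corners, 0.05, K, dist)
      center_3d = pose[1][0].ravel()
      print(center_2d, center_3d)
  ```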
- `recording/process_recordings.py` contains the main code for detecting intervals in which the marker is not moving, using scikit-learn's `AgglomerativeClustering`. The method performs well almost always, but in a few cases manual annotation was required to get the correct output. The output of this detection is depicted in `marker_motion.png` inside each recording's data files; a sample output for a test recording (16 points) is located under `/BS/3D_Gaze_Tracking/work/marker_motion.png`.
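
  A simplified sketch of the idea, not the repository's exact code: cluster the per-frame marker positions into the expected number of stops and read off contiguous frame intervals per cluster (the input file and cluster count are placeholders):

  ```python
  import numpy as np
  from sklearn.cluster import AgglomerativeClustering

  # positions: one tracked 2D marker center per frame, shape (n_frames, 2).
  positions = np.load('marker_positions.npy')      # placeholder input

  # Cluster the positions into the expected number of marker stops (e.g. 16).
  labels = AgglomerativeClustering(n_clusters=16).fit_predict(positions)

  # Collapse consecutive frames with the same label into (start, end, label) intervals.
  intervals = []
  start = 0
  for i in range(1, len(labels) + 1):
      if i == len(labels) or labels[i] != labels[start]:
          intervals.append((start, i - 1, labels[start]))
          start = i

  print(intervals)  # frame ranges during which the marker is (roughly) stationary
  ```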
- `recording/util/check_pupil_positions.py` is used during recording to ensure accurate pupil detection. Example usage:
  `python check_pupil_positions.py /BS/3D_Gaze_Tracking/archive00/participants/p16/2015_09_29/000/`
- `recording/util/SingleMarkerVisualizer/` contains the main Processing script for visualizing a moving marker during the recording experiments. The code uses Processing version 3.