I’ve been working on a better way to visualize the data generated by my eye tracking camera. The typical approach collapses everything into a single aggregate image, like a heatmap. But I’m really interested in how eye gaze moves *over time*, so aggregating throws away exactly the dimension I care about.
The gif above shows the tool I’m currently developing. It uses three.js to place the recorded video and slide image into a 3D space. I then display *every* participant’s gaze as a point and update it in real time. This also lets me encode pupil dilation, which is a key marker of cognitive load.
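The core of the idea is mapping each eye-tracker sample onto the slide plane in the 3D scene, with pupil dilation driving the point's size. Here's a minimal sketch of what that conversion might look like; the function name, the plane dimensions, and the 2–8 mm pupil range are my own illustrative assumptions, not the tool's actual code:

```javascript
// Assumed: a slide plane centered at the origin, facing the camera.
const SLIDE_WIDTH = 4;   // world units
const SLIDE_HEIGHT = 3;

// Convert one gaze sample — x/y normalized to [0, 1] by the tracker,
// pupil diameter in mm — into a 3D position on the plane plus a size.
function gazeToScenePoint(sample) {
  const x = (sample.x - 0.5) * SLIDE_WIDTH;   // left/right across the slide
  const y = (0.5 - sample.y) * SLIDE_HEIGHT;  // tracker y grows downward
  const z = 0.01;                             // lift the point off the plane
  // Scale point size linearly across a typical pupil range (2–8 mm).
  const t = Math.min(Math.max((sample.pupilMm - 2) / 6, 0), 1);
  const size = 0.02 + 0.08 * t;
  return { position: [x, y, z], size };
}
```

Each animation frame would then just look up the current sample per participant and feed `position` and `size` into a three.js `Points` buffer.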
It still needs work, but it’s already a pretty cool way to see how people reacted during the one-on-one experimental sessions.