We are happy to share that our new paper titled "Improving cognitive-state analysis from eye gaze with synthetic eye-movement data" has been accepted to Computers & Graphics.
Abstract
Eye movements can be used to analyze a viewer's cognitive capacities or mental state. Neural networks that process the raw eye-tracking signal can outperform methods that operate on scan paths preprocessed into fixations and saccades. However, the scarcity of such data poses a major challenge. We therefore develop SP-EyeGAN, a neural network that generates synthetic raw eye-tracking data. SP-EyeGAN consists of Generative Adversarial Networks; it produces a sequence of gaze angles indistinguishable from human ocular micro- and macro-movements. We explore the use of these synthetic eye movements for pre-training neural networks with contrastive learning. We find that pre-training on synthetic data does not help for biometric identification, while results are inconclusive for ADHD detection and gender classification. However, for the eye-movement-based assessment of higher-level cognitive skills such as general reading comprehension, text comprehension, and the distinction of native from non-native readers, pre-training on synthetic eye-gaze data improves the models' performance and even advances the state of the art for reading comprehension. The SP-EyeGAN model, pre-trained on GazeBase, along with the code for developing your own raw eye-tracking machine learning model with contrastive learning, is available at https://github.com/aeye-lab/sp-eyegan.
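To illustrate the pre-training idea from the abstract, the sketch below shows a contrastive (NT-Xent-style) objective computed on stand-in synthetic gaze sequences. Everything here is hypothetical: the random-walk generator merely stands in for SP-EyeGAN's output, and the toy linear encoder stands in for the paper's neural network; see the linked repository for the actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_gaze(n_samples=8, seq_len=100):
    # Stand-in for SP-EyeGAN output: random-walk gaze-angle sequences
    # of shape (batch, time, 2) for horizontal/vertical angles (hypothetical).
    steps = rng.normal(0.0, 0.1, size=(n_samples, seq_len, 2))
    return np.cumsum(steps, axis=1)

def augment(x):
    # Illustrative augmentation: additive jitter to create a second "view".
    return x + rng.normal(0.0, 0.05, size=x.shape)

def encode(x, W):
    # Toy encoder: mean-pool over time, linear projection, L2-normalize.
    z = x.mean(axis=1) @ W
    return z / np.linalg.norm(z, axis=1, keepdims=True)

def nt_xent(z1, z2, tau=0.5):
    # NT-Xent contrastive loss: each sample's positive is its other view;
    # all remaining samples in the batch act as negatives.
    z = np.concatenate([z1, z2], axis=0)           # (2N, d), unit-norm rows
    sim = z @ z.T / tau                            # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                 # exclude self-similarity
    n = z1.shape[0]
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - sim[np.arange(2 * n), pos]))

x = synthetic_gaze()
W = rng.normal(size=(2, 16))                       # toy projection weights
loss = nt_xent(encode(augment(x), W), encode(augment(x), W))
```

In actual pre-training, the loss would be minimized by gradient descent over the encoder's parameters; the encoder is then fine-tuned on the downstream task (e.g., reading-comprehension assessment) with real eye-tracking data.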
Improving Cognitive-State Analysis from Eye Gaze with Synthetic Eye-Movement Data, Computers & Graphics
Paul Prasse, David R. Reich, Silvia Makowski, Tobias Scheffer, Lena A. Jäger