Fixation-Image Charts
Proceedings of the 2016 Symposium on Eye Tracking Research and Applications
K. Kurzhals
VISUS, University of Stuttgart, Germany.
Abstract
We facilitate the comparative visual analysis of eye tracking data from multiple participants with a visualization that represents the temporal changes of viewing behavior. Common approaches to visually analyze eye tracking data either occlude or ignore the underlying visual stimulus, impairing the interpretation of displayed measures. We introduce fixation-image charts: a new technique to display the temporal changes of fixations in the context of the stimulus without visual overlap between participants. Fixation durations, the distance and direction of saccades between consecutive fixations, as well as the stimulus context can be interpreted in one visual representation. Our technique is not limited to static stimuli, but can be applied to dynamic stimuli as well. Using fixation metrics and the visual similarity of stimulus regions, we complement our visualization technique with an interactive filter concept that allows for the identification of interesting fixation sequences without the time-consuming annotation of areas of interest. We demonstrate how our technique can be applied to different types of stimuli to perform a range of analysis tasks, and discuss advantages and shortcomings derived from a preliminary user study.
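To illustrate the per-fixation measures the charts encode, the sketch below derives fixation duration together with the distance and direction of the saccade to the following fixation. It is not the paper's implementation; the `Fixation` record is a hypothetical stand-in for whatever the eye tracker exports.

```python
# Minimal sketch (not the authors' code): fixation duration plus the length and
# direction of the outgoing saccade, the quantities fixation-image charts display.
from dataclasses import dataclass
from math import atan2, degrees, hypot

@dataclass
class Fixation:            # hypothetical record, fields assumed for illustration
    x: float               # fixation centroid in stimulus coordinates (px)
    y: float
    start_ms: float        # fixation onset
    end_ms: float          # fixation offset

def fixation_metrics(fixations):
    """Yield (duration_ms, saccade_length_px, saccade_angle_deg) per fixation.

    The last fixation has no outgoing saccade, so its length and angle are None.
    """
    for i, f in enumerate(fixations):
        duration = f.end_ms - f.start_ms
        if i + 1 < len(fixations):
            nxt = fixations[i + 1]
            dx, dy = nxt.x - f.x, nxt.y - f.y
            length = hypot(dx, dy)              # saccade amplitude in pixels
            angle = degrees(atan2(dy, dx))      # direction, 0 deg = rightward
        else:
            length = angle = None
        yield duration, length, angle
```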
Video
Material
- Source Code [.tar]
ISeeCube - Visual Analysis of Gaze Data for Video
Proceedings of the 2014 Symposium on Eye Tracking Research and Applications
K. Kurzhals
VISUS, University of Stuttgart, Germany.
Abstract
We introduce a new design for the visual analysis of eye tracking data recorded from dynamic stimuli such as video. ISeeCube includes multiple coordinated views to support different aspects of various analysis tasks. It combines methods for the spatiotemporal analysis of gaze data recorded from unlabeled videos with the possibility to annotate and investigate dynamic Areas of Interest (AOIs). A static overview of the complete data set is provided by a space-time cube visualization that shows gaze points with density-based color mapping and spatiotemporal clustering of the data. A timeline visualization supports the analysis of dynamic AOIs and the viewers' attention on them. Individual and similar viewing patterns of different viewers can be clustered by their Levenshtein distance, an attention map, or the transitions between AOIs. With the provided visual analytics techniques, the exploration of eye tracking data recorded from several viewers is supported for a wide range of analysis tasks.
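For reference, a minimal sketch of the Levenshtein distance used to cluster similar viewing patterns, assuming scan paths are given as sequences of AOI labels. This is the standard edit-distance recurrence, not the ISeeCube code.

```python
# Minimal sketch: edit distance between two scan paths encoded as AOI label
# sequences (e.g. ["face", "text", "face"]); insert/delete/substitute cost 1 each.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))              # distances for the empty prefix of a
    for i, sa in enumerate(a, start=1):
        curr = [i]                              # cost of deleting all of a[:i]
        for j, sb in enumerate(b, start=1):
            cost = 0 if sa == sb else 1
            curr.append(min(prev[j] + 1,        # deletion
                            curr[j - 1] + 1,    # insertion
                            prev[j - 1] + cost))  # substitution or match
        prev = curr
    return prev[-1]

# Example: these two AOI sequences differ by two edit operations.
assert levenshtein(list("ABAB"), list("BABA")) == 2
```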
Video
Material
- Paper [PDF]
- BibTeX [Coming soon]
Case Study Images
Acknowledgements
This work was funded by the German Research Foundation (DFG) as
part of the Priority Program "Scalable Visual Analytics" (SPP 1335).
Space-Time Visual Analytics of Eye-Tracking Data for Dynamic Stimuli
IEEE Transactions on Visualization and Computer Graphics, 19 (12), 2129-2138, 2013.
K. Kurzhals
VISUS, University of Stuttgart, Germany.
Abstract
We introduce a visual analytics method to analyze eye movement data recorded for dynamic stimuli such as video or animated graphics. The focus lies on the analysis of data from several viewers to identify trends in the general viewing behavior, including time sequences of attentional synchrony and objects with strong attentional focus. By using a space-time cube visualization in combination with clustering, the dynamic stimuli and associated eye gazes can be analyzed in a static 3D representation. Shot-based, spatiotemporal clustering of the data generates potential areas of interest that can be filtered interactively. We also facilitate data drill-down: the gaze points are shown with density-based color mapping and individual scan paths as lines in the space-time cube. The analytical process is supported by multiple coordinated views that allow the user to focus on different aspects of spatial and temporal information in eye gaze data. Common eye-tracking visualization techniques are extended to incorporate the spatiotemporal characteristics of the data. For example, heat maps are extended to motion-compensated heat maps, and trajectories of scan paths are included in the space-time visualization. Our visual analytics approach is assessed in a qualitative user study with expert users, which showed the usefulness of the approach and uncovered that the experts applied different analysis strategies supported by the system.
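As orientation, here is a minimal sketch of density-based color mapping for gaze points in the space-time cube: each (x, y, t) sample is valued by the fraction of samples falling in a spatiotemporal neighborhood around it. The radii and the anisotropic time scaling are illustrative assumptions, not values from the paper, and the quadratic pairwise computation is for small data sets only.

```python
# Minimal sketch (not the paper's implementation): per-point gaze density in the
# space-time cube, suitable for driving a colormap.
import numpy as np

def gaze_density(points, spatial_radius=30.0, temporal_radius=0.5):
    """Return a density value in [0, 1] for every gaze point.

    points: array of shape (n, 3) with columns x [px], y [px], t [s].
    A point's density is the fraction of samples inside an ellipsoidal
    neighborhood with the given spatial/temporal radii around it.
    """
    scaled = points / np.array([spatial_radius, spatial_radius, temporal_radius])
    # Pairwise squared distances in the rescaled space-time cube (O(n^2) memory).
    diff = scaled[:, None, :] - scaled[None, :, :]
    inside = (diff ** 2).sum(axis=-1) <= 1.0
    counts = inside.sum(axis=1).astype(float)
    return counts / counts.max()    # normalized so it can index a colormap directly
```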
Acknowledgements
This work was funded by the German Research Foundation (DFG) as
part of the SFB 716 / D.5 at University of Stuttgart.