Eye tracking for studying educational technology

In Julie Libarkin’s class this week, we are discussing the use of eye tracking to evaluate educational technology. The papers we read for class focused on how students and viewers interpret images in PowerPoint slides and educational videos. Each eye tracking sample is a Cartesian coordinate (the gaze location on a 2-dimensional image) plus a timestamp. The data can be interpreted in a variety of ways, including: (a) the total amount of time spent focused on a region (or “look zone”) of an image, (b) the frequency with which a viewer returns to a specific look zone, (c) the number of zones a viewer fixates on, (d) the first zone a viewer attends to, and (e) the chronological sequence of look zones (where a viewer looks first, second, third, and so on).
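To make metric (a) concrete, here is a minimal sketch of computing per-zone dwell time from raw gaze samples. It assumes each sample is an `(x, y, timestamp_ms)` tuple and that look zones are axis-aligned rectangles; the zone names and coordinates are made up for illustration.

```python
def zone_of(x, y, zones):
    """Return the name of the first look zone containing (x, y), or None."""
    for name, (x0, y0, x1, y1) in zones.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def dwell_times(samples, zones):
    """Total time (ms) spent in each zone, summed over consecutive sample pairs."""
    totals = {name: 0.0 for name in zones}
    for (x, y, t0), (_, _, t1) in zip(samples, samples[1:]):
        name = zone_of(x, y, zones)
        if name is not None:
            totals[name] += t1 - t0
    return totals

# Hypothetical zones (x0, y0, x1, y1) and a short gaze trace:
zones = {"headline": (0, 0, 400, 100), "bottle": (250, 150, 350, 300)}
samples = [(300, 200, 0), (310, 210, 100), (50, 50, 200), (60, 60, 300)]
print(dwell_times(samples, zones))  # → {'headline': 100.0, 'bottle': 200.0}
```

The same trace also supports metrics (b) through (e): counting zone re-entries, distinct zones visited, the first zone hit, and the ordered sequence of zones all fall out of mapping each sample to a zone name.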

One example of using eye tracking data to improve the visual impact of an image is the use of intensity metrics. Heat maps, or kernel density plots, can be used to identify which visual compositions are more effective at holding viewer attention. For example, see the following heat-mapped ads:
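A heat map like these is, at heart, dwell time binned over the image plane. Here is a minimal sketch, assuming `(x, y, timestamp_ms)` samples and an arbitrary coarse grid; real tools would smooth the grid with a Gaussian kernel to get the familiar kernel density look.

```python
def intensity_grid(samples, width, height, nx=8, ny=8):
    """Return an ny x nx grid of total dwell time (ms) per grid cell."""
    grid = [[0.0] * nx for _ in range(ny)]
    for (x, y, t0), (_, _, t1) in zip(samples, samples[1:]):
        # Bin the gaze position; clamp to the last cell at the image edge.
        i = min(int(y / height * ny), ny - 1)
        j = min(int(x / width * nx), nx - 1)
        grid[i][j] += t1 - t0
    return grid

# Two close-together fixations on an 800x600 image land in the same bin:
grid = intensity_grid([(100, 100, 0), (110, 105, 100), (700, 500, 200)], 800, 600)
print(grid[1][1])  # → 200.0
```

Rendering the grid as colors (with hot colors for large values) gives exactly the kind of overlay shown in the ads above.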

When viewers’ gazes are mapped (weighted by the time spent focusing on each part of the image), it is clear that when the woman looks at the shampoo bottle, viewers also spend more time focusing on the branded bottle. The Sunsilk advertisement developers can then infer that the second ad is more effective at drawing viewers’ attention to the brand.

These methods could be used similarly in educational technology. Now I’m thinking about how we could do so with the IPython Notebook. Since a notebook is a scrollable medium, the eye tracking data would be harder to interpret (it becomes multi-frame, like a video), but it is definitely still doable. Julie has done research with eye tracking on scrollable websites, which would be very similar. Possible things I think we could measure:

  • How much time students spend on different types of cells, where cells are designated as “executable code,” “text,” “image,” “video,” etc
  • Timelines of how students proceed through Notebooks
  • How voiceover narration guides where a student looks in a Notebook
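The first measurement above could be sketched roughly as follows. This is purely hypothetical: it assumes we log `(gaze_y, scroll_offset, timestamp_ms)` and know each cell’s vertical extent in the rendered notebook; the cell layout and log here are invented.

```python
# Hypothetical cell layout: (top_px, bottom_px, cell_type) in document coordinates.
cells = [
    (0, 200, "text"),
    (200, 500, "executable code"),
    (500, 650, "image"),
]

def time_per_cell_type(gaze_log, cells):
    """Total time (ms) spent on each cell type, accounting for scrolling."""
    totals = {}
    for (gy, scroll, t0), (_, _, t1) in zip(gaze_log, gaze_log[1:]):
        doc_y = gy + scroll  # convert screen y to document y
        for top, bottom, kind in cells:
            if top <= doc_y < bottom:
                totals[kind] = totals.get(kind, 0) + (t1 - t0)
                break
    return totals

# Viewer reads a text cell, then scrolls down to a code cell:
gaze_log = [(100, 0, 0), (150, 0, 100), (100, 300, 200), (120, 300, 300)]
print(time_per_cell_type(gaze_log, cells))
# → {'text': 200, 'executable code': 100}
```

The second measurement (timelines) is the same mapping kept in order rather than summed, and the third could correlate the narration’s timestamped transcript against this gaze-to-cell mapping.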

Those are just the things coming to mind right now. Off to get ready for PyCon!

