A while ago, we at DataArt considered buying a modern eye-tracking device to better assess the usability of the user interfaces we build for our clients. After reviewing a variety of devices and testing one of them, we decided to build a custom eye-tracking device instead, driven by engineering interest and the desire to minimize the cost of acquiring such equipment.
We took the first steps toward making our own gear while taking part in a test at a special laboratory with two adjoining rooms separated by a one-way mirror. The first room was intended for a user and a tester who guided the user through a set of scenarios; the observers sat in the second room. The user's room was equipped with a monitor, a webcam, a Tobii X120 eye tracker, and a microphone.
After testing a number of participants, we created a report containing:
Video – showing the user's cursor movement on the desktop, the person's gaze direction, and the webcam recording of the user.
Gaze plots – displaying the sequence of movements and the order and duration of each user's gaze.
Clusters – polygons that show the areas with the highest concentration of visual attention, along with the percentage of respondents who were interested in each area.
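To make the cluster metric concrete, here is a minimal sketch of how such a statistic could be derived from raw fixation data. The data, area-of-interest boxes, and function names below are all hypothetical illustrations, not the actual Tobii or OGAMA pipeline:

```python
# Sketch: percentage of respondents whose gaze landed in each area of interest (AOI).
# Fixation samples and AOI boxes are made up for illustration; a real study would
# export fixation coordinates from the eye tracker.

fixations = [
    # (respondent_id, x, y) in screen pixels
    ("user1", 120, 80), ("user1", 410, 300),
    ("user2", 130, 90), ("user2", 600, 450),
    ("user3", 620, 440),
]

aois = {
    "logo":   (100, 60, 200, 120),   # (x_min, y_min, x_max, y_max)
    "button": (580, 420, 660, 480),
}

def respondents_in_aoi(fixations, box):
    """Return the set of respondents with at least one fixation inside the box."""
    x0, y0, x1, y1 = box
    return {uid for uid, x, y in fixations if x0 <= x <= x1 and y0 <= y <= y1}

total = len({uid for uid, _, _ in fixations})
for name, box in aois.items():
    hits = respondents_in_aoi(fixations, box)
    print(f"{name}: {100 * len(hits) / total:.0f}% of respondents")
```

A production tool would cluster fixations into polygons rather than use fixed rectangles, but the percentage-of-respondents computation is the same idea.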
Building a Custom Device
Having checked the results and counted the costs (about 30,000 euros), we decided to make our own device, which turned out to be an incredible learning experience well worth the time. To build the gadget we needed a camera, a lens, and two infrared emitters (pic.1). We got it all for around $600, far less than the off-the-shelf solution.
We found a lot of open-source solutions built for disabled people – for example, special programs for controlling a computer with head gestures. In practice, though, their accuracy was low: the user had to swing their head around and could easily get tired. Eventually we found the solution that showed the greatest accuracy, Gaze Tracker, an open-source project from 2010. It was able to interact with the device we had made.
We also found a program, OGAMA (OpenGazeAndMouseAnalyzer), written in C#.NET, which records and analyzes eye and mouse tracking data from a slideshow of eye-tracking experiments, and integrated perfectly with our equipment.
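As a rough illustration of the kind of combined data such a tool records, the sketch below logs synchronized gaze and mouse samples to CSV and computes the average gaze-to-cursor distance. The sample values and field names are invented for illustration and do not reflect OGAMA's actual schema:

```python
# Sketch: a combined gaze + mouse log, the kind of raw data an analyzer such as
# OGAMA works with. Sample values and column names are hypothetical.
import csv
import io
from math import hypot

samples = [
    # (time_ms, gaze_x, gaze_y, mouse_x, mouse_y)
    (0,  512, 384, 500, 400),
    (16, 520, 380, 505, 398),
    (33, 700, 210, 510, 395),
]

# Write the samples as a CSV log.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["time_ms", "gaze_x", "gaze_y", "mouse_x", "mouse_y"])
writer.writerows(samples)

# One simple analysis: how far the gaze wanders from the cursor on average.
avg = sum(hypot(gx - mx, gy - my) for _, gx, gy, mx, my in samples) / len(samples)
print(buf.getvalue())
print(f"average gaze-to-cursor distance: {avg:.1f} px")
```

Having gaze and cursor in one timestamped stream is what makes it possible to correlate where users look with where they click.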
So our next steps remain: assemble the finished device, try out OGAMA and, if necessary, adapt it to our specific needs. And, finally, use the device for usability studies. I'm sure this experience will be worth sharing here.