Where people look is tightly coupled with what they are thinking about at that moment. The eyes dart to a new location roughly three times every second, giving us a high-resolution measure of both the content and the depth of ongoing thought. Now that an eye tracker can be built for around £50/$100, it is no wonder gaze tracking has become such a popular tool.
But after running an experiment, you might find that the recorded fixation locations do not quite line up with the stimuli shown on screen. This can happen because of calibration error, slippage of the eye tracker, participant movement, or noise in the system. The last thing you want to do is throw out your data.
We offer an easy-to-implement MATLAB algorithm that attempts to minimize this error. It does so by solving an optimization problem: find the best-fitting linear transformation of the data that aligns fixation positions with the stimuli on screen. We find that the algorithm is quite robust to random fixations and to looks directed at no stimulus, and that it works in many settings.
Please be aware that this code should be used judiciously: there are certain circumstances in which it can give misleading results, as we outline in the paper (Vadillo, Street, Beesley, & Shanks, 2015).
function C = fixationRecalibration(F, stimCoords)
% F          2 x nFixations matrix of recorded fixation coordinates
% stimCoords 2 x nStimuli matrix of on-screen stimulus coordinates
% C          2 x nFixations matrix of recalibrated fixation coordinates

    % Search for the 2x2 linear transformation that minimizes the mean
    % distance from each transformed fixation to its nearest stimulus,
    % starting the search from the identity matrix.
    T = fminsearch(@avgDistanceToClosestFixation, eye(2));
    C = T * F;

    function avgDistance = avgDistanceToClosestFixation(transformation)
        coords = transformation * F;
        distClosest = zeros(1, size(coords, 2));
        for fixNum = 1:size(coords, 2)
            dist = zeros(1, size(stimCoords, 2));
            for stimNum = 1:size(stimCoords, 2)
                dist(stimNum) = norm(coords(:, fixNum) - stimCoords(:, stimNum));
            end
            distClosest(fixNum) = min(dist);
        end
        avgDistance = mean(distClosest);
    end
end
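If you work in Python rather than MATLAB, the same idea can be sketched with NumPy and SciPy's Nelder-Mead optimizer (the same simplex method that fminsearch uses). This is an illustrative translation, not the published code: the variable names, the simulated stimulus layout, and the assumed miscalibration matrix are all made up for the example.

```python
import numpy as np
from scipy.optimize import minimize

def mean_dist_to_nearest_stimulus(T, fixations, stim):
    """Mean Euclidean distance from each transformed fixation to its
    nearest stimulus -- the cost the recalibration minimizes."""
    moved = T @ fixations                           # 2 x nFix
    diffs = moved[:, :, None] - stim[:, None, :]    # 2 x nFix x nStim
    dists = np.linalg.norm(diffs, axis=0)           # nFix x nStim
    return dists.min(axis=1).mean()

# Simulated data: fixations are stimulus locations seen through a known
# (hypothetical) linear miscalibration, so the true answer is recoverable.
rng = np.random.default_rng(1)
stim = rng.uniform(0.0, 100.0, size=(2, 6))         # 6 stimulus locations
drift = np.array([[1.05, 0.00],
                  [0.02, 0.95]])                    # assumed miscalibration
fixations = np.linalg.solve(drift, stim)            # observed fixations

# Derivative-free search for the best 2x2 transform, from the identity.
cost = lambda t: mean_dist_to_nearest_stimulus(t.reshape(2, 2),
                                               fixations, stim)
res = minimize(cost, np.eye(2).ravel(), method="Nelder-Mead")
T = res.x.reshape(2, 2)
recalibrated = T @ fixations                        # analogue of C above
```

Because the cost is not differentiable (the nearest stimulus can change as the transform moves), a simplex method such as Nelder-Mead is a natural fit here, just as it is for fminsearch in the MATLAB version.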
Vadillo, M. A., Street, C. N. H., Beesley, T., & Shanks, D. R. (2015). A simple algorithm for the offline recalibration of eye tracking data through best-fitting linear transformation. Behavior Research Methods, 47(4), 1365-1376.