Arizona Stadium | University of Arizona
A team from the University of Arizona has made strides in enhancing eye-tracking technology by integrating a novel 3D imaging technique known as deflectometry with advanced computational methods. The study, led by Florian Willomitzer, associate professor of optical sciences, was published in Nature Communications.
According to Willomitzer, existing eye-tracking approaches capture only limited directional information. "Current eye-tracking methods can only capture directional information of the eyeball from a few sparse surface points, about a dozen at most," he explains. In contrast, their deflectometry-based method gathers data from more than 40,000 surface points, and potentially millions, from a single camera image.
Postdoctoral researcher Jiazhang Wang, the study's first author, emphasizes the significance of the new method in virtual reality applications: "More data points provide more information that can be potentially used to significantly increase the accuracy of the gaze direction estimation."
Deflectometry, traditionally used to measure highly reflective surfaces such as telescope mirrors, has now been adapted for eye tracking. Willomitzer's team combines deflectometry with computational methods more commonly used in computer vision, aiming to extend the technique beyond industrial surfaces. "The unique combination of precise measurement techniques and advanced computation allows machines to 'see the unseen,'" said Willomitzer.
In experiments with human participants, the method tracked gaze direction with errors between 0.46 and 0.97 degrees; with an artificial eye model, the error dropped to about 0.1 degrees. Instead of relying on point light sources, the method uses a structured light pattern displayed on a screen as the illumination source.
Jiazhang Wang adds: "Our computational reconstruction then uses this surface data together with known geometrical constraints about the eye's optical axis to accurately predict the gaze direction."
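The geometric idea behind this reconstruction rests on the law of reflection: at each specular point on the cornea, the surface normal bisects the unit directions toward the camera and toward the known screen pixel whose pattern it reflects. The sketch below illustrates only that single step; the positions, function name, and setup are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def surface_normal(point, cam_pos, screen_px):
    """Normal at a specular surface point.

    By the law of reflection, the normal bisects the unit vectors
    pointing from the surface point toward the camera and toward
    the screen pixel acting as the light source.
    """
    to_cam = cam_pos - point
    to_src = screen_px - point
    to_cam = to_cam / np.linalg.norm(to_cam)
    to_src = to_src / np.linalg.norm(to_src)
    n = to_cam + to_src
    return n / np.linalg.norm(n)

# Hypothetical example: surface point at the origin, camera and
# screen pixel placed symmetrically about the z-axis.
p = np.array([0.0, 0.0, 0.0])
cam = np.array([-1.0, 0.0, 1.0])
px = np.array([1.0, 0.0, 1.0])
print(surface_normal(p, cam, px))  # symmetric setup -> normal along +z
```

Repeating this over tens of thousands of reflected pattern points yields the dense surface-normal field from which, per Wang, the gaze direction can then be estimated using geometric constraints on the eye's optical axis.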
The team has previously explored seamless integration with virtual and augmented reality systems. In a headset, the approach could reduce hardware complexity by using the visual content already shown on the display as the reflective illumination pattern.
The researchers aim to embed other 3D reconstruction techniques into the system and to apply artificial intelligence to refine it further. Co-authors Oliver Cossairt, Tianfu Wang and Bingjie Xu share the goal of pushing eye-tracking accuracy to new levels; a patent is pending, and commercialization is planned through Tech Launch Arizona.
"Our goal is to close in on the 0.1-degree accuracy levels obtained with the model eye experiments," Willomitzer stated. They hope the method will pioneer advancements in eye-tracking technologies relevant to fields such as neuroscience and psychology.
###