Eye-Tracking Headset For Environmental Control

July 2018 - Christopher McMurrough, of the University of Texas at Arlington, has developed an eye-tracking system intended to overcome the limitations of older models (e.g., the need for a trained expert to calibrate the system and slow processing time). McMurrough's system uses a forward-facing, head-mounted camera to generate a 3D map of the environment. This map is transmitted to a computer which, together with input from the eye-tracking devices, can accurately discern what the user is looking at. McMurrough was inspired by his mother-in-law, who has amyotrophic lateral sclerosis (ALS); the system is intended to allow someone with a mobility disability to independently control a wheelchair, robotic arm, computer, assistive technology, or conceivably any connected device in the environment. McMurrough has patented his system, and other applications are being considered, such as use in a gaming environment and as an eye-health monitor. [Source: Ben Coxworth, New Atlas]
The contents of this website were developed under a grant from the National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR grant number 90RE5025-01-00). NIDILRR is a Center within the Administration for Community Living (ACL), Department of Health and Human Services (HHS). The contents of this website do not necessarily represent the policy of NIDILRR, ACL, or HHS, and you should not assume endorsement by the Federal Government.