Smartphone app that could help screen neurological disease at home
A team of researchers has developed a smartphone app that could allow people to screen for Alzheimer's disease, attention-deficit hyperactivity disorder (ADHD) and other neurological diseases and disorders, by recording closeups of their eyes.
The app uses a near-infrared camera, which is built into newer smartphones for facial recognition, along with a regular selfie camera to track how a person's pupil changes in size.
These pupil measurements could be used to assess a person's cognitive condition, according to the study, which is to be presented at the ACM CHI Conference on Human Factors in Computing Systems (CHI 2022).
"While there is still a lot of work to be done, I am excited about the potential for using this technology to bring neurological screening out of clinical lab settings and into homes," said researcher Colin Barry from the University of California, San Diego.
Pupil size can provide information about a person's neurological functions, recent research has shown. For example, pupil size increases when a person performs a difficult cognitive task or hears an unexpected sound.
The app uses a smartphone's near-infrared camera to detect a person's pupil. In the near-infrared spectrum, the pupil can be easily differentiated from the iris, even in eyes with darker iris colours.
This enables the app to calculate pupil size with sub-millimeter accuracy across various eye colours.
The app also uses a colour picture taken by the smartphone's selfie camera to capture the stereoscopic distance between the smartphone and the user.
The app then uses this distance to convert the pupil size from the near-infrared image into millimeter units.
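The paper does not publish the conversion itself, but the step described above can be sketched with a simple pinhole-camera model: the pupil's apparent size in pixels scales with the camera's focal length and inversely with the distance to the eye. The function name and all numeric values below are illustrative assumptions, not figures from the study.

```python
def pupil_diameter_mm(diameter_px: float,
                      distance_mm: float,
                      focal_length_px: float) -> float:
    """Convert a pupil diameter measured in image pixels to millimetres.

    Under a pinhole-camera model, real size = pixel size * distance /
    focal length, with the focal length expressed in pixels. These are
    placeholder values, not the app's actual calibration.
    """
    return diameter_px * distance_mm / focal_length_px

# Example: a pupil spanning 40 px, viewed from 300 mm by a camera with a
# 3000 px focal length, works out to a 4 mm pupil.
print(pupil_diameter_mm(40, 300, 3000))  # 4.0
```

In practice a real system would also need per-device camera calibration and alignment between the near-infrared and colour cameras, which this sketch omits.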
The researchers worked with older adult participants to design a simple app interface that allows users to self-administer pupil response tests.
This interface included voice commands, image-based instructions, and an inexpensive plastic scope that guides the user to position their eye within the view of the smartphone camera.