When we think of a medical examination, we usually picture a chilly gown, a sterile exam table, and a doctor using instruments developed well over a century ago to poke, probe, and magnify. But thanks to the burgeoning field of artificial intelligence, a computer’s “eye” is now able to make diagnoses in seconds.
According to WIRED, phone apps and simple interfaces are popping up to aid physicians in complex medical cases where symptoms don't point to obvious conclusions. Face2Gene, an app that sprang out of work identifying facial features for tagging purposes on Facebook, looks at subtle variations in the face—eye symmetry, ear position—and calculates what ailments might match that phenotype. (Face2Gene's developers say the program can now identify nearly half of the roughly 8,000 known genetic syndromes.)
Another application under development, the RightEye GeoPref Autism Test, tracks eye movement with infrared sensors while a child watches video footage. Developers believe the test can help assess symptoms of autism in infants as young as 12 months. Winterlight Labs's "deep learning" system, meanwhile, picks up subtle signs of cognitive impairment in speech, recognizing symptoms of Alzheimer's before it's too late to treat them.
While it might be some time before these resources are commonplace among specialists, they point to a future in which diagnoses for hard-to-detect disorders and diseases may be faster and more accurate than ever before.