A new computational tool can accurately pick up the subtle signs of Parkinson’s disease from a surprising source of data: selfies.
The computer vision software analyzes subtle movements of facial muscles to predict whether the individual in the image is showing early signs of Parkinson’s, a chronic, progressive neurological disorder that causes tremors, stiffness, and balance problems. In results published in npj Digital Medicine, the technology’s developers report that the algorithm is as reliable as costly wearable devices that also monitor warning signs of Parkinson’s.
Lead researcher Ehsan Hoque said that Parkinson’s is the fastest-growing neurological disorder. “What if, with people’s permission, we could analyze those selfies and give them a referral in case they are showing early signs?” asked Hoque.
Hoque and colleagues also developed a virtual diagnostic test that neurologists could administer via webcam. The test involves asking the patient to smile, read a sentence aloud, touch their index finger to their thumb as quickly as possible, and make a series of facial expressions.
The diagnostic platform then generates a percentage likelihood that the patient shows early symptoms of Parkinson’s or related neurological disorders.
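The study is reported here only at a high level, but one way to picture how such a platform might turn a short facial-video clip into a percentage likelihood is sketched below. This is a minimal illustration, not the team’s published method: it assumes per-frame facial landmarks have already been extracted by an off-the-shelf face tracker, summarizes them as simple movement-variability features, and passes them to a basic classifier. The `movement_features` helper, the feature choices, and the synthetic training cohort are all hypothetical.

```python
# Minimal sketch (not the authors' published pipeline): estimate the likelihood
# of early Parkinson's-like facial-movement patterns from tracked landmarks.
# Assumes landmarks were already extracted by an off-the-shelf face tracker and
# that train_X / train_y would come from a labeled study cohort.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def movement_features(landmarks: np.ndarray) -> np.ndarray:
    """Summarize facial-muscle movement across a short video clip.

    landmarks: array of shape (n_frames, n_points, 2) holding the (x, y)
    position of each tracked facial point in each frame.
    """
    displacement = np.diff(landmarks, axis=0)      # frame-to-frame motion
    speed = np.linalg.norm(displacement, axis=-1)  # per-point movement speed
    return np.array([
        speed.mean(),  # overall amount of facial movement
        speed.std(),   # variability of movement (reduced range is a PD sign)
        speed.max(),   # strongest single movement (e.g., during a smile)
    ])


# Synthetic stand-in for a labeled training cohort (0 = control, 1 = early PD).
rng = np.random.default_rng(0)
train_X = rng.normal(size=(200, 3))
train_y = rng.integers(0, 2, size=200)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(train_X, train_y)

# Score one new clip: a hypothetical 90-frame recording of 68 facial points.
clip = rng.normal(size=(90, 68, 2))
likelihood = model.predict_proba(movement_features(clip).reshape(1, -1))[0, 1]
print(f"Estimated likelihood of early Parkinson's signs: {likelihood:.0%}")
```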
While artificial intelligence offers clear advantages over currently available diagnostic approaches, Hoque says many ethical and technological considerations still need to be ironed out before the platform can be integrated into healthcare settings.
“The challenge is not only validating the accuracy of our algorithms but also translating the raw machine-generated output into a language that is humane, assuring, understandable, and empowering to the patients.”