We may soon live in a world where sending a selfie to your doctor could help diagnose heart disease—at least according to the latest research published in the European Heart Journal.
"To our knowledge, this is the first work demonstrating that artificial intelligence can be used to analyse faces to detect heart disease. It is a step towards the development of a deep learning-based tool that could be used to assess the risk of heart disease, either in outpatient clinics or by means of patients taking 'selfies' to perform their own screening. This could guide further diagnostic testing or a clinical visit," said Professor Zhe Zheng, who led the research
The study is the first to demonstrate that a deep learning algorithm can detect coronary artery disease (CAD) just by analysing four photographs of a person's face. Although the algorithm needs further refinement before it can be put into practice, it holds potential as a screening tool for heart disease in high-risk populations.
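The paper's actual architecture is not described in this article, so the sketch below is purely illustrative: a hypothetical PyTorch model in which a shared convolutional encoder turns each of the four facial photographs into an embedding, and a small classification head combines them into a single CAD risk probability. All class names, layer sizes, and image dimensions here are assumptions for the sake of the example, not details from the study.

```python
# Hypothetical sketch only: the published model's architecture is not described
# in this article. It illustrates the general idea of a deep learning classifier
# that takes four facial photographs and outputs a CAD risk score.
import torch
import torch.nn as nn


class FacePhotoEncoder(nn.Module):
    """Small CNN that turns one face photo into a fixed-length embedding."""

    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global average pooling to (B, 128, 1, 1)
        )
        self.fc = nn.Linear(128, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(self.features(x).flatten(1))


class SelfieCADClassifier(nn.Module):
    """Combines embeddings of four photo angles into one CAD risk probability."""

    def __init__(self, embed_dim: int = 128, num_photos: int = 4):
        super().__init__()
        self.encoder = FacePhotoEncoder(embed_dim)  # weights shared across photos
        self.head = nn.Sequential(
            nn.Linear(embed_dim * num_photos, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, photos: torch.Tensor) -> torch.Tensor:
        # photos: (batch, 4, 3, H, W) -- one tensor per subject, four angles
        embeddings = [self.encoder(photos[:, i]) for i in range(photos.shape[1])]
        logits = self.head(torch.cat(embeddings, dim=1))
        return torch.sigmoid(logits).squeeze(1)  # risk score in [0, 1]


if __name__ == "__main__":
    model = SelfieCADClassifier()
    batch = torch.randn(2, 4, 3, 224, 224)  # 2 subjects, 4 photos each (dummy data)
    print(model(batch))  # two untrained risk scores, e.g. tensor([0.49, 0.51])
```

In practice a model like this would be trained on photographs labelled with angiography-confirmed CAD status, which is the kind of ground truth the study relied on.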
"Our ultimate goal is to develop a self-reported application for high risk communities to assess heart disease risk in advance of visiting a clinic. This could be a cheap, simple and effective of identifying patients who need further investigation. However, the algorithm requires further refinement and external validation in other populations and ethnicities."
A selfie can capture facial features associated with heart disease, such as thinning or grey hair, wrinkles, an ear lobe crease, xanthelasmata (small, yellow deposits of cholesterol underneath the skin, usually around the eyelids) and arcus corneae (fat and cholesterol deposits that appear as a hazy white, grey or blue opaque ring in the outer edges of the cornea). These features are difficult for humans to recognise and quantify when assessing heart disease risk.
"The algorithm had a moderate performance, and additional clinical information did not improve its performance, which means it could be used easily to predict potential heart disease based on facial photos alone. The cheek, forehead and nose contributed more information to the algorithm than other facial areas. However, we need to improve the specificity as a false positive rate of as much as 46% may cause anxiety and inconvenience to patients, as well as potentially overloading clinics with patients requiring unnecessary tests."
Source: Science Daily