The use of artificial intelligence to diagnose and monitor health conditions is on the rise. At its most basic level, AI learns from outside data in order to make predictions about particular problems and scenarios. The better the dataset, the more the AI can learn, and the more precise its predictions become.
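As a rough illustration of that learning loop (not drawn from the study discussed below), the sketch that follows trains a toy classifier on synthetic labeled records and then predicts outcomes for cases it has not seen; the data, features, and parameters are all invented for illustration.

```python
# Toy sketch of the basic AI loop: learn from labeled data, then predict.
# Everything here is synthetic and for illustration only.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic "records": each row is a feature vector, each label a 0/1 outcome.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # learning step
predictions = model.predict(X_test)                              # prediction step
print(f"Held-out accuracy: {accuracy_score(y_test, predictions):.2f}")
```

In practice, the quality and representativeness of those training rows matter at least as much as the choice of model, which is the point the skin cancer findings below underline.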
In healthcare and medicine, AI has been applied in a number of contexts: it has been used to predict which drugs will be successful in clinical trials and to generate clinically valid, effective treatment plans. Overall, research suggests that AI can perform at least as well as, if not better than, humans when it comes to making decisions about diagnosis and treatment.
This same technology has been applied to the detection of cancer, with significant success. When it comes to skin cancer, however, researchers warn of a serious lack of diversity in the data AI uses to make diagnostic decisions, specifically in the datasets of skin cancer patient imagery used to “teach” AI programs.
In a new study published in The Lancet Digital Health, researchers reviewed all the freely available datasets of skin lesion images, analyzing over 100,000 pictures. They found that much of the data in these sets lacked important contextual details. For example, only 2 of the 21 datasets included images taken with a dermatoscope, a specialized instrument used to photograph skin lesions that could be cancerous.
The most striking finding concerned the ethnic information recorded with the images. Among the pictures in which skin color was noted, only 11 out of 2,436 showed brown or dark skin, a proportion of less than half of one percent that raised red flags for the researchers.
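Checks of this kind reduce to simple counting over whatever metadata each image carries. The sketch below is a hypothetical illustration of such an audit; the skin_tone field and its values are assumptions, since real datasets differ in how, or whether, they record skin type at all.

```python
# Hypothetical audit sketch: count how many image records note skin color,
# and how many of those record brown or dark skin. The "skin_tone" field
# and its values are invented for illustration; real datasets vary widely.
from collections import Counter

def audit_skin_tone(records):
    labelled = [r["skin_tone"] for r in records if r.get("skin_tone")]
    counts = Counter(labelled)
    dark = counts.get("brown", 0) + counts.get("dark", 0)
    return len(labelled), dark

# Tiny invented example set of per-image metadata records.
records = [
    {"image_id": "img_001", "skin_tone": "light"},
    {"image_id": "img_002", "skin_tone": None},    # skin color not reported
    {"image_id": "img_003", "skin_tone": "dark"},
]

labelled, dark = audit_skin_tone(records)
print(f"{dark} of {labelled} images with skin color noted show brown or dark skin")
```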
Dr. David Wen, the study’s lead author, noted that in “the majority of datasets, lots of important information about the images and patients in these datasets wasn’t reported. There was limited information on who, how and why the images were taken. This has implications for the programs developed from these images, due to uncertainty around how they may perform in different groups of people, especially in those who aren’t well represented in datasets, such as those with darker skin. This can potentially lead to the exclusion or even harm of these groups from AI technologies.”
AI is only as good as the data available to it. Without diverse patient imagery to help diagnose skin cancer across a wide range of patients, AI’s real-world applications will remain limited.
Sources: Science Daily; The Lancet Digital Health; Future Healthcare Journal