In-Depth: Advances and challenges in digital dermatology

By Jonah Comstock

Alexander Börve’s company First Derm offers online consultations with board-certified dermatologists and a free product that uses AI to give users ideas about what their skin condition might be. But Börve’s background isn’t in dermatology; in fact, he’s an orthopedic surgeon by training. He had the idea to build his app because he realized how pervasive the need for dermatological consultation is.
 
“I had a girlfriend who was a dermatologist and she didn’t like to go to dinner parties because every time she presented herself as a dermatologist, people would ask her for free advice on moles,” Börve told MobiHealthNews. “And if people had had a bit of wine, they would ask her to come to the bathroom and take some of their clothes off and show her something more intimate. So that’s basically where my idea of teledermatology came from.”
 
If people will show their moles to a stranger at a dinner party, a stranger on the internet doesn’t seem like a big stretch. And Börve isn’t the only one who thinks so. Since the beginning of health apps, there have been products designed to tell the user if a mole is cancerous. But those apps have also served to illustrate the dangers of mobile health.
 
Two such apps, MelApp and Mole Detective, were the target of action by the Federal Trade Commission in 2015. In early conversations about FDA regulation of mobile health, mole apps were the go-to example of an app that presented a mortal danger. If an app told someone a mole was low risk and it turned out to be cancerous, shouldn’t the FDA have regulated that app? That was the question Congressman Henry Waxman put to the FDA at a 2013 congressional hearing.
 
But none of that has stopped developers from tackling the space. We catalogued some of the storied history of apps claiming to detect skin cancer just last year when Doctor Hazel, another such company, participated in a TechCrunch hackathon with a claim of 90-percent-accurate skin cancer detection.
 
In 2018, dermatology is still considered a prime area for digital health intervention, as evidenced by the recent formation of a $1 million Advancing Innovation in Dermatology Accelerator Fund, a joint non-profit venture by LEO Pharma and Advancing Innovation in Dermatology.
 
“One thing about skin is it’s visual,” Dr. William Ju, president and a cofounding trustee of Advancing Innovation in Dermatology, told MobiHealthNews. “It’s very amenable to an app on an iPhone taking a picture. It’s a visual specialty. So there may be a very nice match with dermatology and the technologies that are coming out, mobile in particular.”
 
Although a number of apps of dubious legitimacy are still flooding the app store, the field has proved both an extremely promising one technically and a challenging one to monetize. The biggest areas of interest today are around teledermatology and AI for cancer detection, but innovators see other technologies, including augmented reality, as possible use cases for the future.

The Bad Business of Teledermatology

When a user visits First Derm’s website or opens its app, they can upload a picture of their skin condition and a short description without paying a dime. Only at the end of the process are they offered the chance to pay between $29 and $59, depending on expediency, to send those photos to a dermatologist for analysis. But Börve says that very few do.
 
“The interesting thing there is 85 percent of our users, they drop off on the payment screen,” he said. “So nobody wants to pay for healthcare. Everybody wants someone else to pay for healthcare.”
 
What most casual users of dermatology services want is reassurance, Börve said, but it’s not worth a large out-of-pocket payment.
 
“Nobody likes to go to see a doctor,” he said. “The ones that go to see a doctor, it’s because of pain. Pain is the thing. You have a backache, you break your leg, you have an infection, you have something in your eye, it’s pain. Pain drives people to pay. Just having a rash isn’t painful so people say ‘It’ll go away. I won’t worry about it.’”
 
First Derm has raised about $900,000 from angel investors and accelerators, but has accepted no venture capital. That, Börve says, is why it has survived in the direct-to-consumer space while most of its competitors have had to pivot to B2B. Direct-to-consumer teledermatology can be profitable, but the returns come too slowly for VCs to show much interest.
 
Michael Sierra, vice president of LEO Science and Tech Hub, echoed those sentiments when talking about why they decided to launch their new accelerator.
 
“There are some ideas that are very powerful from a scientific engineering innovation perspective. They’re not quite ready for traditional forms of funding — whether it be VC funding or even angel — and they’re past the point of government grants, so they’re sort of in the Valley of Death,” he said. “So our thought was that’s an area of need that we might be able to address with the accelerator. So we are looking at early-stage, high-risk projects, but ones that really can be transformational.”
 
In the meantime, other companies are seeking better business models for teledermatology. Telederm UK, which works with the NHS, offers its services to primary care physicians, who can photograph a skin condition in the office with a dermatoscope and send it to Telederm’s experts.
 
Working through doctors is not only an easier business, it potentially yields better results, according to Maria Charalambides, a University of Birmingham researcher who recently conducted a literature review on dermatology apps in collaboration with the British Association of Dermatologists.
 
“You can’t really give a clinical background or history to the photo via an app. What you can write is quite limited. In clinic, if a patient is 40 years old and they’ve had this number of new moles appear and they’ve changed this quickly over years and months, they’re all worrying signs, but you don’t really get that with an app. As opposed to ... Telederm UK, which is regulated, it’s part of the NHS, and [general practitioners (GPs)] are using it in a regulated way, and patient consent is also obtained appropriately in terms of storing and sending images. There’s potential for that in terms of saving time and filtering out patients who don’t necessarily need a face-to-face with a dermatologist.”
 
Going through alternate payers is not a perfect solution either, Börve said, as many employers and insurers either want a more holistic telemedicine solution or are more focused on chronic conditions like diabetes or cardiovascular disease, which make up a much larger portion of their costs.

Low-Hanging App Fruit: Tracking and Measuring

An app doesn’t have to tell a user whether they have cancer to be useful in dermatology care. Apps can go a long way toward solving a problem that has long plagued the space: accurately measuring moles and lesions, as well as tracking their change over time.
 
“There are apps that literally just function as photo storage,” Charalambides said. “So every month they can give you a notification to take that photo of the mole that’s worrying you, and you can kind of have a photo diary every month of the mole so you can see how much has changed. I guess that can be really useful if you take it to a dermatologist and say ‘I noticed that this has changed.’ So in terms of prevention of melanoma and catching potential melanomas early, that app can be useful.”
 
Technology can help replace a standard of care in dermatology that’s currently about as low-tech as it can be.
 
“What we do with dermatology is we take a ruler and we try to measure the length and the depth and the width, and we can’t do it very well,” Ju, from Advancing Innovation in Dermatology, said. “With a digital image, you can get the true surface area and the true volume.”
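The measurement gap Ju describes can be sketched in code. This is a minimal, hypothetical example, not any company’s actual pipeline: it assumes segmentation has already produced a binary mask of the lesion, and that a reference sticker of known physical size appears in the frame so pixel counts can be converted to real surface area.

```python
# Minimal sketch: estimating lesion surface area from a segmented photo.
# Assumes (hypothetically) a binary lesion mask and a calibration sticker
# of known physical size in frame; both inputs are illustrative.

def mm2_per_pixel(sticker_area_mm2: float, sticker_area_px: int) -> float:
    """Physical area represented by one pixel, from a reference object."""
    return sticker_area_mm2 / sticker_area_px

def lesion_area_mm2(mask: list, scale_mm2_per_px: float) -> float:
    """Surface area of a lesion given its binary mask (1 = lesion pixel)."""
    lesion_pixels = sum(sum(row) for row in mask)
    return lesion_pixels * scale_mm2_per_px

# Toy example: a 10 mm x 10 mm sticker covers 400 pixels,
# and the lesion mask contains 180 pixels.
scale = mm2_per_pixel(100.0, 400)      # 0.25 mm^2 per pixel
mask = [[1] * 18 for _ in range(10)]   # 180 lesion pixels
print(lesion_area_mm2(mask, scale))    # 45.0 (mm^2)
```

Repeating the same calculation on each month’s photo gives the change over time that a ruler-based exam can’t reliably capture.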
 
Healthy.io, a company that currently uses the smartphone camera for urinalysis and wound care, has hung back from the dermatology space for a number of reasons. But the company is trying to solve similar measurement problems in wound care.
 
“I can send you a body of articles that show when three nurses measure the same wound they get different results, significantly,” Healthy.io CEO Yonatan Adiri told MobiHealthNews. “It’s a black hole of data. A nurse can measure, she’ll decide on the treatment for the next week — treatment can be a $500 bandage — but no one ever saw what she saw, right? … She put the data into a medical record, in a good case, length, width, circumference, tissue change, etc. But it’s all subjective.”
 
In the clinic and at home, smartphone cameras and computer vision can start to address these problems. At WWDC this year, Apple debuted an app that would use augmented reality to accurately measure three-dimensional objects. This technology could be useful for mole tracking, Ju said. Sierra, from LEO Pharma, thinks AR could have other applications as well.
 
“One of the things we’ve been looking at, or at least thinking of, is allowing dermatologists and patients to actually, using the augmented reality, be able to create virtual patients, [showing them] what the skin condition looks like and being able to mirror that onto the skin so that you can compare,” he said. “And the same thing [works] with dermatologists and physicians, especially for GPs who are not dermatologists, allowing them to use some kind of virtual patient where you can see what the disease looks like and start comparing.”
 
That technology could also help educate patients on what skin should look like when it’s healing, to give them confidence in their treatments.
 
“Sometimes the skin becomes red as it’s getting better because it’s healing,” he said. “But if you don’t know that and you look at it, you think ‘Oh my god, it’s getting worse. Is it supposed to look like this?'”

What’s Next: AI Gets Smarter

In February 2017, the cover of Nature displayed a smartphone screen, on it a mole, a red square, and a label: melanoma. Inside, groundbreaking work from Stanford University showed that a convolutional neural network trained on 129,450 clinical images could go head to head with 21 board-certified dermatologists in distinguishing a benign mole from a malignant melanoma.
 
Since then, academics and private companies alike have been trying to commercialize that technology, bringing the cancer-detecting mole app closer than it’s ever been. But as well as showing what’s possible, the Stanford study showed what was required: lots and lots of robust data.
 
“Getting the data is really the biggest challenge, not the AI,” Karen Panetta, IEEE fellow and dean of graduate engineering at Tufts University, who studies AI use cases in healthcare, told MobiHealthNews. “We’ve already got the models, we just need more training data to validate this expertise. And then, again, getting doctors to also validate, to get random things from a cellphone, and you want multiple doctors to do it because they have to agree.”
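The agreement step Panetta describes can be illustrated with a small sketch. The function and label names are made up for illustration; the idea is simply that an image’s label enters the training set only once a majority of the reviewing dermatologists agree on it.

```python
# Minimal sketch of consensus labeling for training data: accept a label
# only when a majority of reviewing dermatologists agree. The labels and
# threshold are illustrative, not any real pipeline's.
from collections import Counter

def consensus_label(votes, min_agreement=0.5):
    """Return the majority label, or None if no label clears the threshold."""
    if not votes:
        return None
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) > min_agreement else None

print(consensus_label(["melanoma", "melanoma", "benign"]))  # melanoma
print(consensus_label(["melanoma", "benign"]))              # None (tie)
```

Images that fail to reach consensus would go back for more reviews rather than into the training set, which is part of why assembling good data is slower than building the model.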
 
There are a lot of reasons good data is hard to find. One problem is that there’s a lot of noise in skin images. In one case study on First Derm’s website, a user submitted a photo of red splotching, but the AI focused instead on a mole that happened to be in the picture. Ironically, the splotching turned out to be nothing serious, but the mole registered as potentially cancerous.
 
“One of the first tasks that we had is, what do you do when people have hairy arms?” Panetta said. “You’ve got problems like that; there are things that obscure the actual [lesion]. Those are the problems that we’re working on to actually make it work.”
 
Similarly, it’s important to train any network on people of different races and skin tones, to make sure the tool works equally well for all potential users.
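One concrete way to keep that requirement honest is to report accuracy per skin-tone group rather than in aggregate, so a model that underperforms on one group can’t hide behind a good average. The sketch below uses made-up group labels and predictions; it is an evaluation pattern, not any published model’s results.

```python
# Minimal sketch: per-group accuracy so a model that fails on one skin-tone
# group can't hide behind a good aggregate score. Group labels (loosely
# Fitzpatrick-style) and predictions are illustrative.

def accuracy_by_group(records):
    """records: (skin_tone_group, true_label, predicted_label) triples."""
    totals = {}
    for group, truth, pred in records:
        correct, seen = totals.setdefault(group, [0, 0])
        totals[group] = [correct + (truth == pred), seen + 1]
    return {g: c / n for g, (c, n) in totals.items()}

records = [
    ("I-II", "benign", "benign"), ("I-II", "melanoma", "melanoma"),
    ("V-VI", "melanoma", "benign"), ("V-VI", "benign", "benign"),
]
print(accuracy_by_group(records))  # {'I-II': 1.0, 'V-VI': 0.5}
```

A result like the one above, perfect on light skin and coin-flip on dark skin, is exactly the failure mode that aggregate accuracy would mask.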
 
Another problem is that clinical images like the publicly available ones used in Stanford’s study were taken in a clinical setting and won’t necessarily be similar enough to ones taken on people’s smartphones to be an effective training tool.
 
There is no public database of smartphone images, but companies like First Derm have a good idea of how to build one. That’s one reason why First Derm collects the images first and then asks users to pay.
 
“When we come to the payment screen, we’ve already captured their images,” Börve said. “So since these images are anonymous, and our terms and conditions say we can use them for research and to make the service better, we can use these for machine learning. So we have dermatologists that analyze these images, and we use them for machine learning.”
 
Börve thinks that dataset is the real gold mine his company is sitting on. He used the analogy of Google Maps.
 
“If you think about Google Maps — Uber, Lyft, everyone is using Google Maps,” he said. “People are using their API, and when they use the API, they’re feeding more information to Google Maps, which is making Google Maps even better. So nobody can compete against Google Maps. Apple tried to do it and they failed miserably because their API isn’t distributed the way Google’s is. If we can distribute our API we’ll get more data, which we can pour back into machine learning and then our AI will just get better and better and in time we will increase the number of skin diseases that we can master.”
 
His hope is to license an API out to general purpose telemedicine companies like Teladoc, which will then be able to offer First Derm-powered AI dermatology in their own apps. Börve said First Derm has a head start because most AI researchers are focusing on a binary output: cancer or not cancer. First Derm is looking to use AI to tell users what they have, whether it’s acne or eczema.
 
But while such AI is undoubtedly getting better — Panetta thinks the technology is only a year away from commercialization — its public perception hasn’t moved much since 2013.
 
Charalambides’ literature review points to a huge number of dubious apps in the app stores promising to deliver melanoma risk levels. She and the British Association of Dermatologists are warning consumers away from those apps until there is some way to regulate them.
 
“In terms of apps that calculate risk, that’s where the dangers arise,” she said. “These aren’t regulated apps, these apps can give false negatives, they can misdiagnose. They can give false positives as well, which then cause alarm bells, though that’s obviously better in a sense than false negatives. And with apps as well, some issues with patient consent and confidentiality might arise. What’s happening to these photos after you upload them onto the app?”
 
Even well-meaning apps — designed, like First Derm, to build up enough data to train a network — deal with the false negative problem. That’s something Panetta is thinking about as she prepares to build a similar app.
 
“Imagine if we said ‘no you don’t have skin cancer’ and they do,” she said. “Detection is crucial, so there’s a liability issue that you also have to worry about. When we deploy something like this there has to be caution that this is research and until we can validate these results with real medical professionals, it’s just experimental. You don’t want somebody to see this app and say ‘Oops it says I’m safe, I’m going to forget about it.’ There’s always going to be some risk until we get it to 1 million test cases.”