One promising area in which artificial intelligence is rapidly advancing is computer vision, the class of algorithms that process images. Entrepreneurs are in the midst of turning this technology toward healthcare, where algorithms can identify rashes and lesions, measure and analyze wounds, and bring colorimetric testing into the home — all using photos or short videos snapped by our smartphones.
“There’s a unique confluence here that is happening today that has allowed [these companies] to start to exist, which is the intersection of mobile, the prevalence of it, the power of those mobile devices being able to start to deliver on AI and computer vision at the bedside, and the advancements in frameworks especially in and around [AI],” Carlo Perez, founder and CEO of AI-powered wound care tech startup Swift Medical, said. “Think about where we’ve come from, when the world was aghast that Google Brain could find a cat in a YouTube video. We’ve advanced to the point where we can deploy this technology at the bedside en masse and actually bring it to bear on a specific, meaningful application.”
MobiHealthNews dove into the world of AI for dermatology a few months ago — read on below for a look into the world of computer vision for wound care.
A black hole of data
Wound care is an attractive area for AI because the status quo leaves so much to be desired.
“Last year in the UK, the cost of chronic wound management was 5.4 billion pounds,” Yonatan Adiri, CEO of Healthy.io, said. “That’s about 5 percent of the British national healthcare spending. In England, because it’s the government and it’s a single-payer system, they actually measure every pound, every penny. The second piece was, instead of 5 billion pounds, 10 years ago it was 2.2 billion pounds, so this cost is exploding. And the third piece was the current way of measurements, and the current way of monitoring the wounds, is really inaccurate, non-repeatable and the result is that instead of having healing time [for] pressure wounds and skin ulcers of, let’s say six to 10 weeks max, at a certain price point, we’re seeing about 4,000 pounds per wound and 16 weeks because the nurses just can’t handle the throughput.”
The status quo is not only expensive; current methods of measurement are also incredibly imprecise. And when individual measurements are imprecise, measurements of change over time are not very useful.
“Literally the status quo for today is you’ll enter the wound care center or if you’re in home health the home health person will come in, they’ll unbandage the wound and they will take a paper ruler and place it across the wound,” Perez said. “And they’ll end up with 44 percent error from one measurement to the next, eyeballing where that is. The outcome of that is, over time, with that amount of error, one doesn’t understand whether the wound is closing or whether the treatment is the most effective that it can be.”
That’s not even including the standard of care for depth measurements, which requires lowering a wooden stick into the wound.
“There is a black hole of data here,” Adiri said. “We know that measurements are non-repeatable — I can send you a body of articles that show when three nurses measure the same wound they get different results, significantly. It’s a black hole of data.”
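To see why that level of measurement error makes trend-tracking so unreliable, consider a rough Monte Carlo sketch. The 44 percent error figure comes from Perez above; the wound sizes and the uniform error model are our own illustrative assumptions, not data from any of the companies:

```python
import random

def measure(true_area_cm2, rel_error=0.44):
    """One simulated ruler measurement with up to 44 percent relative error."""
    return true_area_cm2 * (1 + random.uniform(-rel_error, rel_error))

random.seed(0)
trials = 10_000
# A wound that truly shrank 10 percent between visits (10 cm^2 -> 9 cm^2).
looks_worse_or_same = sum(measure(9.0) >= measure(10.0) for _ in range(trials))
print(f"{looks_worse_or_same / trials:.0%} of visit pairs hide the improvement")
```

Under these assumptions, a large share of visit-to-visit comparisons point in the wrong direction, even though the wound is genuinely healing — which is exactly the "black hole" the companies describe.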
Tech to the rescue
Companies like Swift Medical, Tissue Analytics and Healthy.io are building systems that replace that inaccurate manual measurement with computer measurements. By taking a short video of a wound rather than a still image, computer algorithms can actually make a 3D measurement, allowing them to automatically measure length, width, surface area and even depth.
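The companies' actual systems reconstruct the wound in 3D from video, but the basic idea — metrics fall out automatically once software can segment the wound — can be sketched in 2D. In this hypothetical example, we assume a binary segmentation mask and a known millimeters-per-pixel scale (in practice derived from a calibration marker); none of this reflects any vendor's real pipeline:

```python
import numpy as np

def wound_metrics(mask: np.ndarray, mm_per_pixel: float) -> dict:
    """Length, width and surface area from a binary wound mask.

    `mask` is a 2D boolean array (True = wound pixel), e.g. the output of a
    segmentation model; `mm_per_pixel` would come from a calibration marker.
    """
    rows, cols = np.nonzero(mask)
    length_mm = (rows.max() - rows.min() + 1) * mm_per_pixel
    width_mm = (cols.max() - cols.min() + 1) * mm_per_pixel
    area_mm2 = int(mask.sum()) * mm_per_pixel ** 2
    return {"length_mm": length_mm, "width_mm": width_mm, "area_mm2": area_mm2}

# Toy example: a 20 x 30 pixel rectangular "wound" at 0.5 mm per pixel.
mask = np.zeros((100, 100), dtype=bool)
mask[40:60, 30:60] = True
print(wound_metrics(mask, mm_per_pixel=0.5))
```

No ruler, no eyeballing: every number is derived from the same segmentation, so repeat measurements of the same image always agree.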
“We took the approach from the beginning that we wanted to fully automate wound measurement,” Kevin Keenahan, CEO at Tissue Analytics, said. “So you take a picture or you take a video, and everything the nurse is doing today, including length, width, surface area, steps, all that stuff, we want it to be fully automatic. … We picked the harder approach there, and it’s been a long road, but at the end of the day I think we’ve got that clinical stickiness from a user point of view. It’ll be a tool that they’ll use not just for a couple weeks but for years.”
At least in the short term, none of these companies are thinking of their computer vision technology as a direct-to-consumer offering. Instead, they hope this is a case where AI can enhance the abilities of doctors and nurses.
“Our framing for this vision or domain is that measuring of the chronic wound should change from rulers to smartphones,” Adiri said. “If we are successful in doing that, then the impact from a patient journey, from a healthcare system’s capability to curb these inflating costs, would be massive. In this field we don’t believe in direct-to-consumer products in the vision of the future of the short- to mid-term, we basically want to allow our nurses to become super nurses through allowing for their smartphone cameras to assist them in the process, and also make the 10-minute process, which is non-repeatable, [into] a three-to-five-second process that is fully repeatable.”
The role that AI in particular has to play here is that deep learning and convolutional neural networks allow these companies to train their algorithms faster than they ever could have even a few years ago.
“At Swift one of the tricks that we’ll do is as soon as you wave the phone around the wound, we’ll automatically measure that wound, which is to say it will identify the wound and provide a 3D reconstruction and it can do that on the phone or any mobile device,” Perez said. “But what’s happening under the covers there, … 10 years ago that would have been done exclusively using computer vision algorithms. As in, a team of engineers would sit down and tell the computer what to look for. Instead of doing that, what we’ve been doing and the way we’ve been able to grow in such a short amount of time is we’ve been able to apply AI algorithms and convolutional neural networks to allow the machine to identify where the wound is inside of the picture, and that is a massive advancement.”
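Perez's contrast between hand-written rules and learned models can be made concrete. The snippet below is purely hypothetical and deliberately crude: the first function is the "old" style, where engineers hard-code what a wound looks like; the second shows the convolution operation that a neural network stacks and whose kernels it learns from data rather than having them written by hand:

```python
import numpy as np

def hand_tuned_wound_mask(rgb: np.ndarray) -> np.ndarray:
    """Old-style rule written by an engineer: 'a wound pixel is one where
    the red channel dominates.' Brittle across skin tones, lighting and
    wound types — exactly the approach deep learning replaced."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > g + 40) & (r > b + 40)

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """The building block of a convolutional neural network: slide a small
    kernel over the image. In a CNN the kernel values are learned, not
    chosen by an engineer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out
```

A real segmentation network chains many such convolutions with learned kernels and nonlinearities; the point of the sketch is only the shift from rules an engineer writes to parameters the machine fits.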
In order to train algorithms, of course, companies need a certain amount of training data. In order to get that data, they’ve had to launch early partnerships like Tissue Analytics' partnership with Wound Care Advantage.
“We employ an online learning system, so as we get more measurements, as we get more images, we’re able to train those in real-time and add that to the training dataset. And then that data reinforces the system, makes it more powerful,” Keenahan said. “Computer vision is going to require a larger dataset because of the sheer heterogeneity of images. Wounds seem like they’re maybe pretty similar, but there’s actually 100 different types of wounds out there and they can all look very different from each other.”
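The online-learning loop Keenahan describes can be sketched with a toy stand-in. Real systems incrementally retrain deep networks; here a nearest-centroid classifier (entirely our own illustrative choice, with made-up features and wound-type labels) plays the same role of getting better with each newly labeled image:

```python
import numpy as np

class OnlineCentroidClassifier:
    """Toy online learner: each newly reviewed image, reduced to a feature
    vector, updates a per-class running mean, so the model keeps improving
    as clinicians submit more labeled measurements."""

    def __init__(self):
        self.sums, self.counts = {}, {}

    def update(self, features: np.ndarray, label: str) -> None:
        # Fold the new example into the running centroid for its class.
        self.sums[label] = self.sums.get(label, 0) + features
        self.counts[label] = self.counts.get(label, 0) + 1

    def predict(self, features: np.ndarray) -> str:
        centroids = {k: self.sums[k] / self.counts[k] for k in self.sums}
        return min(centroids, key=lambda k: np.linalg.norm(features - centroids[k]))

clf = OnlineCentroidClassifier()
clf.update(np.array([0.9, 0.1]), "pressure_ulcer")  # hypothetical labels
clf.update(np.array([0.2, 0.8]), "venous_ulcer")
print(clf.predict(np.array([0.8, 0.2])))  # nearest to the pressure-ulcer centroid
```

The design point is that `update` is cheap and incremental — no full retraining pass — which is what lets new field data flow back into the model in near real time.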
End-to-end solutions are a must
While waving a smartphone over a wound and getting an accurate measurement is the flashy part of the technology, all three companies told MobiHealthNews that in order to get any traction, they needed to build out a lot more.
“[We realized] this was a much bigger problem than just measuring,” Perez said. “This was something that required a holistic solution, not just for the clinic in terms of documentation, but across the continuum of healthcare. … So that was what we built, a solution that uses AI computer vision at the point of care to automate and standardize the wound care documentation workflow.”
Swift Medical’s offering includes a cloud backend for the storage of measurement data, and end-to-end workflows for wound nurses. Similarly, Tissue Analytics' offering not only allows nurses to easily measure the wound, but also includes a full process for storing and tracking that data, including SMART on FHIR workflows that integrate with their customers' EHRs.
These kinds of considerations are important, because EHRs often aren’t equipped to deal with the new kinds of data AI systems are creating.
“Everything we do around dermatology, if it’s remote care, if it’s rashes, everything, usually it’s done by an identification of a medical practitioner, but there’s no records,” Adiri said. “We have never encountered a medical record that has imagery as part of the medical record as standard.”
Finally, companies are also moving into the next step, beyond measuring and tracking and into identification. As the computer learns to recognize different kinds of wounds, companies like Tissue Analytics are also working on letting it offer analysis and treatment suggestions.
“We’ve been in this category where we use the AI to speed up workflows,” Keenahan said. “But now that we’ve been in the market long enough, we have partners like Wound Care Advantage and there’s a company called Mölnlycke. Both our partners have been really helpful in helping us launch clinical decision support capabilities, so after you get your measurement and you have [a] good idea now of how the wound is progressing, if it’s worsened, the same, or better, you can have a little more insight into what the best product category would entail.”
As companies continue to perfect their technology, Swift Medical, Tissue Analytics and Healthy.io are all looking to create seamless, end-to-end wound care offerings — with AI as the system’s invisible heart.
“You fall in love with your technology, but healthcare is not about technology,” Perez said. “The future of healthcare is about how do you bury technology like AI so it fits inside a seamless workflow that is empathetic to the user. … It’s about how do you hide that technology and how do you design with empathy to ensure that workflows are faster, easier, and [users] can just put the technology away and spend their time providing better care.”
Focus on Artificial Intelligence
In November, we take a deep dive into AI and machine learning.