Google's Pixel 4 will include a radar sensor — here's why that could matter for health

A proof-of-concept prototype has already used the same Project Soli sensor to measure glucose levels in a solution.
By Jonah Comstock

At its Made By Google unveiling event in New York City today, Google announced that its Pixel 4 phone will be the first mobile phone with a radar sensor. It's the first public rollout of the company’s Project Soli, an experimental, radar-based gesture tracker co-developed with Infineon.

When VP of Product Management Sabrina Ellis announced the Pixel 4's motion-sensing feature, she showed off applications that let users swipe away notifications without touching the phone, or wave to an animated Pikachu, which waves back.

The company is also aware of potential privacy concerns: the radar data is processed entirely on the phone and is never transmitted elsewhere.

WHY IT MATTERS

Today's announcement was not about health, apart from a quick reference to "personal wellness" as part of Google's future plans for the technology. Still, there's reason to think health applications could be around the corner.

For one thing, digital health has a long history of exploring gesture tracking, largely via the now-defunct Microsoft Kinect and similar sensors. The technology has applications in physical therapy, sterile controls for surgeons and noninvasive health monitoring, among other things.

In fact, last year researchers at the University of Waterloo described a proof-of-concept system that uses Google's Soli to track glucose concentrations in a solution. With further development, the technology could eventually offer people with diabetes a novel, noninvasive way to monitor their blood glucose levels.

That's not to say a Pixel 4 will be able to noninvasively monitor blood glucose out of the box. But it does demonstrate the potential of the underlying technology.

WHAT’S THE TREND

The 2017 discontinuation of the Kinect was a blow to healthcare use cases for gesture tracking, but not a fatal one. We spoke two years ago with a handful of companies in that space that were still going strong. And earlier this year, Microsoft announced that the technology is making a comeback.

But moving the technology to a phone is significant. If radar sensing becomes table stakes for mobile devices, it will open up new possibilities for scale for companies that use gesture tracking. That could include companies like AiCure, which currently uses the phone's camera to hold users accountable for medication adherence, or Klue, which currently uses wearables to monitor eating and drinking behaviors as a proxy for tracking calorie intake.

ON THE RECORD

“Radar’s been around for a long time and it’s still one of the best ways to sense motion. It’s precise, it’s low-power, and it’s fast. … But radar sensors have always been way too big to fit in a phone. So we shrank it down into a tiny chip,” Ellis said at the event. “The Soli team is working on a wide range of helpful new features, from gaming to personal wellness.”