Lessons Learned: What NOT To Do When Integrating Digital Health Devices Into Trials

Pharma companies that are now incorporating digital health devices in clinical trials need to address five important lessons before moving forward.
Sponsored post by Validic

By: Jennifer Plumer, Director of Market Development, Validic

So you’ve decided to include digital health devices in your trial. While that’s a great step for your organization, and an important move toward innovating the R&D process, it’s just the beginning.
 
You now have to select a device that will best suit the needs of both the trial and the patient, and think through the nuances of data collection and analysis. It’s easy to overlook some of the key items that need to be considered, and organizations do so all the time, because device selection is not the core competency of drug developers (nor should it be).
 
The good news is you don’t have to go into this process blindly. According to a recent research report by Validic, 64 percent of pharma companies have already used digital health technology in trials. This means you can take advantage of the lessons learned by those that have gone before you. Here are five lessons learned, framed as things you should not do when integrating devices into trials.
 
Lesson #1 – Don’t assume all wearables are created equal.
 
Device classification. Before you select a device, you’ll want to thoroughly think through exactly what the needs of the trial are. Do you need an FDA (Food and Drug Administration) Class II medical device to collect a primary endpoint? Or will a consumer device suit your needs? There are tradeoffs either way. Class II devices are subject to more stringent requirements, which makes the data they collect better suited for approval submissions, but consumer devices tend to be much more cost-effective and engaging for patients.
 
Battery life. Can you rely on your trial population to charge their wearable device? Many wearables need to be charged every night, while others can last multiple days or weeks.  If the battery dies, you’re not collecting data until it is charged again.   
 
Data storage. How much data the device can hold must be evaluated along with how frequently the data will be synced from the device. Depending on the device and what it is tracking, data may be lost after a week. While you may expect data to sync throughout each day, you should be prepared to handle the exceptions.
 
Blinded device. Some device manufacturers can update firmware to blank out the device’s screen if that is necessary for a blinded trial. Keep in mind that if the patient cannot interact with the device, engagement may drop, along with the likelihood of wearing the device regularly.
 
Endpoints. As sensors become smaller, devices are capable of tracking more endpoints. Consider each endpoint a device collects and determine whether it meets the criteria needed for the study. If you choose a wrist-worn consumer device that collects activity, sleep and heart rate, not all of those endpoints may be sufficient for your trial needs. You may determine that the activity and sleep data from a consumer device are sufficient for exploratory endpoints, but decide to use a Class II device to track heart rate so that it can serve as a primary or secondary endpoint, as sketched below.
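
One way to think about this mixed-device strategy is as an endpoint-to-device plan. The following minimal Python sketch is purely illustrative; the device names and endpoint tiers are hypothetical, not drawn from any specific study.

```python
# Hypothetical mapping of trial endpoints to devices and endpoint tiers.
# Illustrates a mixed strategy: a consumer wearable for exploratory
# endpoints, a regulated device for endpoints used in submissions.
endpoint_plan = {
    "activity":   {"device": "consumer wrist-worn tracker",     "tier": "exploratory"},
    "sleep":      {"device": "consumer wrist-worn tracker",     "tier": "exploratory"},
    "heart_rate": {"device": "FDA Class II heart rate monitor", "tier": "secondary"},
}

for endpoint, plan in endpoint_plan.items():
    print(f"{endpoint}: {plan['device']} ({plan['tier']} endpoint)")
```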
 
Lesson #2 – Don’t assume all of the data is available.
 
You may see REM sleep, heart rate variability (HRV) or second-by-second heart rate within the device’s mobile app and assume that data is available to be passed into your system. However, just because data is collected by the device or displayed in the mobile app does not mean the manufacturer passes it in its standard data set.
 
It’s also easy to assume that the data will come through with high granularity, but this is not necessarily the case; you might receive a daily summary instead. For example, you may expect a granular report indicating exact periods of sleep within a 24-hour period, but instead receive a report stating that the patient slept for seven hours that day. The reality is that a majority of device manufacturers don’t provide granular detail today, but many are open to providing it, as the need for it keeps growing.
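
To make the distinction concrete, here is a minimal Python sketch that computes total sleep from either a granular or a summary feed. The payload shapes are hypothetical, invented for illustration; real manufacturer APIs differ.

```python
from datetime import datetime

def total_sleep_hours(payload: dict) -> float:
    """Return total sleep hours from a granular or a summary payload.

    Hypothetical payload shapes, for illustration only:
      granular: {"sleep_segments": [{"start": iso8601, "end": iso8601}, ...]}
      summary:  {"total_sleep_hours": 7.0}
    """
    if "sleep_segments" in payload:
        # Granular feed: sum the duration of each recorded sleep segment.
        total_seconds = sum(
            (datetime.fromisoformat(seg["end"])
             - datetime.fromisoformat(seg["start"])).total_seconds()
            for seg in payload["sleep_segments"]
        )
        return total_seconds / 3600
    # Summary-only feed: the manufacturer reports a single daily figure.
    return float(payload["total_sleep_hours"])

# The same night reported two ways: two segments vs. one daily total.
granular = {"sleep_segments": [
    {"start": "2024-01-01T23:00:00", "end": "2024-01-02T03:00:00"},
    {"start": "2024-01-02T03:30:00", "end": "2024-01-02T06:30:00"},
]}
summary = {"total_sleep_hours": 7.0}
print(total_sleep_hours(granular))  # 7.0
print(total_sleep_hours(summary))   # 7.0
```

Note that only the granular feed lets you answer questions such as when the patient woke during the night; the summary feed cannot be disaggregated after the fact.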
 
Lesson #3 – Don’t assume you know what the data means.
 
Every device defines measurements differently. Active time on one activity tracker may mean every second you’re not sitting, whereas another tracker only counts movement as active time when your heart rate reaches zone 2 or above.
 
Resting heart rate (RHR) is another great example. On one device, RHR is defined as your heart rate when you first wake up in the morning. Other devices measure RHR as the lowest heart rate recorded throughout the day. While the Consumer Technology Association (CTA) is developing standards around how device manufacturers define heart rate and other measurements, data definitions cannot be assumed.
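
The gap between those two definitions is easy to see in a toy computation. The following Python sketch uses made-up heart rate samples; both the data and the 6:00 a.m. wake-up cutoff are assumptions for illustration.

```python
# Heart rate samples for one day: (hour_of_day, bpm). Illustrative values only.
samples = [(3.0, 55), (6.0, 60), (9.0, 72), (13.0, 78), (18.0, 70), (23.0, 62)]

# Definition A (some devices): heart rate at first wake-up,
# approximated here as the first reading at or after 6:00 a.m.
waking_rhr = next(bpm for hour, bpm in samples if hour >= 6.0)

# Definition B (other devices): the lowest heart rate recorded all day,
# which may occur during sleep.
lowest_daily_rhr = min(bpm for _, bpm in samples)

print(waking_rhr)        # 60
print(lowest_daily_rhr)  # 55 -- same patient, two different "resting heart rates"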
 
Lesson #4 – Don’t focus on only the absolute values.
 
It’s easy to make the mistake of focusing too heavily on absolute values. Many of the early examples of digital health devices in clinical trials analyzed the exact output: number of steps taken or number of hours slept. But what do those individual data points mean?
 
Researchers can glean more valuable insights from device data than just the number of steps taken in a day. The data become much more interesting when activity and sleep data are correlated with other data being collected, inside and outside of office visits. This correlation can provide context as to why activity, sleep or heart rate levels may have fluctuated. Sponsors can then uncover important patterns such as a participant being less active on days that a medication dose is missed or a participant sleeping more after taking the medication, indicating drowsiness as a possible side effect.
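
As a toy illustration of that kind of correlation, the sketch below compares average daily steps on dosed versus missed-dose days. The records are fabricated for the example; a real analysis would join the trial’s own adherence logs with device data and apply proper statistics.

```python
from statistics import mean

# Hypothetical per-day records joining device data with dosing logs.
days = [
    {"steps": 8200, "dose_taken": True},
    {"steps": 7900, "dose_taken": True},
    {"steps": 4100, "dose_taken": False},
    {"steps": 8600, "dose_taken": True},
    {"steps": 3800, "dose_taken": False},
]

# Compare average activity on dosed vs. missed-dose days.
dosed = mean(d["steps"] for d in days if d["dose_taken"])
missed = mean(d["steps"] for d in days if not d["dose_taken"])
print(f"Dosed days: {dosed:.0f} steps; missed-dose days: {missed:.0f} steps")
```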
 
Lesson #5 – Don’t forget to ask for exactly what you want.
 
While many researchers have already begun utilizing digital health devices in trials, we are still in the early stages of this technological disruption. Researchers are learning the ins and outs of patients using devices and the nuances of collecting and analyzing data. And device manufacturers are also learning about the needs of researchers in the clinical trial setting and evolving their product offerings as a result.
 
It’s important for researchers to voice needs and concerns to device manufacturers so they can better understand what is required in the clinical trial setting. If a device is not currently capable of the functionality you need (e.g., a blanked screen or a way to determine whether the device is on the wrist), ask the manufacturer if that can be done. They may not always be able to accommodate a request, at least not immediately. But by posing the question, you could get exactly what you want.
 
Partnering with the digital health community
 
As with any new disruptive technology, there is a period of learning and adjustment. While it may seem like a daunting task to begin integrating devices into trials, the game-changing benefits and insights that the data provide outweigh the risks. Fortunately, there are many resources for researchers to access.
 
The FDA has stated its openness to pharma companies using digital health data in approval submissions and willingness to discuss concerns.
 
The Clinical Trials Transformation Initiative (CTTI) has a dedicated working group, Mobile Clinical Trials (MCT). The goal of the project, according to the CTTI website, is “to propose recommendations that address the scientific and technological challenges inhibiting the widespread use of mobile devices in clinical trials.”
 
There are also digital health platforms that are helping researchers integrate device data into trials, understand and navigate challenges and data nuances, select devices and work with device manufacturers on unmet needs.
 
By working with these industry associations, capitalizing on the expertise of technology companies and learning from trials that have already utilized devices, pharma companies can implement digital health initiatives more quickly, more seamlessly and with fewer pitfalls.
 
About Jennifer Plumer
Jennifer is the Director of Market Development at Validic, the industry’s leading digital health platform. Validic connects actionable data from fitness and sport wearables, clinical devices, biometric sensors and mobile applications to healthcare, pharma, wellness and sport companies. Jennifer has more than ten years of experience building and executing strategic go-to-market and demand generation plans for B2B companies.
