Digital health technologies come with 'responsibility of imagining unintended consequences'

Speaking today in Boston, Luminary Labs CEO Sara Holoubek urged listeners to embrace ethics-focused practices within their organizations.
By Dave Muoio

From Facebook’s Cambridge Analytica woes to news of the first deaths attributable to autonomous cars, the tech industry has lately found itself engulfed in moral controversies.

Unfortunately, the health tech sector is no different. Just this year, multiple consumer exercise apps exposed sensitive GPS and identifying data through their APIs, genetic testing companies continue to grapple with data sharing best practices, care disparities are potentially being fueled by machine learning — and the list goes on.

“These new technologies hold great promise, but with these new technologies comes the responsibility of imagining unintended consequences. And in 2018, we are not short of stories to tell,” Sara Holoubek, CEO of business and tech consultancy Luminary Labs, said during a talk on health innovation ethics held this morning at Massachusetts General Hospital. “People have been thinking about the unintended consequences of technology for a long time. They just don’t live in Silicon Valley.”

Recent years have seen the dawning of what Holoubek referred to as a “new age of digital ethics,” a time when convenient devices and services are ubiquitous in daily life with little understanding among the average user of their wider implications.

“It occurred to me that we as a society do not yet have a common and shared framework to navigate the world we have programmed, right? If you’re a technologist you know that your data is how they monetize everything, they’re taking your data, but most people do not know that,” she said. “And in the case of the [Uber] death, we have swallowed this pill that autonomous vehicles are safer, when in fact they have not yet been deployed. It is a hypothesis.”

While health tech may not be exempt from these challenges, it does have a leg up thanks to the ethical and privacy safeguards already established in medicine, such as the Hippocratic Oath and institutional review boards, she explained. As the field continues to develop at a rapid pace, Holoubek gave audience members five recommendations for how leaders and organizations can ethically and responsibly build their innovations:

  • Read more science fiction. From George Orwell to Margaret Atwood, writers have penned a myriad of ways that new technologies can get out of hand. In fact, some innovation centers have brought science fiction writers onboard to guide their efforts. For everyone else, Luminary Labs’ blog has curated a list of 40 science fiction books, films, and television shows for those looking to flex their tech ethics muscles.
  • Get to know tech ethicists. A number of academics and innovators are leading discussions and raising the alarm with each new tech announcement. Holoubek recommended organizations get in contact with these thought leaders and potentially include their feedback in new designs.
  • Take, or create, an ethical oath. Whether it's adopted at the industry or company level, a guiding code of ethics can help organization members high and low on the corporate ladder maintain practices that limit unintended outcomes, Holoubek said.
  • Engage with frameworks to spur the ethical imagination. Several toolkits are already out in the wild for organizations to reference as they’re developing new technologies. Here, she spotlighted Ethical OS’s eight-part toolkit as a resource for those looking to avoid unintended ethical outcomes as a result of their in-development offerings.
  • Join the community. Whether it’s through Twitter or a more organized collaboration, such as UCSD’s Connected and Open Research Ethics (CORE) group, engaging with like-minded individuals in tech or health can lead to new ideas and insights, Holoubek said.

Importantly, Holoubek told audience members who may be part of a startup or larger enterprise that this kind of mindfulness can't be an afterthought if it has any chance of making an impact.

“If you’re thinking that you’re going to think about ethics later, it’s too late,” she said. “That’s so far down the chain. It has to start at the top. And, if we’re looking at the makeup of boards — one of the reasons you want more diverse boards is to have more lived experiences. If nobody at the board level is thinking about it, why should your CEO care?”

Admittedly, most individuals and consumers have traditionally had little impact on how tech leaders make their decisions beyond deleting an app or voting with their wallets, Holoubek said. In the past couple of years, however, she has been encouraged by those who have managed to effect change by forming vocal, concerted movements.

“I do believe in this idea of popular uprising and demanding change,” she said. “[For instance,] some employees at Google have asked Google to not work with particular parts of the government because they do not want their technology to be used for harm. … Someone circulated a petition: Do not, as an angel investor, invest in companies that do XYZ. I think this is a very interesting political time we’re living in, and the power of collective voices has resurged.”