How to Protect Yourself from the Siren Song of Healthcare IoT

IoT promises to transform healthcare, delivering better care to more people, but it carries significant risks. Just imagine what could go wrong with connected ingestibles. As we revolutionize the healthcare industry, we must align our priorities with the immortal words attributed to Hippocrates: “First, do no harm.” Innovate with care.

Kristina Podnar

When I attend technology conferences lately, I find myself flashing back to the heady days of the early 1990s, when the internet was the final frontier, and anything was possible. After a while, of course, we got smacked on the nose by reality and started asking those “but what about…” questions that make product development less fun, but more effective (and, ultimately, more profitable).

Today, the Internet of Things (IoT) is in much the same place. Connected devices are transforming the world in ways we could barely have imagined back in the 1990s. Here’s a short sampling of how IoT is already changing industries and lives (Hint: Alexa and Google Assistant are still riding with training wheels).

  • In the agricultural industry, farmers are increasing productivity and decreasing costs with IoT tractors that not only drive themselves but use algorithms to calculate the best routing based on factors such as the number of vehicles and each vehicle’s turn radius.
  • In education, IoT devices enable task-based learning. Instead of listening to one-size-fits-all lectures, students work at their own pace via connected devices (e.g. performing a virtual dissection), and the devices notify teachers when students need extra guidance. And wearable devices take over the more tedious tasks, like taking attendance and recording absences.
  • In industrial environments, IoT devices can aid in scheduling and reducing downtime by combining historical records with real-time data to predict breakdowns and schedule preventive maintenance.

I’ve noticed that the greatest excitement is focused on an industry that affects each and every one of us: healthcare.

IoT and the Bright New Future of Healthcare

IoT is already improving care and allowing it to be delivered to more people. People in rural areas, for instance, can avoid long and expensive trips to a medical center. ER visits are reduced when doctors can monitor patients’ vital signs remotely.

And we’re just getting started. It’s an exciting time. It’s no wonder that many entrepreneurial minds are gearing up to participate. Nonetheless, I encourage everyone from developers to healthcare providers (and consumers, too, for that matter) to stop for a moment and remember the words attributed to Hippocrates and long associated with the oath that bears his name: “First, do no harm.”

What Harms Could Stem from Healthcare IoT?

It’s a little thing called “unintended consequences.” They tend to pop up when we get so excited about our ideas that we don’t stop and think about the possible downsides. And the healthcare IoT is full of them. That doesn’t mean we shouldn’t strive for innovations that make people’s lives better; it just means that, before we get too far along in the development phase, we should ask ourselves, “What could go wrong?”

Let’s look at one small slice of the healthcare IoT pie: ingestibles. Doctors at the University of Minnesota Health and Fairview Health recently announced that they’re treating a small group of cancer patients with “digital medicine”: a chemotherapy pill that includes a sensor to let patients and their doctors monitor their dosage—to make sure they’re taking their medicine when they’re supposed to.

That sounds like a good thing, right? Cancer is nothing to mess around with, and complying with a treatment protocol is important. In fact, there are many conditions that could benefit from such real-time feedback. The Centers for Disease Control and Prevention (CDC) estimates that 20 percent of the 3.8 billion prescriptions written in the U.S. each year are never filled. Of those that are filled, half are taken incorrectly, especially when it comes to timing.

The cost of medical noncompliance is shocking. A report in the Annals of Internal Medicine estimates that prescription noncompliance costs the American healthcare system more than $280 billion per year and is responsible for about 125,000 deaths.

Looking at those numbers, it’s hard to argue that ingestibles that monitor compliance would have a downside. But, going back to the “First, do no harm” mandate, developers and healthcare providers would be wise to take a closer look.

How could such products cause harm? Let’s start with these.

Security Risks

One of the biggest concerns developers have about IoT is weak security. A large part of the problem stems from the millions of connected household devices: an estimated 15 percent of IoT device owners never bother to change the default password. Even somewhat competent hackers can use a mere five username/password combinations to access an astonishing number of DVRs, security cameras, and yes, even washing machines.

Now extend that line of thought to connected devices people carry around inside of their bodies. Health information is considered to be some of the most personal data there is. If those ingestible devices aren’t properly secured, could people unknowingly be broadcasting their health status (not to mention all of the other personal data related to it) everywhere they go?


If you’re going to develop ingestible IoT devices, don’t skimp on security, from collection and transmission to storage and accessibility. The risks to consumer privacy and to your organization should a breach occur can’t be overstated. (And I strongly encourage you to include your legal team in all of the conversations we’ll cover here.)
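Two of the cheapest fixes for the default-password problem are unique per-device credentials and message authentication. Here is a minimal Python sketch of that idea; the function names, the provisioning scheme, and the use of HMAC over JSON payloads are my own assumptions for illustration, not a prescription for any real device firmware:

```python
import hashlib
import hmac
import json
import secrets


def provision_device(device_id: str) -> dict:
    """Give each device its own random 256-bit key at manufacture time --
    the opposite of shipping every unit with the same default password."""
    return {"device_id": device_id, "key": secrets.token_bytes(32)}


def sign_reading(device: dict, reading: dict) -> dict:
    """Attach an HMAC tag so the backend can verify the reading came from
    this specific device and wasn't altered in transit."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(device["key"], payload, hashlib.sha256).hexdigest()
    return {"device_id": device["device_id"], "payload": reading, "hmac": tag}


def verify_reading(key: bytes, message: dict) -> bool:
    """Recompute the tag server-side; compare_digest avoids timing leaks."""
    payload = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["hmac"])
```

This sketch covers integrity and authenticity only; a real ingestible would also need encryption in transit and at rest, which is exactly the “collection and transmission to storage” scope mentioned above.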

Privacy Concerns

I would expect most developers to understand that any connected healthcare device would be subject to HIPAA standards. The problem is that we haven’t figured out exactly what that means when it comes to things like ingestibles.

  • Who owns the device? The manufacturer, the doctor, the healthcare system, the insurance company, or the person whose body it’s in? And, if the manufacturer and/or doctor retain ownership, can they retrieve it at will, even if that means forcing a person to undergo an unwanted medical procedure? How would that be enforced? Through the courts?
  • What about the data on the device? Regulations like HIPAA and the GDPR suggest that the data would belong to the individual. So how would that work when it comes to gaining consent? Will your Terms of Service state that you can use all of the information for any purpose? Or will you need to get separate consent for each possible use of the data? Will you need to obtain renewed consent on a regular schedule?
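The per-purpose, renewable consent model hinted at in those questions can be made concrete in a few lines. This is a hypothetical sketch (the class and method names are mine, and the 365-day renewal window is an arbitrary assumption), showing consent tracked separately per data use and expiring unless renewed:

```python
from datetime import date, timedelta


class ConsentRecord:
    """Track consent separately for each data use, with an expiry so it
    must be renewed on a schedule rather than granted once, forever."""

    def __init__(self, renewal_days: int = 365):
        self.renewal_days = renewal_days
        self._grants: dict[str, date] = {}  # purpose -> date consent given

    def grant(self, purpose: str, on: date) -> None:
        self._grants[purpose] = on

    def revoke(self, purpose: str) -> None:
        # The individual can withdraw consent for one purpose without
        # affecting the others.
        self._grants.pop(purpose, None)

    def is_valid(self, purpose: str, today: date) -> bool:
        granted = self._grants.get(purpose)
        if granted is None:
            return False
        return today - granted <= timedelta(days=self.renewal_days)
```

The design choice worth noting is that an unlisted purpose defaults to “no consent”: blanket Terms-of-Service permission is never assumed.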

And then there’s the device itself. Will personal data be stored on the device? If so, can it be erased or deleted remotely, or will it require a medical procedure? Medical device risks are alarming, but they can be mitigated.


Early in the development process, brainstorm as many privacy scenarios as you can come up with, and develop a policy for each of them.


Will the device need regular maintenance? If so, will it require the patient’s participation? In that case, you’ll need to get that consent before the device is ever ingested.

And what if it malfunctions? Can it stay inside the patient forever, or can it cause harm? If it does cause harm, who is liable? And who is responsible for retrieving the device and treating any damage it caused?


Meet with your legal team to discuss liability and risk before you invest too much money in a device that may be too risky to use.

Agency and Autonomy

What happens if the ingested device reports that the patient is not taking medication as prescribed? Does it trigger a phone call from the doctor or pharmacy, possibly counseling the patient on how important it is to take the medication on the right schedule?

What if the patient still doesn’t comply? Current competency laws would probably apply to issues of whether a patient can refuse treatment, but could insurance companies use proof of noncompliance to withhold coverage, or even to deny payment on life insurance policies?


“Who decides whether to force a patient to undergo unwanted treatment?”

“Who decides whether to report this information to insurance companies?”

These are non-trivial questions. In fact, they’re mind-boggling, and they’re questions to discuss in detail with your lawyers at the very beginning of the development process, not after you’ve already got millions of dollars invested.

The Problem of Unintended Use

Now that we’ve covered the technical and legal issues, it’s time to fire up your imagination. This is the point where you try to think of every possible way people could use—or misuse—your device, decide what your position should be, and write a digital policy to address it. Because if there’s one thing that history has shown us, it’s that consumers are ingenious when it comes to using products in ways their developers never intended.

  • Long before DIYers began using it to loosen stubborn bolts and hinges, WD-40 was designed to keep standing water from causing corrosion on nuclear missiles.
  • Play-Doh was invented to clean wallpaper. Legend has it that some Cincinnati preschoolers found a stash and used it to make Christmas ornaments.
  • Rogaine, the go-to answer to hair loss, was originally used as a treatment for high blood pressure.

Unfortunately, not all consumer adaptations are quite so benevolent, as evidenced by product warning labels, like the label on a Dremel tool proclaiming that it’s “not to be used as a dental drill.” Or the Superman costume that warns it doesn’t actually endow the wearer with superpowers or the ability to fly.

When it comes to ingestibles, however, concerns about unintended use are more about potential abuse of power. For example, courts have long ordered breathalyzer ignition interlocks installed in the cars of people convicted of drunk driving. The driver has to breathe into the device and get a passing reading before the car will start.

Ingestible IoT devices could be used the same way. Police departments could give offenders the choice of losing their license or ingesting a device that would monitor their blood alcohol level. If the level were too high, the device could alert police and/or disable the car. Whether or not you would want your device used that way merits some serious discussion—and again, it should take place early in the development process.

A few more examples to consider:

  • Businesses requiring the ingestibles as a condition of employment, with the intention of maintaining a drug-free workplace
  • Parents making them a condition of letting their teens get their driver’s licenses
  • Insurance companies offering incentives to people willing to be monitored

Or—and don’t roll your eyes, because you know it will happen—a company using the devices for branding, beaming their messages from within a person’s body. Would consenting to the branding be a prerequisite for obtaining treatment?

Conclusion: Innovate with Care

On that note, I’ll let you take your burst bubbles and go back to your drawing boards. But it’s not my intention to discourage innovation in healthcare IoT. Far from it: I can’t wait to see what all of the ingenious entrepreneurs out there come up with, and how your inventions will improve healthcare for all. I just encourage you to take the time now to ask yourself all of those “What could go wrong?” questions. I know it’s not fun, but it would be a lot worse a few million dollars down the road!

Written by Kristina Podnar, Digital Policy Consultant at NativeTrust Consulting, LLC