We call it the Internet of Things (IoT), but what we often really mean is the Internet of Personal Data. If data is the new oil, then personal data is the lubricant of IoT. Internet-connected devices are awash with sensitive information, and in this age of hyper-connectivity we are feeling the brunt of the inexorable link between data and device in the form of privacy violations. This article looks at the privacy issues that IoT raises.
When Privacy Goes Wrong
In the last few years, data privacy has had a shiny makeover, put on its heels and a swanky black dress, and entered the mainstream media ball. Data privacy is no longer discussed only at dusty conferences frequented by specialist lawyers; it is here to stay, and regulations like the General Data Protection Regulation (GDPR) are being updated to reflect this.
It’s all Snowden’s fault, of course. He opened the surveillance “can o’ worms.” But his was but a whisper compared to the outrage caused when Facebook and Cambridge Analytica so flippantly disregarded our personal data privacy. It’s in the wake of this heightened awareness of data privacy issues that we look at some of the IoT-based privacy violations of recent times. Privacy is touching us all. It isn’t just a personal issue; it’s also entering the boardroom. Here are five trending reasons to hold onto your data:
Alexa: A Witness For The Prosecution
What if evidence were collected by IoT devices? What would be the implications for judicial processes? In 2015, James Bates of Arkansas, US, was accused of murdering a friend who had been found dead in Bates’ hot tub. The prosecutor built the case around the data held on Bates’ Amazon Echo and his smart meter. Amazon refused to release the data collected by Alexa, and the case could have stopped there. However, Bates gave permission for the data to be used. The case was dismissed in December 2017, but the story hit the news and the defendant’s personal life was brought into the public domain. The saying “no smoke without fire” must have felt especially pointed to Bates during that time.
In another case, still ongoing, involving a Connecticut woman who was murdered in 2015, Fitbit data has come under the spotlight. Prosecutors are building the case on the woman’s GPS-related data, which helped identify her last movements and placed her husband in the frame.
“Creepy Tech” and IoT
IoT has opened up many new ways to interface with users, making privacy and IoT an issue for every consumer. One such interface is visual (cameras), and we’re an image-hungry species: Facebook, for example, has 147,000 photos uploaded per minute. But there’s something about the watchful eye of a digital assistant that’s creeping many of us out. Many consumer IoT products ship with a camera, and vulnerabilities can leave that camera open to abuse.
Recently, researchers at PenTestPartners found a serious flaw in a Swann IoT video camera that allowed a hacker to view video footage from another user’s camera. The hack was remarkably simple: entering a camera’s serial number into the companion app let you view that camera’s live feed (and the serial numbers are easily obtained). Thankfully, Swann fixed the issue very quickly. But camera security flaws have plagued consumer IoT devices since their advent, and possibly the most sinister hacks are those targeting baby monitors. In 2015, Rapid7 failed eight out of ten baby monitors on security compliance, and privacy concerns still plague monitors today. In one recent case, a U.S. mother found her FREDI baby monitor panning across the room and pointing at the spot where she breastfed her baby.
How Are IoT Manufacturers Affected?
It’s likely that IoT devices will be used in more court cases. The data IoT devices collect constructs daily “data journals” of individuals and organizations, and manufacturers may find themselves caught between the data owner and the justice system.
Cameras in IoT products offer important visual functionality, but many of the vulnerabilities found in consumer IoT products stem from issues whose resolutions are well known in the cybersecurity world. Flaws such as unencrypted communication channels and application programming interfaces (APIs) allow interception and hijacking of cameras. Other flaws, such as an easy-to-guess administration password or a predictable device identifier, can also be easily fixed.
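The authorization flaw behind the Swann hack, treating a guessable serial number as the only credential, can be sketched in a few lines. This is a hypothetical illustration (the function names and ownership table are assumptions, not Swann’s actual backend): the fix is to bind each device to an account and to issue random, unguessable tokens rather than trusting the serial number.

```python
import secrets

# Demo ownership table; in a real backend this lives in a database.
DEVICE_OWNERS = {"SN-001234": "alice", "SN-001235": "bob"}

def can_view_stream(account: str, serial: str) -> bool:
    """Authorize a stream request. Knowing a serial number is not
    proof of ownership, so check the account-to-device binding."""
    return DEVICE_OWNERS.get(serial) == account

def provision_device_token() -> str:
    """Per-device secrets should be random, never derived from the
    (predictable) serial number."""
    return secrets.token_urlsafe(32)  # 256 bits of entropy
```

With a check like this in place, merely enumerating serial numbers no longer grants access to anyone else’s feed.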
Abusive Surveillance With IoT
When we think of surveillance, we generally think of the government spying on citizens. However, the issue with IoT surveillance may be closer to home for many folks. A study by University College London (UCL) into the use of technology in domestic abuse found that technology can provide the “means to facilitate psychological, physical, sexual, economic, and emotional abuse as well as controlling and coercive behavior.”
The UCL report considers how abusive individuals can use IoT technology, in particular, as a means to control others. UCL is not alone in its concern about the harmful potential of new, somewhat unstable technologies: eSafety Women is an Australian project that teaches women how to stay safe around technology. As IoT takes hold in our homes, the opportunities to use these devices as tools for spying and abusive control increase.
Manufacturers can help by making sure that mechanisms are in place to prevent this. It isn’t easy, but certain measures can be used. For example, systems that have delegated access need to be designed with abusive users in mind. Data auditing can also help track abusive behavior, although auditing has privacy implications of its own. Police forces should also be educated about the potential for IoT devices to be abused.
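One way to design delegated access with abusive users in mind is to make every grant, revocation, and access attempt visible to the whole household through an audit trail, so covert monitoring leaves a trace. A minimal sketch, with class and method names that are illustrative assumptions rather than any vendor’s API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SharedDevice:
    """A device with an owner, delegated users, and a household-visible log."""
    owner: str
    delegates: set = field(default_factory=set)
    audit_log: list = field(default_factory=list)

    def grant(self, who: str) -> None:
        self.delegates.add(who)
        self._log(f"access granted to {who}")

    def revoke(self, who: str) -> None:
        self.delegates.discard(who)
        self._log(f"access revoked for {who}")

    def access(self, who: str) -> bool:
        allowed = who == self.owner or who in self.delegates
        self._log(f"{'allowed' if allowed else 'denied'} access by {who}")
        return allowed

    def _log(self, event: str) -> None:
        # Visible to all household members, not just the account owner.
        self.audit_log.append(f"{datetime.now(timezone.utc).isoformat()} {event}")
```

The design choice here is that the log is append-only and shared: an abuser cannot silently watch a camera or quietly re-grant themselves access without the event being recorded.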
A Perfect Storm: Health Data and IoT
Kaspersky has identified that smart device attacks tripled in 2018. Couple this with analysis by the Ponemon Institute and IBM showing that health data is the data most targeted by cybercriminals, and there you have it: a perfect storm for damaging data exposure. As more of our highly sensitive health data resides on an ever-expanding attack surface, the privacy of patient data is increasingly at risk. This was starkly demonstrated in Singapore by the attack on SingHealth, which exposed the data of 1.5 million patients, including DNA repositories.
It’s expected that 87% of healthcare organizations will incorporate IoT devices into their operations in some form by 2019, so privacy and IoT must be part of the same discussion.
Services using healthcare IoT devices often operate under strict regulatory controls, such as HIPAA and GDPR, designed to keep patient data safe. Manufacturers need to ensure that the correct security measures are available to secure data against exposure.
Smart Privacy, Smart Grid
The smart grid offers an opportunity to optimize energy consumption. However, concerns have been raised over the privacy of smart grids and the smart meters they rely on. Behavioral privacy is the big issue with smart meters. The Electronic Privacy Information Center (EPIC) has focused on consumer profiling and behavioral privacy, listing 14 areas where smart meter use can expose privacy gaps. These include tracking the behavior of renters/leasers and identity theft. Notably, California has a smart meter privacy law (Assembly Bill No. 1274), which defines best practices for protecting users’ privacy.
EPIC suggests that user-centric control over the “collection, use, reuse, and sharing of personal information” should be built into smart meters, and that anonymization of the data should be a design requirement.
A Shared Future For IoT and Privacy
The data privacy genie is well and truly out of the Internet-connected bottle. As consumers of IoT devices, we must all be aware of how our privacy can be compromised through technology. As manufacturers of such products, however, there are two drivers we should adhere to in order to ensure good privacy practice.
The Specter of Compliance
Regulations like GDPR are tightening the belt of data privacy. Others that are industry-specific, such as HIPAA, or location-specific, such as the California Consumer Privacy Act (CCPA), are baking data privacy into law.
Privacy = Trust