Human Rights and IoT: The Right to Privacy

Privacy is a human rights issue. Technological advances like IoT and AI, which depend on connecting just about everything, raise major privacy concerns. But it isn't doom all the way down. Used responsibly, these technologies could even support the right to privacy.

Hannah Sloan
Image of a computer with eyes, indicating that someone is watching the user through the computer

IoT and privacy have been described as incompatible—ideas crafted from different times with different goals. Connection vs. separation. Agility vs. protection.

As I mentioned in the first post in this series on IoT and human rights, there are plenty of reasons to believe that emerging technologies (IoT, AI, and automation more broadly) conflict with larger policy goals. There are inevitable risks in their adoption and unknowable challenges that come with these cutting-edge innovations. They will disrupt and expose. Things will break. Jobs will be eliminated. It’s not always clear how tech like IoT can be made to serve some “common good” in its immediate implementation, even though it’s designed to increase efficiency and broaden technological horizons.

Privacy in the Digital Age

I wrote before that human rights provide a helpful framework for measuring the predicted effects of IoT adoption. This is because human rights present a generalizable global consensus about the conditions for human flourishing—goals that guide legal systems, government policies, NGO work, etc. Rights endowed to people by virtue of their humanity take on utmost importance; their violation is a denial of opportunity for core human wellbeing.

Privacy is one such right. Article 12 of the 1948 Universal Declaration of Human Rights states the following:

“No one shall be subjected to arbitrary interference with his [or her] privacy, family, home or correspondence, nor to attacks upon his [or her] honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.”

Since this document was written seventy years ago, technology has skyrocketed—literally. While Article 12 (above) clearly aims to protect against invasions of privacy like breaking into someone’s home and stealing their mail, we need to evolve the idea of a right to privacy in this digital age. The storing and sharing of personal information, for example, is a process worthy of serious consideration. It underpins other foundational rights such as freedom of speech and freedom of association.

Let’s explore how IoT interacts with this seemingly aspirational “right to privacy.”

Image Credit: MIT Technology Review

IoT and Privacy Infringements

The first laws governing the use of personal information came out of Europe in the 1970s, when it became clear that new forms of communication needed new forms of protection. Since then, the UN has affirmed that “the same rights that people have offline must also be protected online.” In 2013, the UN adopted a resolution to reaffirm and outline the right to privacy in the digital age, which calls upon governments to be transparent and proactive in how they handle two key privacy vulnerabilities: surveillance and misuse of personal data.

1. Surveillance and Policing

The UN has stated that surveillance “inhibits the free functioning of a vibrant civil society”—or rather, that the vulnerability of digital communications to surveillance does so. If IoT systems are vulnerable to (or subject to) surveillance, there are a host of human rights violations that can occur.

On one hand, surveillance can enable corrupt regimes (or corporations) to exercise power more efficiently and more effectively. The dangers of government surveillance are widely discussed; the dangers of non-government surveillance are just as alarming. Surveillance can empower individuals or governments to monitor and disarm political dissent. It can concentrate knowledge in the hands of the powerful few and allow commercial entities to influence consumer behavior. It can rub up against the limits of police power, particularly in urban areas. It can allow for restrictions on freedom of movement within state borders, a right guaranteed in Article 13 of the UDHR. Spyware, as extreme cases like the Syrian Civil War have shown, can be a very powerful weapon on any front.

You could write a book about all the ways IoT (and other information and communication technologies) can be abused by political and commercial actors. What’s more important is how it will most likely be adopted and used by these entities. Citizen surveillance and security monitoring are already in place in many areas of our world. IoT will only make these capabilities more advanced and more ubiquitous.

An image portraying a human holding up a lock icon, demonstrating the privacy risks of IoT.
Image Credit: Matomo 

2. Misuse of Personal Data

Exposing personal data can be catastrophic. However, even if it isn’t catastrophic, it’s still a violation of rights. Personal data can be gathered digitally, biometrically, genetically, and via video and other media. The kind of data gathered, and what is done with it, affects more than the right to be protected against “arbitrary interference with … [one’s] correspondence.” It bleeds into other foundational rights protected by the UDHR, such as…

  • The right to a fair trial, if communication between a defendant and their lawyer is exposed, for example.
  • The right to freedom of assembly and freedom of association, which could be limited if details about one’s beliefs and identities are accessible via the Internet and monitored by external parties.
  • The right to freedom from discrimination, if associated or proxy data (e.g. algorithmic profiles, voting behavior, religion) is collected and used to block access to information or opportunities.
  • The right to freedom of expression, if (through processes like surveillance) people know or expect that their personal expression will be monitored, which pushes them toward self-censorship and conformity.

As we all may be well aware, there are also subversive effects of “personalized” technologies. The Partnership on AI writes the following: “While technologies that personalize information and that assist people with recommendations can provide people with valuable assistance, they [can] also inadvertently or deliberately manipulate people and influence opinions.”

Influencing consumer behavior is not a rights violation per se, but it can erode the systems that keep other rights protections in place. Even though human rights are mostly individual rights, they require collective systems (like privacy and data use standards) to be upheld in order to secure protections for all.

IoT Can Increase Privacy Protections

Encryption and other security measures found within IoT systems can help promote and protect the right to privacy in many cases. Encryption transforms information into an unreadable format that can be decoded only with the corresponding key. Robust encryption can ensure that certain messages, and thus certain personal data, are safe from prying eyes. In many cases, this can help support the rights to freedom of expression, association, and assembly—especially among vulnerable groups. Oppositional political groups and those fleeing persecution or abuse, for example, can make critical use of these avenues for privacy and safety.
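As a minimal sketch of the idea (not drawn from this article), the snippet below uses symmetric encryption from Python's cryptography package to protect a hypothetical sensor payload. The device reading and the key handling are illustrative assumptions, not a recommended deployment pattern.

```python
# Minimal sketch: encrypting an IoT sensor payload with the "cryptography"
# package (pip install cryptography). Key handling here is illustrative only;
# real deployments need secure key provisioning and storage.
from cryptography.fernet import Fernet

# In practice the key would be provisioned to the device and the backend
# out of band, never hard-coded or sent alongside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical reading leaving a device.
reading = b'{"device_id": "thermostat-42", "temp_c": 21.5}'

token = cipher.encrypt(reading)   # unreadable without the key
print(token)                      # safe to transmit or store

# Only a holder of the key can recover the original payload.
assert cipher.decrypt(token) == reading
```

The point of the sketch is simply that whoever controls the key controls access to the data; the privacy question then becomes who holds that key and under what rules.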

Image Credit: Markus Spiske

However, encryption can also conceal the planning of riots and other threats to public safety. Protecting people’s privacy doesn’t always mean protecting the security of all people. Crime is harder to fight when information is not as accessible to the police. It’s often hard to figure out what kind of personal data is covered by a “right to privacy” and what data is in the public domain. But as I hope I’ve demonstrated, it’s worth making that distinction.

Without a sphere of privacy, all information—however intrusive or misleading—is fair game for public and corporate analysis.

Public safety and national security are the counterweights to the myriad concerns about privacy infringements and IoT. With more data being collected and transformed within IoT systems, safety and security can (in theory) be better monitored by those who hold the data. Privacy can then be protected by the processes used to secure that data. For example, companies like Protenus are using artificial intelligence to help health systems remain HIPAA compliant. The same IoT tools that can be used to surveil citizens (and thus violate their privacy) can also be used to protect the privacy of a home or workplace. IoT can allow almost anything to communicate, and that capability can be used to protect privacy rather than violate it—it all hinges on data ownership and verifiable standards of use.

Privacy and Security Must Work in Tandem

IoT could elevate new voices, expose wrongdoings, and make our world safer and more transparent. At the same time, communication between all things doesn’t serve all people.

Privacy and security are often seen in opposition to one another, but here it is clear that with an increase in IoT deployments, both must be prioritized.

System privacy and security often relate directly to the privacy and security of people and/or assets. Data access and data control remain the key drivers of human rights protections and violations in this arena. Vulnerabilities within an IoT system can directly lead to insecurity and lack of privacy because new hands can access and/or control data.

Privacy may be protected for some groups and not others, which allows further human rights violations to occur. Thus, privacy standards must be introduced to ensure that the right to privacy is protected equally across the board. Otherwise, it will be manipulated or unevenly traded off against other important claims like the right to life, freedom from torture, and the right to social protection for children.

The prioritization of rights is a topic for an entirely different post, but it’s clear that protecting the right to privacy requires stringent protocol transparency, system security, government cooperation, and—perhaps ironically—regulation of data protection. As the world becomes increasingly connected, the right to privacy must be continuously asserted and thoughtfully protected.

The next post in the series Human Rights and IoT will focus on IoT’s role in aggregating information on human rights abuses and responding to disasters proactively. 

Author
Hannah Sloan - Staff Writer, IoT For All
Hannah is a Staff Writer at IoT For All and a 2018 Venture for America Fellow. She studied Economics at UNC-Chapel Hill, and on a good day, you can find her hiking, drawing, and thinking about the sociopolitical implications of IoT adoption.