“[The law] makes assumptions that folks have the time to understand what they are told; folks have the capability to analyze the consequences both short term and long term; folks have the strength (including economic independence) to negotiate the terms of the exchange; folks have alternatives etc.”

Malcolm Crompton, former Privacy Commissioner of Australia
Sometimes, when you get into a discussion about privacy, you’re presented with “I have nothing to hide, so why should I care?” It seems like a peremptory argument. But this stock answer should instead open the door to understanding why all of us, even the model citizens, should be involved in privacy decisions.
The Philosophy of Privacy and Consent in the Modern World
As I thought about how privacy will fit into our smart cities, I looked back at who we are as human beings. Privacy, it seems, is an innate and deeply felt human need. The urge to have private places can be seen across cultures and across time. In a previous post, I looked at “The Origins of Smart City Privacy” through the lens of archaeology. Signs of both personal and group privacy could be seen in even the earliest of cities.
Privacy is a nuanced concept. It can take many forms. And, out of the cities of old, our new smart cities are developing. This development will draw on the need for privacy, but it must also recognize that our instinct for private spaces should extend to an instinct for private data.
As we approach the year 2020, data has inexorably linked the digital and the real. The human-computer interface of old is now so fuzzy and hyper-connected that it’s difficult at times to see where the real-world me ends and the digital one begins. Our data, whether biometric or genetic, personal identifiers like name and address, metadata such as IP address or location, or even points of view, exists across the boundaries of real and digital. The controls we place on these data will define our privacy in the digital realm, and the impact will spill into the real world.
Privacy isn’t about hiding data. Privacy is about controlling data. If you don’t have control over your data, you cannot ensure that you have privacy. But control involves choice, and choice is a grey area. “Consent” is one of the most nuanced and confusing parts of digital privacy.
“Consent, in its purest form, could easily become a dystopian stick to control citizens with.”
When we look at what privacy should be, we need to recognize the hidden problems that lie within. Choice and consent are two sides of a coin that can easily flip. For example, Facebook and Google were hauled over the compliance coals because they offered consent with no choice: if you chose not to consent to sharing certain data, you were denied use of the service. Some might argue it’s up to Facebook, et al., to decide how and why they allow their services to be used. Max Schrems argued that this lack of granularity in consent was in direct contravention of the EU’s privacy law, the General Data Protection Regulation (GDPR). Schrems’ legal action group, Noyb, has ongoing “forced consent” projects, and so far Google has been fined €50 million by the French supervisory authority (CNIL) for consent violations. Consent can be a force for control, but one that works both ways.
As we share our data with smart city builders, we need to understand the complexities within the spectrum of consent. One such issue is the creation of “privacy tiers.” Consent could easily become a commodity along with the data it hopes to protect. A free service like Facebook may end up offering a “free” tier in exchange for sharing your data or, if you can pay, a tier where you keep control over it; consent, under those circumstances, could easily become the Achilles’ heel of technology.
The Erosion of Digital Privacy
I could argue that our digital privacy has been less “eroded” than “only just realized.”
As the computer became more widely used in the 80s and 90s, privacy considerations were much less discussed than they are today. In many ways, privacy was barely equated with computing. There’s a reason for this. Back in 1990, if you had access to a computer, it was within a corporate network or in your own home; a floppy disk was the nearest thing to “connectivity.”
In 1990, I used an IBM-compatible PC. Connectivity came in the form of “bulletin board systems,” or BBSs, which you reached over a phone line connected to your computer (the old dial-up); CompuServe offered an early service along similar lines. If you misdialled the number, you were often met by an eerie disembodied voice. Few outside the techie world used these boards. They offered file sharing, forums, places to get programming advice and so on. I would regularly share personal information, never really thinking it would be misused. Perhaps this was naivety? Privacy was on the agenda back then, but in a quieter way.
In a 1985 article for Whole Earth Review, Larry Hunter, then a Yale graduate student in artificial intelligence, wrote:
“Without any conspiratorial snooping or Big Brother antics, we may find our actions, our lifestyles and even our beliefs under increasing public scrutiny as we move into the information age.”
How true those words echo as we enter the era of surveillance capitalism.
We find ourselves, in the final months of 2019, in a world where our data has become a commodity. Big tech consumes data like a hungry animal.

Our data already exists as a massively interconnected entity. Once this data is fed into the expanding smart cities across the world, we will create a Digital Panopticon.
Digital or physical, surveillance is nothing new. The urge to control is a central part of human nature, and to manage its worst excesses, we need to understand where it comes from.
As smart cities begin in earnest, we must stand back and look at where we are in terms of data privacy. What kind of structures can manage the collection, aggregation, analysis, and sharing of data, en masse? How can we ensure those structures reflect how human beings understand and crave privacy? Can these structures be used for both personal and group privacy, in accordance with the types of privacy expected by our forebears?
In my next article, I’ll look at why data privacy matters and why “I have nothing to hide” isn’t the answer.