In the current business paradigm, structured by big tech firms over a decade ago and replicated since by a number of online platforms, individuals willingly provide their personal information in exchange for a service. Personal data is subsequently repackaged—anonymized or not—and sold to advertisers and marketers.
In theory, some form of equilibrium establishes itself between convenience and value for the users, to whom the services appear to be free, and financial rewards for the platforms, which monetize the personal data they collect to generate revenues. In reality, the balance has tilted over time, in colossal proportions, toward tech firms, whose valuations and profits have soared to historic heights as a result.
Yet, it seems that this model may be reaching a crisis point. The Pew Research Center recently reported that 64% of Americans have personally experienced a major data breach. Beyond the technical and financial consequences, such breaches also erode confidence, as highlighted by the Axios-Harris Poll 100, which suggests that Americans consider data privacy the top priority companies should address.
This follows a record number of data breaches reported since 2016, particularly in the US. After scandals such as the one surrounding Facebook and Cambridge Analytica, people have become increasingly aware of their online vulnerability, and distrust of large platforms is setting in more widely.
The unavoidable rise of the Internet of Things (IoT) will only make the issue more complex, as ever more intrusive and personal data will be collected about each of us, sometimes unbeknownst to us. Smart speakers, for example, bring intensive, real-time data collection into the comfort of our own living rooms. This poses new challenges around consent and privacy: Why would you agree to have your conversations with friends recorded simply because they own a certain type of smart assistant device?
The EU General Data Protection Regulation (GDPR) was the first step in ensuring individuals’ hold over their data. Yet, it remains a European initiative, and it is still too early to draw conclusions on its actual impact.
In some industries, the quality expectations around data are so high that organizations willingly pay for it. This is the case, for example, in academic research, where field experiments often require highly precise data. Researchers typically conduct in-depth surveys of large numbers of participants (to obtain representative samples) who are remunerated for their answers.
If big tech firms were to move toward similar business models where they pay their users for their personal data, could it be a step toward more transparency and the rebuilding of trust? (This could be a key factor for their long-term success, and the digital transformation of society at large, free of excessive regulation.)
The question of data monetization should be at the heart of the tech industry's evolution over the next decade, but how can it be fair to all, transparent, and workable at scale?
A first approach to this question could be a top-down valuation of personal data established by the online platforms themselves. For example, economists designed an experiment to value the free services, currently subsidized by the trading of personal data, of some web platforms. On average, participants responded that they were willing to leave Facebook for $110 and WhatsApp for $600, thus valuing the personal data they deliberately provide to those platforms at those prices.
Such a model would require tougher enforcement of cybersecurity measures to guarantee that data usage is, and remains, as authorized by the user. Since platforms would now be paying for our data, chances are they would be able to use it but not possess it. In the case of a data breach, they would incur greater liability toward the users, who would own and merely "lend" their data.
An alternative to this top-down model, built by and for large platforms, is nonetheless possible. Tokenization, the process of replacing and valuing data with a secure identification symbol, offers such a perspective. If tokens were backed by personal data and recorded on a public blockchain, a new, bottom-up market mechanism could emerge.
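As a minimal sketch of the idea (the function names and the salted-hash scheme are illustrative assumptions, not a real data-token protocol, which would involve cryptographic commitments and on-chain registration), tokenization replaces a piece of personal data with an opaque symbol that can circulate without revealing the data itself:

```python
import hashlib
import secrets

def tokenize(personal_data: str) -> tuple[str, str]:
    """Replace personal data with an opaque token (a salted hash).

    The token can be published or traded without revealing the data;
    only the holder of the original data and salt can prove it is theirs.
    """
    salt = secrets.token_hex(16)
    token = hashlib.sha256((salt + personal_data).encode()).hexdigest()
    return token, salt

def verify(personal_data: str, salt: str, token: str) -> bool:
    """Check that a token really commits to the claimed personal data."""
    return hashlib.sha256((salt + personal_data).encode()).hexdigest() == token

token, salt = tokenize("alice@example.com")
assert verify("alice@example.com", salt, token)
assert not verify("bob@example.com", salt, token)
```

The salt prevents anyone from guessing the token of common values (such as email addresses) by brute force; the user keeps the data and salt, while only the token is exposed to the market.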
Cryptocurrencies might be a channel through which individuals value their data. Large platforms needing to access and crawl data, such as Facebook or Google, would purchase the tokens corresponding to personal data directly from their miners (their users) in a secure way. In turn, the aggregation of tokens sold by individuals and purchased by firms would create a new market, thereby establishing a natural valuation of the available tokens.
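How such an aggregate valuation could emerge can be illustrated with a toy matching model (all prices are hypothetical, and a real token exchange would be far more involved): user asks are matched against platform bids, and the marginal mutually beneficial trade sets the market price.

```python
from typing import Optional

def clearing_price(asks: list[float], bids: list[float]) -> Optional[float]:
    """Match the cheapest asks with the highest bids and return the
    price of the last trade that still benefits both sides."""
    asks = sorted(asks)                # users willing to sell, cheapest first
    bids = sorted(bids, reverse=True)  # platforms willing to buy, highest first
    price = None
    for ask, bid in zip(asks, bids):
        if bid < ask:                  # no more mutually beneficial trades
            break
        price = (ask + bid) / 2        # split the surplus of the marginal trade
    return price

user_asks = [1.0, 1.5, 2.0, 4.0]       # dollars per token (hypothetical)
platform_bids = [3.0, 2.5, 1.2]
print(clearing_price(user_asks, platform_bids))  # prints 2.0
```

The point of the sketch is simply that the valuation is bottom-up: it is produced by individual users' willingness to sell, rather than decreed by the platforms.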
Such a mechanism offers interesting perspectives: users could easily track the value of their personal data and actively engage in a data market whose growing benefits they would share directly.
Yet, answering the question of scalability isn't that easy. The boom and bust of the ICO market in 2018 showed that the relative convenience and liquidity of tokens can easily fuel large speculative trends. While trading in cryptocurrencies that were not backed by tangible products or services had no real impact beyond financial losses and ecological damage, a similar episode involving our personal data would present very high risks for society.
The scalability of such a project thus presents as many hopes as risks, and both should be investigated further before diving into a new paradigm.