In a speech at last year’s conference in Berlin, software designer and privacy advocate Aral Balkan posed a provocative question – “If slavery is the business of buying and selling physical human bodies, what do you call the business of selling everything else about a person that makes them who they are apart from their physical body?”
Last week, CNBC reported that a large social media network (the same one currently being questioned by the U.S. Congress) was, as recently as last month, in talks with top hospitals and other medical groups about a proposal to share social network data on their most vulnerable patients.
The project did not progress beyond the planning phase, but the trend is downright scary. When does sharing, matching, and profiling go from being a benefit to an outright invasion of privacy?
The digital cloning of ourselves is happening on a daily basis – minute by minute. Anybody today who accesses the Internet is having copies made of their digital identity.
Advertisers record every click and subsequently track our activity across the web. Data brokerage companies then assemble this information, making the invisible visible and accessible to anybody willing to pay.
Of course, steering consumer behavior in the direction of profitability has always been the goal of advertisers and corporations. This is normal, as it should be. What has changed, however, is the explosive availability of personal data from a variety of sources, which is incentivizing manipulation on a large scale.
Our digital reflections are being bought, sold, exploited and experimented on. The more robust our data selves become, the more effectively they can be understood and manipulated.
Mathematician Cathy O’Neil, author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, writes that companies use data to direct people toward certain goods and services, and to offer prices based on how much they think an individual can pay. O’Neil and other researchers also describe the use of secret algorithms to profile and sort individuals into groups based on weaknesses and vulnerabilities identified through their online activities.
On May 25, 2018, the General Data Protection Regulation (GDPR) will take effect, seeking to strengthen and harmonize privacy law across all EU nations and to create new individual privacy rights. It applies to any company that processes the personal data of EU individuals, even when the company is not located in the EU. The new rights include the “right to be forgotten” (having one’s personal data erased in certain situations), data access and rectification (obtaining a copy of one’s own personal data and correcting inaccuracies), and data portability (transferring one’s personal data from one company or platform to another).
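To make these rights concrete, here is a minimal sketch of how a service might back data-subject requests with code. The class and method names (`UserDataStore`, `export_user_data`, `erase_user`) are hypothetical illustrations, not a real API or a legally complete implementation.

```python
import json

class UserDataStore:
    """Hypothetical in-memory store supporting GDPR-style requests."""

    def __init__(self):
        self.records = {}  # user_id -> dict of personal data

    def rectify(self, user_id, field, value):
        """Right to rectification: correct or set a field."""
        self.records.setdefault(user_id, {})[field] = value

    def export_user_data(self, user_id):
        """Right of access / portability: return the user's data
        in a machine-readable format (JSON)."""
        return json.dumps(self.records.get(user_id, {}), indent=2)

    def erase_user(self, user_id):
        """Right to be forgotten: delete all personal data held.
        Returns True if anything was actually erased."""
        return self.records.pop(user_id, None) is not None

store = UserDataStore()
store.rectify("u1", "email", "alice@example.com")
print(store.export_user_data("u1"))  # portable JSON copy of the data
print(store.erase_user("u1"))        # True: the record was deleted
print(store.export_user_data("u1"))  # nothing left to export
```

In a real system the hard part is not the API surface but completeness: erasure must reach backups, analytics copies, and any third parties the data was shared with.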
With penalties for violations of up to 4 percent of global gross revenues in some instances, the GDPR is being taken seriously by U.S. companies as well. As it turns out, the policies and procedures these companies are implementing to protect their European customers and employees can also benefit Americans.
As a result, one no-go area is the use of third-party data for marketing campaigns. Buying data from companies that harvest email addresses (for which consumers unknowingly give consent because it is hidden in the small print) is a very risky practice under the GDPR.
And how about profiling? It remains permissible, but only if automated profiling does not result in discriminatory or exclusionary practices.
But with this law, is the EU potentially hitting the brakes on innovation and punishing businesses for situations beyond their control? The privacy of the individual is hugely important to the EU, especially when set against intrusive big corporations and governments. The EU wants to give individuals access to the data that companies store about them, yet the same EU is not fully transparent about its own operations, for example, how lobbyists try to shape policy in their favor.
The EU is a hugely bureaucratic organization that regulates business across its 28 member states. It also holds vast swathes of data on approximately 510 million citizens and the businesses that operate within the region. But the legislation is here, and as such it must be complied with.
Sophisticated cybercriminals have recently targeted social media platforms and misused personal data. The EU is now regulating partly to protect citizen data against such attacks, but can one argue that the GDPR has little to do with cybersecurity and with putting cybercriminals behind bars?
A peculiarity of the GDPR is that data processors (outsourced cloud providers, for example) will not be held responsible if they lose data that is controlled by their client. This gives them less incentive to increase security.
The GDPR might slow data-driven innovation because companies will hold back on accessing or storing data for fear of prosecution.
We do not yet know what will happen. Will GDPR damage new job creation?
There is no doubt that data is transforming our lives, but this is taking place in an environment of rapid technological change, and so decisions on how our data can be used also have implications for our future.
If we hope to preserve our democracy and autonomy, we must recognize what we call “human digital rights.” Just as with physical human rights, our digital footprint is part of what makes us human and belongs to us. At least, this is what we believe in our human hearts at Geme.io.