Data Privacy and the Necessity of Governance Policies
Data, in most cases, must be a shared resource. Ironically, being shared is both what gives data its immense power and what makes breaches of privacy an issue.
So, policies surrounding shared data systems become an absolute necessity. That is what creates the need for data trusts. A data trust is a legal structure that provides independent stewardship of data. That probably sounds vague and confusing, so let's go back a bit.
Let's start with privacy. More specifically, privacy as a collective concern in our society. The common, popular conception goes: if every individual makes the right decisions and protects their own privacy, it will somehow add up to secure data protection for society as a whole. Unfortunately, nothing could be further from the truth.
Take the COVID-19 pandemic, for example. We know that if every individual wore a mask, disinfected their everyday tools, and followed health regulations, COVID-19 might be eradicated. But you cannot expect everybody to act responsibly to the same degree. People tend not to abide by the rules, whether out of ignorance or carelessness, unless the rules are strictly enforced and an overseeing body maintains them.
Some Case Studies
Where the details of privacy handling and regulation are foggy, Murphy's Law tends to take over: whatever can go wrong, will go wrong. And that is what happened in the following events.
The Watergate Scandal
The infamous Watergate scandal resulted in the resignation of President Richard Nixon. After the 1972 break-in and burglary at the Watergate office building in Washington, D.C., five men were arrested. During their trial, witnesses testified that the president had approved plans to cover up administration involvement in the break-in, and that a voice-activated taping system existed in the Oval Office. Nixon did everything in his power to keep these tapes from being exposed.
After several legal proceedings, the U.S. Supreme Court ruled that Nixon had to release the Oval Office tapes to government investigators. The tapes revealed that Nixon had conspired to cover up activities that took place after the break-in and had attempted to use federal officials to deflect the investigation. With his complicity in the cover-up made public and his political support completely eroded, Nixon resigned from office on August 9, 1974.
Cambridge Analytica

Cambridge Analytica Ltd (CA) was a British political consulting firm that came to prominence through the Facebook–Cambridge Analytica data scandal. The firm obtained millions of Facebook users' personal data without their consent. The mechanism worked like this: if one person agreed to share their personal data, it also revealed their entire network of Facebook friends, and from there the firm could expand outward to fetch those friends' data. The data was reportedly heavily misused and obtained for unfair motives, predominantly political advertising. The company closed operations in 2018 amid the public outrage it caused.
Sidewalk Labs in Toronto

One of the most recent rumbles over data privacy was the project Sidewalk Labs wanted to initiate in Toronto, Canada. In June of 2019, the project was heralded as "a neighborhood built from the internet up – the most innovative district in the world." That elaborated into a plan for a beautiful, eco-friendly, technologically advanced neighborhood that sounded like a utopia. So what was the catch?
Well, of course, the problem was privacy breaches through data collection. If the Sidewalk Toronto project had come to fruition, it would have installed occupancy sensors in every home in the community to adjust temperature and minimize energy use throughout the day. It would have established an expansive network of cameras and used AI to analyze traffic patterns, monitor traffic speed, and predict collisions. Even the streets would have collected data and responded accordingly; "smart" roadways would have used LED lights to dynamically change lane width to accommodate different types of commuters. The high-tech plan drew criticism, mainly over the robust data collection proposed for the community. And for that reason, Sidewalk Labs had to step away from the project.
Coming back to the problems with common data-sharing systems:
Decision Fatigue and Power Asymmetry
Yes, it's about those endless parades of a hundred companies asking us to say "yes" to their privacy policies. We might actually read the first one or two policies we ever encounter, then click "yes" every time afterward anyway. First, we grow bored and tired of the details, and our brains convince us it isn't such a bad thing. Second, in most cases we don't have any other option: we either click yes or go without the service.
Let's say you did give the policy a thorough read, and you go: "Hey, wait a second. In this part, article 4, subsection 9, I do not agree to give them this particular data." What can you do? You can send the company an email about it. However, you're only one individual. Maybe a million others have complied unknowingly, or had no problem with it. There is a tremendous power asymmetry between you and a company serving millions, so they're very likely to ignore your objection.
Negative Externalities in Data Sharing
Circling back to the "privacy and sharing data" issue: you may think sharing your private data affects you, and you alone. But that is not the case at all. For example, if you give a biology organization your DNA information for research, you're not only giving out your "own" DNA information. DNA overlaps significantly within a family. That means you're essentially exposing a lot of sensitive data about your family and your relatives along with yourself. Some of those people may not be comfortable sharing that information, fearing it might be used for immoral purposes. On the other hand, if you refuse to share your data, you might be holding back scientific progress. Quite a double-edged sword, isn't it?
Another example is your data on Facebook. If you share it with a third party, they may also get information about your friend list, and from that they can go on to collect your friends' data too. Recall the Cambridge Analytica example above.
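To make the externality concrete, here is a minimal sketch in Python. The social graph, names, and function are all hypothetical (this is not any real platform's API); it just shows how one person's consent can expose data about everyone in their friend list.

```python
# Hypothetical toy social graph: each user maps to their friend list.
social_graph = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice"],
    "dave": ["bob", "erin"],
    "erin": ["dave"],
}

def exposed_by_consent(graph, consenting_users):
    """Return every user whose data becomes reachable when the
    consenting users share both their profiles and their friend lists."""
    exposed = set(consenting_users)
    for user in consenting_users:
        # Friends never clicked "yes", yet their data is exposed too.
        exposed.update(graph.get(user, []))
    return exposed

consented = {"alice"}
reached = exposed_by_consent(social_graph, consented)
print(f"{len(consented)} consented; {len(reached)} exposed")
# 1 consented; 3 exposed
```

One "yes" exposes three people. Repeat this across millions of users, as Cambridge Analytica did, and the non-consenting majority ends up in the dataset anyway.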
The takeaway from all this is that sharing even your own personal data may expose others. An authorizing entity must thoroughly review all the consequences and externalities of data sharing. And that is exactly what data trusts were set up to do.
Check out our next blog, where we elaborate on what a data trust is.
Contributor: Istiaq Bin Salam Siaam