
Could GDPR have prevented the Facebook-Cambridge Analytica scandal?

The Facebook-Cambridge Analytica scandal put a much-needed spotlight on the fragility of personal data protection in the light of GDPR. Could the GDPR regulations have prevented this? Read our take.

As companies scrambled over the past year or two to comply with the new GDPR data protection requirements, it was clear that panic and frustration overtook many of these enterprises. Were the new rules simply a means to more bureaucracy and overhead?

How do we even begin with a data governance plan?

In the midst of all the preparation, the world witnessed the Facebook-Cambridge Analytica scandal, and its timing could hardly have been more pointed. The high-profile case broke just a few weeks before May 25th, the looming GDPR implementation deadline. The scandal put a much-needed spotlight on the fragility of personal data protection and on how important it is to consumers, who are growing warier and more doubtful by the day about how their data is being used. The 87 million users whose profile information was mined for targeted political and other campaigns trusted Facebook to protect their personal data, but as Mark Zuckerberg himself admitted and later apologized for, that trust was breached.

GDPR & The Facebook-Cambridge Analytica Scandal

Legislation like the European Union's General Data Protection Regulation (GDPR) is critical for holding data controllers accountable for how they use our personal data. So how would Facebook have fared if the Cambridge Analytica affair had been investigated by GDPR enforcers? In this blog post, we'll show you just how important it is to remain GDPR compliant and how much value enterprises must place on keeping their customers' data secure.

Consent According to GDPR

In general, the GDPR acknowledges that different protection measures are required for different levels of data sensitivity, and the same is true for consent. The more personal the data is and the longer it will be retained, the more explicit the consent must be, such as actively selecting an “I consent” option. In other cases, consent should be, at the very least, unambiguous, such as filling in an optional email address.

Would GDPR enforcers have found that Facebook did an acceptable job getting consent from users to collect and store their personal data? And what about the personality survey app itself, which was downloaded by some 300,000 Facebook users? In 2014, when Cambridge University researcher Alexander Kogan launched the app, Facebook’s terms of service with third-party developers permitted harvesting profile data belonging to friends of consenting users who had not disabled the default permission to do so. The app’s terms of service stated that it would collect and use data about users and their friends. Thus, Cambridge Analytica could argue that when users accepted the terms of service, they explicitly consented to the collection and use of their personal data.

While they may have been let off the hook regarding consent, both Facebook and Cambridge Analytica were non-compliant with another fundamental GDPR principle: privacy by design. This principle states that apps should be developed and configured so that, by default, their users benefit from the highest level of data privacy, without any action required on the users’ part. In light of initial media coverage back in 2015 about Cambridge Analytica’s use of data in US political campaigns, Facebook amended its API terms of service and removed the open-ended privacy default setting in its own app.
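To make the privacy-by-design idea a bit more concrete, here is a minimal, purely illustrative sketch in Python. The `AppPrivacySettings` class and the `grant_consent` helper are hypothetical, not any real Facebook or GDPR API: the point is simply that every data-sharing option starts at its most restrictive value, and anything beyond that requires an explicit opt-in.

```python
from dataclasses import dataclass, replace

@dataclass
class AppPrivacySettings:
    """Hypothetical per-user privacy settings for a third-party app.

    Privacy by design: every sharing option defaults to its most
    restrictive value, so users get maximum privacy with no action
    required on their part.
    """
    share_profile_fields: bool = False  # basic profile data stays private by default
    share_friend_list: bool = False     # friends' data is never exposed by default
    retain_data_days: int = 0           # nothing is retained unless explicitly allowed

def grant_consent(settings: AppPrivacySettings, *, profile: bool = False,
                  friends: bool = False, retention_days: int = 0) -> AppPrivacySettings:
    """Return new settings reflecting only what the user explicitly opted in to."""
    return replace(
        settings,
        share_profile_fields=profile,
        share_friend_list=friends,
        retain_data_days=retention_days,
    )

# Default state: no data is shared or retained.
defaults = AppPrivacySettings()
assert not defaults.share_profile_fields and not defaults.share_friend_list

# After an explicit, unambiguous "I consent" opt-in, only the selected option changes.
after_opt_in = grant_consent(defaults, profile=True)
print(after_opt_in)
```

The contrast with the 2014 situation is the key point: under privacy by design, sharing friends’ data would have required each friend to opt in, rather than relying on a permissive default they never touched.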

Breach Notification

The GDPR is motivated by a strong concern that unlawfully exposed personal data could cause irreparable economic and/or social damage. Among its other important data protection pillars, it clearly stipulates that the data controller must notify supervisory authorities within 72 hours of becoming aware of a personal data breach.

Would GDPR enforcers have found that Facebook was diligent in notifying its users that their profile data had been breached? It’s possible that they would not have considered the Cambridge Analytica debacle an actual breach, since the data was gathered lawfully from public sources and not by breaking into Facebook’s closely guarded internal database of user information. Still, it’s interesting to look at how Facebook handled the matter at two different points in time.

Cambridge Analytica’s egregious harvesting and use of Facebook users’ profile data first went public in 2015. Facebook’s initial response was to ask Cambridge Analytica to erase the data. Facebook also closed the loophole that Cambridge Analytica had used for its indiscriminate data mining. The users whose profile data was harvested were not notified at that time. When the story went viral in March 2018, Facebook, as part of its damage control, notified all 87 million affected users. Although this action may not have been legally required, it was certainly in line with the GDPR ethos that data controllers must take their obligations to protect the privacy of personal data very seriously.
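As a simple illustration of that 72-hour window (a hypothetical helper, not part of any real compliance tooling), the notification deadline follows directly from the moment the controller becomes aware of the breach:

```python
from datetime import datetime, timedelta, timezone

# GDPR Article 33: notify the supervisory authority within 72 hours of awareness.
BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(aware_at: datetime) -> datetime:
    """Latest time by which the supervisory authority must be notified."""
    return aware_at + BREACH_NOTIFICATION_WINDOW

# Example: the controller becomes aware of a breach at noon UTC on a Monday.
aware_at = datetime(2018, 3, 19, 12, 0, tzinfo=timezone.utc)
print(notification_deadline(aware_at))  # 2018-03-22 12:00:00+00:00 (Thursday noon)
```

The clock starts at awareness of the breach, not at its discovery by the press, which is exactly why the 2015 and 2018 responses described above would be scrutinized so differently.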

Clear Purpose

In response to years of apps gratuitously gathering users’ personal data, the GDPR clearly espouses the principle of data minimization. This means that data controllers can only collect and use personal data as appropriate for expressly stated objectives. For example, you can’t tell users that you are gathering their personal data to define their digital personality and then use it to send them targeted messages during political campaigns.

Would GDPR enforcers have found that Facebook and the Cambridge Analytica app were upfront about why they were collecting personal data? And were they diligent about using the data purposefully? The answer would seem to be a resounding “no” when it comes to Cambridge Analytica. And Facebook would need to explain why it failed to follow up aggressively on whether or not Cambridge Analytica had actually erased the data when asked to do so in 2015.

The Bottom Line

Because Cambridge Analytica recently closed operations, it doesn’t really matter how it would have fared with the GDPR enforcement authorities. However, if the GDPR had already been in effect during the scandal and Facebook had been deemed non-compliant, it could have been fined up to 2% of its annual global turnover (roughly $814 million in 2017 terms). If Facebook had been found guilty of an actual breach that endangered the rights and freedoms of the data subjects, the fine could have reached 4% of its annual global turnover.
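To put those percentages in perspective, here is a quick back-of-the-envelope calculation. It assumes, as the $814 million figure above implies, annual turnover of roughly $40.7 billion for Facebook in 2017; the GDPR caps fines at “up to” these percentages (or fixed euro amounts, whichever is higher), so these are maximums, not automatic penalties.

```python
# Rough illustration of the two GDPR fine tiers applied to the article's figures.
annual_turnover_2017 = 40.7e9  # assumed 2017 annual turnover in USD (~$40.7B)

lower_tier_fine = 0.02 * annual_turnover_2017   # up to 2% for lesser infringements
upper_tier_fine = 0.04 * annual_turnover_2017   # up to 4% for graver infringements

print(f"2% tier: ${lower_tier_fine / 1e6:,.0f} million")  # ~ $814 million
print(f"4% tier: ${upper_tier_fine / 1e6:,.0f} million")  # ~ $1,628 million
```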

The bottom line is that it is incumbent on every organization, large or small, to make sure it is GDPR compliant and has laid the foundation for data governance implementation. We are already seeing the State of California take matters into its own hands with CA AB 375 (the California Consumer Privacy Act), recently signed into law, which includes GDPR-like disclosure requirements, consumer rights, and penalties for noncompliance. It seems we can now raise a glass to a new era in which companies are held accountable for safeguarding our personal data.
