#WorstEndingEver? Maybe

As people scurry to exit Facebook, soberly acknowledging they sort of knew how their data was being used, the idea of a digital ending starts to become a need. The irony is that the essence of the Cambridge Analytica story isn’t one of theft, or any clearly stated crime. It wasn’t the ‘hack’ some papers have called it, or the ‘data breach’ others have described. The essence is a bad ending, left lingering until it was abused.

Facebook supports thousands of research projects across the globe. A whole arm of the company is set up to research diverse areas of computer science, and providing academics with access to data is a common request. In fact, in many cases it is the primary function of the relationship. The research section of the company proudly states,

“They take on the most challenging problems in computer science and related fields, to push boundaries that impact millions of people every single day.” Well, that now has a colder, more sinister tone.

The work originally done by Dr Aleksandr Kogan in 2014 would have faded into the thousands of other research projects being run with Facebook. His original survey, ‘This is Your Digital Life’, successfully collected data from 270,000 users, and also grabbed some public data about their friends. Kogan then collated this alongside other information and gleaned an enormous data set of 50 million users. None of this is a crime; direct mail and marketing companies commonly assemble data on an enormous scale with similar cross-referencing techniques. The issue turned criminal when the academic study was dragged into the territory of politics and business. There it was left to be bastardised.

In a recent interview with The Guardian, Christopher Wylie, a whistleblower from Cambridge Analytica, describes the slow escalation of interest. “Facebook could see it was happening. Their security protocols were triggered because Kogan’s apps were pulling enormous amounts of data, but apparently Kogan told them it was for academic use. So they were like, ‘Fine’.” 

Facebook became increasingly suspicious after another report, again from the Guardian, revealed how US Republican candidate Ted Cruz had used Cambridge Analytica, and the Facebook data, in his campaign. A clear line in Facebook’s agreement had been crossed. According to Wylie, it still took a couple of months for Facebook to take action. They sent him a letter stating that he “was not authorised to share or sell the data and it must be deleted immediately.” Expanding on the letter, Wylie revealed, “literally all I had to do was tick a box and sign it and send it back, and that was it. Facebook made zero effort to get the data back.” Further to this, he admitted there were multiple copies of the data, and that it had been emailed around in unencrypted files.

Let’s just reflect on that. One of the biggest, most powerful, influential companies in tech, used the medium of paper to request an analogue identifier for an unverifiable request to delete a digital asset that they owned, and then asked for it to be returned by snail mail. #WorstEndingEver

For normal folk (regular Facebook data points), and for that matter users of most social media services, the existence of an unwanted photo or an embarrassing statement can be enormously traumatic. One in ten young women in the US have been threatened with the public posting of explicit images, according to Data & Society and the Center for Innovative Public Health Research. For one in 25 people, it has become a reality.

If these victims placed content on Facebook themselves, then they have the means to remove it within the interface of the social media platform. If it was uploaded by someone else - yet still within the borders of a social media company - there are other methods to have it deleted. But beyond that, no one has control. Facebook’s terms and conditions express it like this: “This IP Licence ends when you delete your IP content or your account, unless your content has been shared with others and they have not deleted it.” A statement that reflects Facebook’s own situation.

It is hard to delete data. The benefit we gain from the internet - a system of endless duplicating potential, built with the intention of surviving a nuclear assault - is inseparable from its inability to delete and end. Its memory is absolute, infinite and shareable. The majority of web applications and platforms support and build on these very characteristics. Events like the Cambridge Analytica case show how little we have achieved in designing balance into this system, and in creating methods to neutralise these powers when they are abused.

Sadly, many individuals have been exposed to the horrors of the internet’s bias towards sharing and remembering. This incident will hopefully give social networks a little empathy into what it’s like to have no controllable end to your data and your reputation.

Joe Macleod
Joe Macleod has been working in the mobile design space since 1998 and has been involved in a diverse range of projects. At Nokia he developed some of the most streamlined packaging in the world, created a hack team to disrupt the corporate drone of PowerPoint, produced mobile services for pregnant women in Africa and pioneered lighting behaviour for millions of phones. For the last four years he has been helping to build the design team at ustwo, with over 100 people in London and around 180 globally, and building education initiatives on the back of the IncludeDesign campaign, which launched in 2013. He has been researching Closure Experiences and their impact on industry for over 15 years.
www.mrmacleod.com