Another day, another apology from Facebook CEO Mark Zuckerberg.
But Zuckerberg’s “sorrys” can only go so far, because unless he decides to shut his shop (or fundamentally rethink how the platform makes money), the dark heart of the data exploitation scandal is not Cambridge Analytica’s duplicity but Facebook’s own business model.
Because whatever Cambridge Analytica was up to, its process was a carbon copy of the social platform’s: gathering as much data as possible from Facebook users via a diverse range of digital engagements in order to target the same cohort (i.e. Facebook users) with messages and advertisements for its own gain.
To anyone out there who still thinks Facebook is a free service and all users have to do in return is tolerate a smattering of ads in their feed – here is your guide to the real world of Facebook financials.
As we have discussed previously on The Future This Week, Facebook’s entire business model is to micro-target people in its network and to offer that service to third parties for a price.
Here’s how the model works
Cambridge Analytica is a good case study, so let’s look at how it unfolded.
When Cambridge University academic Dr Alex Kogan asked users on Facebook to respond to his questionnaire, the app that facilitated their responses made use of Facebook’s Open Graph API. An API (application programming interface) is an interface that allows app developers to access data from a platform – such as Facebook – and then make use of that data in their own apps.
This Facebook API allowed third-party operators to access a huge range of user data, specifically the data lifted from the ‘About Me’ section of a Facebook user’s profile – a treasure trove of personal information including:
- actions, activities, birthday, check-ins, education history, events
- games activity, groups, hometown, interests, likes, location
- notes, online presence, photo and video tags, photos, questions
- relationship details, relationships, religion, politics
- subscriptions, website and work history.
The kicker in this Facebook Open Graph API was that it allowed access not just to the data of the people who agreed to do the quiz, but also to the data of all of the friends in their network. This was not an oversight on Facebook’s part but part of the terms of service that Facebook allowed at the time. This part of the model has since changed, but we’ll get to that in a moment.
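The mechanics of that friends-of-consenters access can be sketched with a toy model. Everything below is invented for illustration – the real Open Graph API was an HTTP interface returning JSON, not a Python library – but the permission logic is the point: one user’s consent opened up their entire friend network.

```python
# Toy model of the pre-2014 Open Graph permission model.
# All profiles and names are invented for illustration.

profiles = {
    "alice": {"hometown": "Sydney", "interests": ["hiking"], "friends": ["bob", "carol"]},
    "bob":   {"hometown": "Leeds",  "interests": ["chess"],  "friends": ["alice"]},
    "carol": {"hometown": "Austin", "interests": ["film"],   "friends": ["alice"]},
}

def data_visible_to_app(consenting_user: str) -> dict:
    """Return the data a third-party app could scrape when ONE user
    consented: their own profile plus every friend's profile."""
    harvest = {consenting_user: profiles[consenting_user]}
    for friend in profiles[consenting_user]["friends"]:
        harvest[friend] = profiles[friend]  # the friends never consented
    return harvest

print(sorted(data_visible_to_app("alice")))  # ['alice', 'bob', 'carol']
```

Only Alice clicked “agree”, yet the app walks away with Bob’s and Carol’s profiles too.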
So while about 300,000 people agreed to take Dr Kogan’s ‘personality quiz’, he was able to scrape the ‘About Me’ and other Facebook information (such as likes) from an estimated 50 million Facebook users connected to the quiz takers. While Facebook might argue this was OK under its terms of service, it is highly questionable whether any of the users had read or understood what they were agreeing to – and their ‘friends’ certainly had not. But Dr Kogan was not doing anything ‘wrong’ here. This permissive access was part of Facebook’s business model and its incentive structure: third-party app developers (remember Farmville?) got access to the users, and Facebook would take a clip along the way.
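The scale of that amplification is simple arithmetic, using the figures above (real friend networks overlap, so the average friend count per quiz taker would have to be higher still to reach 50 million unique profiles):

```python
quiz_takers = 300_000          # people who consented to the quiz
profiles_scraped = 50_000_000  # estimated profiles ultimately obtained

# Average number of profiles exposed per consenting user, self included.
per_taker = profiles_scraped / quiz_takers
print(round(per_taker))  # 167
```

Each consenting user, on average, unlocked the data of well over a hundred people who never saw the quiz.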
What Dr Kogan did next was ‘unauthorised’ according to Facebook’s terms and conditions – that is, he handed over ‘his’ Facebook data set to Cambridge Analytica, which then allegedly went on to use this information on behalf of Donald Trump’s campaign in the 2016 US Presidential election. So when Facebook admits to failing to protect its users’ data, this is the so-called ‘breach’ it is referring to. Zuckerberg has said (many times now) how sorry he is that Facebook failed to protect its users’ data from unauthorised use by Cambridge Analytica.
Facebook announced it would close this overly generous API in 2014 (with a grace period of 12 months). This week Zuckerberg promised that Facebook will better monitor third parties who use Facebook data and will ban any developers who do not agree to a Facebook audit. Facebook has tried to make any overreach the ‘fault’ of the app developers – thereby positioning itself as the trusted gatekeeper who will rein in any excesses on the part of the app developer community.
Whether Facebook shut down access to the permissive API for moral reasons, or because it realised how valuable the data was and that by sharing it with third party developers it was diminishing its value, we don’t know. What we do know is that the social network has since devised a new and elaborate model that would help Facebook utilise the ‘full value’ of its users’ data, for its own gain.
Over the last four years Facebook has ramped up not only its own data collection efforts but also its data analytics, allowing advertisers, or anyone else prepared to spend the money on its platform, to micro target messages to particular users and user groups.
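Stripped of its scale, micro-targeting is just filtering a profile database on combinations of attributes. A minimal sketch, with entirely invented users and fields – real ad platforms do this over vastly larger, continuously updated profiles:

```python
# Toy illustration of attribute-based micro-targeting.
# Users, locations and interests are invented.

users = [
    {"id": 1, "age": 34, "location": "Ohio",  "interests": {"fishing", "politics"}},
    {"id": 2, "age": 52, "location": "Ohio",  "interests": {"gardening"}},
    {"id": 3, "age": 29, "location": "Texas", "interests": {"politics", "film"}},
]

def audience(users, *, location=None, interest=None):
    """Select the ids of users matching every supplied criterion."""
    out = []
    for u in users:
        if location and u["location"] != location:
            continue
        if interest and interest not in u["interests"]:
            continue
        out.append(u["id"])
    return out

print(audience(users, location="Ohio", interest="politics"))  # [1]
```

The richer the collected profile, the narrower the slice an advertiser can buy – which is precisely what makes the underlying data so valuable.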
So it was at the very time Zuckerberg announced the closing of the Open Graph API that he launched the development of the Facebook Audience Network – “the power of Facebook ads, off Facebook” – which allows advertisers to place Facebook-targeted ads on websites outside Facebook. Added to this is Facebook Login, the feature that lets people use their Facebook credentials to log in to other services.
This ‘convenient’ feature allows Facebook to collect even more data beyond its own platform. So wherever a Facebook user goes online, Facebook (and Facebook’s analytics) follows. All of which increases the amount of data Facebook holds – supercharging the power of its analytics and the targeting of advertisers’ messaging.
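Reduced to a toy event log, that off-platform collection works like this: any page embedding the platform’s login button or analytics code can report the visit back, keyed to the same user identity. The site names and the reporting function here are invented for illustration:

```python
from collections import defaultdict

# Toy sketch of cross-site tracking: every site embedding the
# platform's login button or pixel reports visits back to one profile.

browsing_history = defaultdict(list)

def report_visit(user_id: str, site: str) -> None:
    """Roughly what an embedded login button or pixel does."""
    browsing_history[user_id].append(site)

report_visit("alice", "news-site.example")
report_visit("alice", "shoe-shop.example")
report_visit("alice", "forum.example")

print(browsing_history["alice"])
# ['news-site.example', 'shoe-shop.example', 'forum.example']
```

Three unrelated websites, one consolidated browsing profile – none of it collected on Facebook itself.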
In this way Facebook was the major enabler not only of Cambridge Analytica, which executed its targeted campaign on the platform during the 2016 US Presidential election, but also of the Russian actors who took advantage of the same service at the same time, on the same platform.
To quote from a Wired article on this topic:
“As the somewhat tired saying goes, if you’re getting something for free, you are the product. But Facebook’s misuse of user data goes far beyond this. It’s born of a naivety that there is nothing inherently troubling about carving up and monetising not just our personal data, but our social interactions and our personalities. For the publications and academics that have been closely following Facebook for years, this isn’t news. For everyone else, it is a timely wake-up call.”
Where does this leave us today?
Well, we could still use Facebook while giving it a minimal amount of personal data, and there are a number of ways to do that.
Then there is the possibility of government regulation, a prospect even Zuckerberg has conceded may be needed. Some countries have banned, or are considering banning, Facebook, or limiting the data it can capture.
And if either of these things happened Facebook would likely have to change its business model because the micro-targeting it currently relies on would not work at this (reduced) scale.
However, if Facebook were to flip its model fully to a user-pays service, this raises questions of equity of access: even a small charge would certainly mean many people in developing countries and on lower incomes would lose access to the network and the social and business opportunities it does provide them.
But for Facebook it raises a more existential question: the moment people are expected to pay for the service, they will reflect on their experience and ask, ‘Is it really worth paying money for what I’m getting here?’ For many users that might be the crucial moment where they decide ‘no’ – the final straw where they say, ‘I’m quitting; I’m not going to pay for this kind of experience.’
For Facebook to switch to a user-pays model it would have to actually invest in improving the user experience, which would come at the expense of harvesting advertising dollars.
The potential positive in this story might be that it kick-starts a renewed discussion about what good can come from connecting people. Maybe this is the moment where we collectively step back and continue that earlier conversation about what social media’s role in society should be.