The real lessons from The Social Dilemma
On Facebook’s business model, Google’s governance and the GDPR
It will soon be a year since the premiere of the Netflix documentary The Social Dilemma, which drew a record number of viewers and sparked an unprecedented debate about our relationship with digital platforms. At this point, it is worth looking back and asking ourselves: Which issues should we worry about most, and how should we address them? Are the instruments at our disposal enough? What changes can we expect in the near future?
In this blog post, I will take you through three key lessons from The Social Dilemma, how Europe has dealt with them and what we can expect in the future, at least on the eastern side of the Atlantic.
The three faces of a larger problem
None of the interviewees in the docufilm could point to any single problem with (social media) platforms. It would probably be more appropriate to say that we are facing a compound problem.
The first layer of the problem is platforms’ business model. Social media are two-sided platforms: they can afford to offer one side of the market (users) free access to the platform because they charge the other side (advertisers). What do advertisers pay for? They pay for users to view their ads, and for the probability that those views will lead users to buy the advertised product or service. The docufilm summarizes it in a sentence: “if you are not paying for the product, then you are the product”, or more precisely, your attention is the product. This could still sound like a fair deal if it weren’t for social media’s deliberate attempts to manipulate users’ thoughts and behavior to secure that attention.
How does this work in practice? Social media and other digital platforms collect immense amounts of data about users’ online behavior (so-called “tracking”), such as screen time, liked pages and recurring routes, to model their preferences and habits and thus be able to sell highly targeted (i.e. user-specific) advertising space to companies. To ensure the effectiveness of targeted advertising (i.e. the probability that users buy what is advertised), platforms must maximize exposure to ads and provide content that is relevant to users. To this end, companies embed principles of behavioral psychology and cognitive sciences in the design of platform features (e.g. email notifications for every tag in a social media post) and develop algorithms that use tracking data to predict what might interest users and rank the content they see accordingly (e.g. recommended videos and news feeds). As the docufilm puts it, social media “use our data against us”, to hook users to their platform (eventually creating addiction in the most extreme cases) and bombard them with ads (which translates into revenue for them).
The second layer of the problem is platform companies’ governance. Until recently, social media and other platform firms failed to understand, acknowledge, or mitigate the negative effects of the manipulation mechanisms they had put in place. Originally designed for commercial purposes (selling targeted ads), these mechanisms (design features and algorithms) can be purposefully exploited by malicious actors to meddle in elections, spread fake news, and encourage hate speech. They can be used by authoritarian governments to crack down on protests and persecute opponents. And they can psychologically damage individuals, causing addiction, isolation, loss of confidence and depression, sometimes with spillover effects on behavior, such as self-harm and, in the most dramatic cases, suicide.
As former Google employee Tristan Harris lays out in The Social Dilemma, “never before in history have 50 designers, 20- to 35-year-old white guys in California, made decisions that would have an impact on 2 billion people”. Calls by employees like Harris to take responsibility for and address this issue were long ignored or downplayed by companies’ management. In some cases, original developers were already aware of the manipulative power of social media, as pointed out by former Facebook and Pinterest employee Tim Kendall, but they had accepted it as an integral part, if not the very key to the success, of their business model. And with no regulation, safeguards, minimum requirements, impact assessments or independent oversight mechanisms in place, no one, whether individual or authority, could hold them accountable for their way of doing business and its consequences.
The third layer of the problem is users’ lack of awareness. The dynamics described so far were long carried out without users’ knowledge or at least without their informed consent. According to Shoshana Zuboff, Harvard professor and author of The Age of Surveillance Capitalism, it is on this lack of transparency that the “most profitable companies of human history” have built their fortune.
Does the GDPR solve these issues?
The EU General Data Protection Regulation, in force since 2018, empowers internet users through a novel and specific set of rights that give them control over their personal data. It also regulates companies’ use of personal data by establishing the conditions they must meet to lawfully process such information and relevant sanctions to hold them accountable if they fail to respect those conditions. In particular, the GDPR addresses the problems outlined in The Social Dilemma in the following ways:
- With its broad and holistic definitions of “personal data” and “data processing”, the GDPR effectively covers the various types of data treatment that platform companies carry out;
- With its distinction between “processors” and “controllers” and its definition of “third party”, the GDPR also effectively covers the use of data by advertisers on social media and other platforms;
- By tying its territorial scope to the location of the data subject and not of the processor, controller, or third party, the GDPR effectively applies to companies based abroad, including those in the US;
- By demanding “data minimization”, “purpose limitation” and “storage limitation”, the GDPR constrains platforms’ power over users’ data and forces them to be transparent about the use they make of such data;
- By requiring companies to obtain the free, explicit and informed consent of data subjects, and by mandating clear and accessible information and procedures for users to withdraw such consent and exercise their rights, the GDPR forces platforms to notify data subjects about the use they make of their data and to give them the possibility to object. Such rights include access, rectification, erasure, restriction of processing and portability of one’s personal data, the right to object to individual automated decision-making (including profiling) and the rights to judicial remedy and compensation;
- By establishing obligations for data processors and controllers and making them liable for infringements, the GDPR forces companies to take legal responsibility for their misuse of data and to pay fines if they fail to comply with the regulation;
- By establishing national and European data protection authorities and requiring firms to keep records of their processing activities and to notify data breaches to both data subjects and the competent authorities, the GDPR establishes mechanisms to hold platform firms accountable.
A couple of takeaways and a glimpse at the future
To sum up, The Social Dilemma accuses digital platforms, in general, and social media, in particular, of:
- Monetizing user data without users’ knowledge or informed consent
- Manipulating user behavior for commercial purposes
- Lacking transparency and accountability for harmful real-life consequences of platform use
Is the GDPR enough to protect us against these practices? Definitely not. The GDPR does address the problems of transparency, informed consent, and accountability. However, it does not tackle the issues of manipulation or monetization of users’ data. As long as users give their consent for the sake of convenience and marketers find some lawful basis envisioned by the GDPR for data processing, these dynamics will persist. So, should we simply surrender to the idea that this is the way the digital economy works and that we must accept its excesses if we don’t want to be excluded from it? Not necessarily. In the EU, two important new pieces of legislation are being discussed that could have a great impact on digital platforms and our relationship with them: the Digital Services Act and the ePrivacy Regulation. It remains to be seen whether they will succeed in rewriting the rules of the game.