New evidence of Facebook harms must lead to greater oversight of Big Tech

The rise of powerful technology companies to the center of our economic and social life over the last two decades has been dramatic. So too have the problems these companies cause when their commercial interests conflict with social welfare, fundamental rights, and democratic integrity. Critical issues of public health, child safety, election integrity, and more are increasingly shaped by the products and practices of a few Big Tech companies.

In response to these challenges, we have played a role, alongside many others from across the political spectrum, in supporting the organizations and individuals seeking to realign the incentives of the technology industry with the values of democracy. Across the world, we have funded research, education, policy analysis, and public campaigns focused on these questions of technology and democracy. Our conclusion is that the public and regulators must hold Big Tech accountable for building products that work for democracy, not against it. As Shoshana Zuboff, academic, author, and advisor to Reset, succinctly states, “The digital must live in democracy’s house.”

Although we have worked on these issues for years, the scale of the internal evidence disclosed by the most recent Facebook whistleblower, Frances Haugen, marks an inflection point and a significant opportunity for bipartisan action. Her courage in coming forward to challenge the self-interested decisions of a trillion-dollar corporation is extraordinary. The documents’ revelations of Facebook’s unwillingness to act are deeply troubling, underscoring the urgent need for regulatory oversight of Big Tech that protects citizens’ rights and the fabric of democracy.

But the public and regulatory debate these disclosures have set in motion is much bigger than one person and one platform. This is about the relationship between technology and democracy in the 21st century, and the lives and welfare of billions of people around the world. What the latest disclosures make clear is that problems associated with the rise of social media – a decline in teen mental health, a rise in cultural and political conflict, the viral spread of conspiracy and disinformation – are well documented and demand accountability. And despite many public claims to the contrary, Facebook, Instagram, and other platforms are well aware of the serious harm their products cause. Facebook’s executives have failed to address these harms effectively because doing so would weaken growth and profitability. There is no going back from these revelations. The actions this evidence demands must be swift, nonpartisan, global, and strictly enforced, and must encompass all Big Tech and social media platforms.

What the evidence shows, most critically, is that the torrents of toxic content on social media are symptoms of a deeper root cause: the product design and business model. These digital services are engineered to maximize engagement, that is, to capture our attention and sell it to advertisers, with little regard for the consequences. The technology is built to manipulate emotions and keep us glued to the screen.

It is this fundamental product design and the engagement-based business model that we must change – not just at Facebook, but across this industry. This means that the answer is not the regulation of one kind of speech or another. The answer is rules and standards for design features and methods of monetization, regardless of the kind of speech a platform carries. We must protect freedom of speech while curtailing the incentives to amplify harmful content.

We must not let this moment pass; we must use this latest evidence to drive action.

  • The evidence and issues raised must be thoroughly investigated by journalists, researchers, legislators, and regulators.
  • We must have public debates in countries around the world about what action should be taken and how to do it. 
  • We must make space to hear from and listen to the communities that have borne a disproportionate share of the harm created by the hate, division, and disinformation these companies have allowed to spread through their products.
  • As we have learnt from Ifeoma Ozoma, Sophie Zhang, Timnit Gebru, Yaël Eisenstat, and now Frances Haugen, we must have greater transparency from the Big Tech companies. Specifically, we need more data about the relationship between tech products and the public interest – not from whistleblowers but mandated from the companies themselves. We need to open the black box of Big Tech in order to learn:
    • More about how these products are designed, how that design works in practice, where public safety standards could be applied to monitor, measure, and reduce risks and harms to vulnerable users, and how internal product decisions are made when commercial and public interests compete.
    • More about how these products rely on endless streams of personal data – gathered through ubiquitous digital surveillance of our daily lives – to profile and target us with content. 
    • The ways in which the largest tech companies use excessive market power to restrict competitors and limit consumer choice. 
  • Based on evidence and democratic dialogue, new rules and standards must be established that put public safety, fundamental rights, and democratic integrity ahead of the Big Tech companies’ commercial interests.

It is in this spirit and towards these goals that we are working. We will continue to create a safe environment for, and to support, all those focused on tackling these digital threats to democracy and on making social media and Big Tech platforms more transparent and accountable. We will directly support efforts that enable a broad public debate about the disclosures and evidence Frances Haugen has brought to light. Collaborating with those who are working on these issues, we want to help ensure that the new evidence of the harms perpetrated by Facebook – along with the evidence of other platform harms – is heard and acted upon.