
Facebook's algorithms are hungry for Trump, but is he to blame for violent extremism online?

Authors: Joshua Sallos-Carter and Lauren Salim, Montreal Institute for Genocide and Human Rights Studies (MIGS)


February 13, 2023


[Image credit: Getty Images]

Two years after his ban from Facebook, former president Donald Trump is returning from the fringes of cyberspace, having been permitted back on the platform. In a climate of growing far-right populism, what are the implications for free speech and democracy?


The January 6 Committee appointed to investigate the causes of the riot found Trump to be directly responsible. The former president, who faced impeachment twice during his term, used digital platforms like Facebook and Twitter, as well as in-person rallies, to spread false information as "alternative media" and to deploy manipulative rhetoric inciting supporters to violence. On the morning of the Capitol riot, Trump told supporters the election was "stolen", that "we will never take back our country with weakness", and, as the crowd prepared to march on the Capitol Building, warned, "if you don't fight like hell, you're not going to have a country anymore".


Facebook and other digital platforms found Trump guilty of terms-of-service violations relating to incitement of violence; however, they have recently decided to lift the ban ahead of the 2024 U.S. election season. When the decision was made to remove Trump, Facebook placed the blame for inciting the Capitol riot squarely on him, but this raises the question: how innocent are the platform owners themselves? And what role should private entities managing so-called "digital town squares" play in directing our collective social lives?


On October 5, 2021, former Facebook product manager and whistleblower Frances Haugen testified before the U.S. Senate that "machine-learning models (used by Facebook) that maximize engagement also favor controversy, misinformation, and extremism," and that "64% of all extremist group joins are due to (Facebook's) recommendation tools". In other words, Facebook knowingly profits from digital extremism. The frightening implication is that Trump is not the cause of rising extremism and social stratification in the United States, but rather a symptom of systemic for-profit business interests.


Canadian Heritage Minister Pablo Rodriguez is right when he says, "There's a consensus on the fact that we have to do something." Policies instituting platform self-policing have given legislators some bite in dealing with extremist content on the largest platforms, like the recent ultimatum levied against Twitter by the EU, but these types of solutions don't go far enough in addressing the root of the problem: there will always be a new Twitter and a new Trump. To quote author and activist Kimberly Jones, "as long as we're focusing on the what, we're not focusing on the why".


A 2021 study in the Cambridge Journal of Regions, Economy and Society suggests a connection between spatial socioeconomic features and behaviours defined as "cyberhate": communities with greater income stratification and other features of social inequality are more likely to be associated with harmful extremist behaviours. In addition, a 2021 research paper in Social Science Computer Review found that participants in online extremist behaviours are more likely to support populist leaders, like Donald Trump, and to subscribe to "alternative media".


We are in the midst of an unprecedented housing crisis and an opioid epidemic, and several communities in Canada still lack access to clean drinking water. A 2014 economic analysis of 300 years of data reveals exponentially increasing wealth stratification, which means increased poverty, growing social unrest, rising levels of extremism, and new waves of populist leaders finding a platform to incite violence.


On this International Day for the Prevention of Violent Extremism as and when Conducive to Terrorism, let's remember, in the words of UN Secretary-General António Guterres, the importance of addressing the underlying conditions that cause young men and women to be lured by terrorism. Let's work together to address the root causes of poverty, social inequality, and rising extremism. There can be no free speech or democracy without first finding some way to bridge the growing political and socio-economic chasm that enables radicalism in the first place. Acknowledging that this process will take time, we must in the interim ensure that platform owners are held accountable for their role in creating a digital environment conducive to the spread of extremist ideologies, and advocate for greater transparency into how algorithms function so that citizens are aware the content they see is warped to show them increasingly polarizing information.



This article is part of the Digital Peace Project, organized by the Montreal Institute for Genocide and Human Rights Studies (MIGS) thanks to funding from the Department of Canadian Heritage. Joshua Sallos-Carter is a student fellow at MIGS focusing on hate speech and emerging tech. Lauren Salim is a project leader at the institute.


