
Amnesty International Blames X’s Algorithm for Amplifying Hate and Fueling UK Riots After Southport Attack

Human Rights Research Center

August 8, 2025


HRRC strongly condemns X for amplifying hate and misinformation that fueled anti-Muslim and anti-migrant violence after the Southport attack. We urge immediate reforms to its algorithm and safety policies to prevent further human rights abuses.

Riots swept across the UK last summer after the Southport stabbings [Image credit: PA Wire via The Independent]

Amnesty International has said that Elon Musk's social media platform X, formerly known as Twitter, was instrumental in spreading the hate speech and false information that fueled violent anti-Muslim and anti-migrant riots across the United Kingdom after the Southport stabbings of July 29, 2024. The riots erupted after 17-year-old Axel Rudakubana killed three young girls and injured ten others at a Taylor Swift-themed dance class. Far-right influencers quickly circulated false claims connecting him to Islam and immigration, even after police confirmed that he was born in Cardiff to Christian parents and had no ideological motivation.


Based on an examination of X's open-source recommender algorithm, Amnesty International's study concluded that the platform "systematically prioritizes" content that provokes anger, without adequate safeguards to prevent harm. Because of this engagement-first design, provocative posts went viral before fact-checked information could catch up, even when they were false. Due to inherent amplification biases, posts from premium verified accounts, including those of well-known figures such as Andrew Tate and Tommy Robinson, dominated timelines regardless of their accuracy. In the two weeks following the attack, Robinson's posts alone received over 580 million impressions.
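To illustrate the dynamic Amnesty describes, the sketch below shows a minimal, hypothetical "engagement-first" ranking rule. It is not X's actual recommender code; the feature names, weights, and the premium-account boost are assumptions chosen only to show how scoring posts by engagement alone can let a false but provocative post outrank a later correction.

```python
# Hypothetical illustration of an "engagement-first" ranking rule.
# NOT X's actual recommender code; the features, weights, and premium
# boost below are assumptions made purely for illustration.

from dataclasses import dataclass


@dataclass
class Post:
    text: str
    replies: int        # engagement signals
    reposts: int
    likes: int
    is_premium: bool    # premium (paid) verified account
    fact_checked: bool  # whether a correction or fact-check exists


def engagement_score(post: Post) -> float:
    """Score a post purely by engagement, using assumed weights."""
    score = 1.0 * post.likes + 5.0 * post.reposts + 10.0 * post.replies
    if post.is_premium:
        score *= 2.0  # assumed boost for premium verified accounts
    # Accuracy plays no role in the score, so a false but provocative
    # post can outrank a slower, fact-checked correction.
    return score


posts = [
    Post("False, inflammatory claim", replies=800, reposts=1200,
         likes=5000, is_premium=True, fact_checked=False),
    Post("Police statement correcting the claim", replies=40, reposts=150,
         likes=900, is_premium=False, fact_checked=True),
]

# Rank a "For You"-style feed by engagement alone.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>10.0f}  {post.text}")
```

Under these assumed weights, the false post scores roughly 38,000 against about 2,000 for the correction, which is the ordering problem the report attributes to designs that optimize for reaction rather than accuracy.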


According to the NGO, Musk's 2022 takeover dismantled important safety measures through mass layoffs of moderation staff, the firing of trust and safety engineers, and the restoration of previously banned accounts. This was followed by a recorded rise in hate speech, which remains roughly 50% higher than it was before Musk's ownership. Amnesty International emphasized that X's algorithmic approach, especially the "For You" feed, gives more weight to posts likely to provoke a response, causing discriminatory and polarizing content to surface sooner than verified information.


Amnesty correlated this design with real-world harm, as false information fueled weeks of unrest characterized by arson, attacks on asylum seekers, and vandalism of mosques. Citing recent protests triggered by false online rumors about refugee transfers to a London hotel, the group compared X's role under Musk to "petrol on the fire" of racial violence and cautioned that comparable risks still exist today.


In response, X stated that it uses human review, machine learning, and its crowdsourced fact-checking feature, Community Notes, to address harmful or misleading posts before they affect safety. Amnesty has urged the establishment of a reparations fund for communities affected by algorithm-amplified hate, more robust measures in the UK Online Safety Act, and immediate changes to X's algorithmic design. Regulators such as Ofcom have also cautioned platforms to implement crisis protocols to prevent the spread of harmful content during periods of heightened tension. Amnesty warned that, unless structural changes are made, X under Musk will continue to present "serious human rights risks" during times of increased societal unrest.


Glossary 


  • Adequate – Enough or satisfactory for a particular purpose

  • Algorithmic design – The way a computer program’s rules and steps are planned to decide what content is shown

  • Amplification bias – A built-in preference in a system that makes certain content spread more widely than other content

  • Amplifying – Making something stronger, louder, or more widespread

  • Arson – The crime of deliberately setting fire to property

  • Asylum seekers – People who have left their country and applied for protection in another country

  • Cautioned – Warned someone about possible danger or problems

  • Correlated – When two or more things are connected or related

  • Crisis protocols – Pre-planned steps to follow in an emergency

  • Crowdsourced – Information or work gathered from a large number of people, usually online

  • Discriminatory – Treating someone unfairly because of their race, religion, gender, or other traits

  • Far-right – A political group or view that is extremely conservative or nationalist

  • Ideological – Based on an ideology, a set of ideas and beliefs that shapes how a person or group thinks and acts

  • Layoffs – Ending someone’s job because the employer cannot afford to keep them or no longer needs the work done

  • Machine learning – A type of computer system that improves its performance by learning from data

  • Misinformation – Incorrect or misleading information

  • Mosques – Buildings where Muslims gather for worship

  • Open-source – Code that is designed to be publicly accessible; anyone can see, modify, and distribute it

  • Petrol on the fire – Making a situation or problem worse than it already is

  • Polarizing – Causing people to have completely opposite opinions

  • Provocative – Causing anger or another strong reaction, especially deliberately

  • Racial violence – Physical attacks or harm done to someone because of their race

  • Refugee – A person forced to leave their country due to war, persecution, or disaster

  • Reparations – Payments or actions to make up for harm or damage done

  • Robust measures – Strong and effective actions or plans

  • Rumors – Unverified pieces of information passed from person to person

  • Stabbing – An attack where someone is injured with a knife or sharp object

  • Trust and Safety Engineers (T&S) – Specialists who work to mitigate risks and ensure user safety on online platforms

  • Vandalism – The act of deliberately damaging property

