The Role of Social Media in Fueling Racist Riots: A Call for Urgent Regulation

Diana Casey

The recent riots in the United Kingdom, sparked by disinformation and incendiary content on social media, underscore the destructive power that unregulated digital spaces can wield over public order and communal harmony. In an era where social media platforms offer unprecedented connectivity, they have also become breeding grounds for the rapid spread of hate speech and violent propaganda. This op-ed delves into the case of Bobby Shirbon, an 18-year-old whose involvement in the riots was driven by the unfiltered and inflammatory content he encountered online, illustrating the broader implications of social media’s role in exacerbating such violence.

Bobby Shirbon: A Symptom of a Deeper Problem

Bobby Shirbon, a young man from Hartlepool, became entangled in controversy when he joined a mob targeting asylum seekers’ homes during the UK riots. Shirbon, who had just left his birthday party, was arrested after smashing windows and hurling bottles at police officers. His defense? “Everyone else is doing it.” This disturbing statement reflects a larger issue: the normalization of violent behavior on social media, where real-time footage of brutality can foster a sense of collective action and justify illegal conduct.

Shirbon’s involvement in the riots wasn’t an isolated incident but rather a manifestation of how social media can influence human behavior. Alerts on his phone, likely triggered by misinformation and provocative content about events in Southport, drew him away from his party and into the chaos on the streets. This case demonstrates how quickly social media can warp an individual’s perception of reality and lead them down a dangerous path.

The Power of Unregulated Content: Social Media’s Influence on Violence

The unregulated nature of social media platforms has turned them into fertile ground for the spread of violence and hate. Unlike traditional media, which adheres to strict standards of accuracy and responsibility, social media operates with little oversight. This lack of regulation has resulted in a flood of graphic content, often presented without context, that can desensitize viewers and drive them to participate in or endorse violence.

Platforms like X (formerly Twitter), under the ownership of Elon Musk, have exacerbated this issue by promoting features that make it easier to consume such content. Musk’s decision to remove content filters and implement an infinite-scroll feature for videos has led to a surge of disturbing images and clips on users’ timelines. These include videos of gang fights, road rage incidents, and other acts of violence, often accompanied by incendiary captions designed to provoke outrage or fear.

During the UK riots, social media was flooded with violent videos, including an unrelated machete fight in Southend, which were presented in ways that fueled further violence. Musk’s own comments, speculating about the possibility of a “civil war” in the UK, were seen by millions and contributed to the growing sense of instability and division.

Algorithmic Amplification: A Systemic Issue

The algorithms that govern social media platforms play a significant role in amplifying violence and hate. These algorithms prioritize content that generates high engagement, which often means promoting material that elicits strong emotional reactions. Unfortunately, this results in harmful and misleading content taking precedence over more balanced and accurate information.
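The dynamic described above can be sketched in a few lines. The following is a toy illustration only, not any real platform's ranking system: when a feed is ordered purely by predicted engagement, with shares and angry reactions weighted heavily and accuracy absent from the score, an inflammatory rumour outranks a measured report. All names and weights here are invented for illustration.

```python
# Toy illustration of engagement-based ranking (NOT any real platform's
# algorithm; the weights and post data are invented for illustration).
# Posts that provoke strong reactions score higher and are shown first;
# accuracy is not a ranking input at all.

def engagement_score(post):
    # Shares and angry reactions are weighted heavily because they tend
    # to predict further engagement.
    return (post["likes"] * 1.0
            + post["shares"] * 5.0
            + post["angry_reactions"] * 3.0)

feed = [
    {"id": "measured-report", "likes": 120, "shares": 4,  "angry_reactions": 2},
    {"id": "outrage-rumour",  "likes": 40,  "shares": 90, "angry_reactions": 300},
]

ranked = sorted(feed, key=engagement_score, reverse=True)
print([p["id"] for p in ranked])  # the inflammatory post ranks first
```

Even in this caricature, the structural point holds: nothing in the scoring function rewards balance or truth, so the cycle Regehr describes is a predictable output of the design, not an accident.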

Dr. Kaitlyn Regehr, co-author of the “Safer Scrolling” report, notes that social media companies are primarily in the business of capturing attention. Harmful and sensational content is more likely to grab users’ attention than nuanced, factual news, leading to a dangerous cycle of radicalization. According to Regehr, a closer examination of the social media feeds of individuals involved in the UK riots could reveal patterns linking their online consumption to their real-world actions, highlighting the systemic nature of the problem and the urgent need for regulatory intervention.

The Need for Stronger Regulation: Lessons from the UK Riots

In response to the recent riots, the UK government is considering measures to strengthen the Online Safety Act, passed in 2023 but not yet fully in force, which aims to hold tech companies accountable for the spread of illegal and harmful content on their platforms. However, experts like Regehr and Professor Shakuntala Banaji of the London School of Economics argue that the legislation may need to be significantly more robust to address the full scope of the problem.

Banaji’s research emphasizes the global nature of this issue, showing that the spread of violent, context-free videos has contributed to racial violence in countries such as India, Myanmar, and Brazil. She points out that the political context in which such content is packaged often determines its impact. In the UK, the post-Brexit political climate, characterized by rising Islamophobia and anti-immigrant sentiment, has created fertile ground for the kind of violence witnessed during the recent riots.

Banaji advocates for independent regulation of social media platforms and a political discourse that explicitly condemns racism and hate speech. She argues that such an approach is necessary to counteract the power of algorithms that currently amplify toxic content.

Conclusion: The Urgent Need for Action

The UK riots serve as a stark reminder of the dangers posed by unregulated social media. Bobby Shirbon’s story illustrates how quickly individuals can be drawn into violent behavior by the content they encounter online. As social media continues to evolve, it is imperative that governments and regulators take swift action to mitigate the risks associated with these platforms.

The Online Safety Act is a crucial step toward holding tech companies accountable for the content they host, but much more needs to be done. Strengthening the Act to address the specific challenges posed by misinformation, hate speech, and algorithmic amplification is essential. Additionally, political leaders must adopt more responsible and inclusive language that does not implicitly endorse division and hatred.

As we move forward, it is essential to recognize that the technology that connects us also has the potential to divide and destroy us if left unchecked. The events of the past weeks should serve as a wake-up call, underscoring the urgent need for a more regulated and responsible digital world.