How social media is fuelling far-right violence in the UK

In recent days, the UK has been beleaguered by large-scale racist violence. The unrest was catalysed by a riot in Southport on July 30, following a violent attack in the town on July 29 in which several children and adults were stabbed at a Taylor Swift-themed dance class. Tragically, three young girls died of their injuries, while eight other children and two adults were wounded.

Shortly after the attack, disinformation about the perpetrator’s identity began circulating on social media. The suspect has since been identified as Axel Rudakubana, a 17-year-old born in Cardiff to Rwandan parents, but in the immediate aftermath of the incident, unsubstantiated rumours spread online asserting that he was an ‘illegal’ Muslim migrant.

These false claims have acted as a catalyst for far-right extremists to call on people to take to the streets. In total, between 200 and 300 rioters descended on Southport on July 30, hijacking a vigil in memory of the victims – with many travelling to the town from across the country to stoke the disorder further. Since the Southport riot, far-right mobs from Belfast to Birmingham have continued to carry out racist violence. In recent days, perpetrators have set fire to accommodation housing asylum seekers, laid siege to mosques, looted shops, and damaged public buildings and private homes alike. Many people from Muslim and British-Asian backgrounds have also been assaulted.

Disinformation has long been an issue on platforms such as X (formerly known as Twitter), but the problem has grown far worse since Elon Musk’s acquisition of the company. In the wake of the Southport attack, former GB News presenter Laurence Fox wrote on the platform: “Enough of this madness now. We need to permanently remove Islam from Great Britain. Completely and entirely.” Despite inciting violence against British Muslims, this post has not yet been deleted. Andrew Tate, the self-proclaimed misogynist with nearly 10 million followers on X, claimed the attack was carried out by “an illegal migrant [who] arrived on a boat one month ago”. Though patently false, this post has also not been deleted. Dr Marc Owen Jones, an expert in digital authoritarianism, estimates that posts speculating that the attacker was Muslim received 27 million impressions on X.

Joe Mulhall, director of research at the anti-fascism organisation Hope not Hate, lays part of the blame for the recent riots at Musk’s door. “The likes of Tommy Robinson and Andrew Tate have been able to spread misinformation and sow the seeds of division [on X],” he says. “A few years ago, these far-right individuals were banned from X but since the Musk takeover they have been allowed back onto the platform.” On X today, it seems like anything goes: you can post something inflammatory, hateful, racist, or just plain untrue, and Musk will let it slide.

While the billionaire CEO has long expressed his desire to turn the platform into a supposedly nonpartisan “digital town square”, it’s clear he has his own agenda and harbours sympathy for the far right. In recent days, he has harassed Prime Minister Keir Starmer on the platform, repeatedly asking him to extend his concern to “all communities” – a blatant racist dogwhistle – and posted “#TwoTierKeir”, a reference to the baseless conspiracy theory that white far-right rioters are the victims of a ‘two-tier policing’ system that treats them more harshly because of their race. (In reality, the head of Britain’s police chiefs recently described the force as “institutionally racist”.)

It’s not just X – many of these far-right groups have been organising via Telegram, an instant messaging platform co-founded by exiled Russian billionaire brothers Pavel and Nikolai Durov in 2013 in response to growing censorship in Russia. Initially, the platform wasn’t too dissimilar to other messaging apps like WhatsApp, but in recent years it has become more of a social networking site. “As well as communicating one-to-one, users can join groups of up to 200,000 people and create broadcast ‘channels’ that others can follow and leave comments on,” writes Guardian technology editor Alex Hern, adding that the private nature of Telegram limits how closely content on the app can be moderated. “A chat can have 100,000 members, but if it’s a private group, then no one outside will even see infringing content to report it.” In any case, Pavel Durov has previously expressed his belief that “our right for privacy is more important than our fear of bad things happening, like terrorism.”

Mulhall explains that Hope not Hate have been calling for action against Telegram for many years. “Tragically, the lack of action against the platform has allowed it to remain a hub for violent extremists and it is our communities that are paying the price,” he says. “Over the past week we have once again seen how Telegram has been used by extreme figures to spread hate and organise events that have resulted in horrifying violence on the streets of the UK.” He adds that certain Telegram groups have played “a central role” in directing individuals to attack mosques and accommodation housing asylum seekers, and that he has seen death threats issued towards certain politicians circulating on the platform. 

Racism has been an issue in Britain since time immemorial, and this recent spate of violence is also a direct result of the social atomisation triggered by austerity, and of the casual racism and Islamophobia espoused by mainstream journalists and politicians. But social media has evidently made the problem worse by creating a space for racists to spew vile rhetoric, spread disinformation and conspiracy theories, and, crucially, communicate with one another privately.

It’s clear something needs to change – and last week, Starmer stressed that he would be cracking down on the spread of far-right extremism online. “Let me also say to large social media companies and those who run them: violent disorder was clearly whipped up online,” he said. “That is also a crime. It is happening on your premises, and the law must be upheld everywhere.”

The new Online Safety Act requires social media companies to introduce new protections for child safety and makes them responsible for preventing and removing illegal content, but what’s less clear is how companies should tackle misinformation and inflammatory language more broadly. In any case, given that Musk replied “Insane” to Starmer’s comments on the need for social media companies to take more responsibility for the content they publish, it seems depressingly likely that extremism will continue to flourish online until further action is taken and the problem is addressed at the root.
