Feature

Extremism and social media: what can be done?

We take a look at how the world of social media has become a platform for terrorists and extremists to influence others, and explore what needs to be done to stop it.

Social media and the ever-growing digital world have had a positive effect in many ways. They can help people reconnect and forge new relationships, as well as helping businesses to grow.

However, they have also contributed to rising loneliness and poor mental health, especially among younger people.

Hannah Rose, an analyst with the Institute for Strategic Dialogue (ISD) thinktank, told The Guardian there had been a “surge in online extremist ecosystems” for several years, which remain “very easy for children to access”, and that “offline vulnerabilities, which kids are more likely to have, can make somebody more prone to adopting extremist views”.

Social media platforms such as Telegram are among the many places where extremists can meet, collate dangerous materials and influence others.

From socials to the streets

News stories of young people and adults found guilty of preparing acts of terrorism or taking part in extremist activities online have become more common over the past 20 years.     

For example, the riots that began in Southport and spread across the UK in the summer of 2024 were spurred on by misinformation spread online.

When false information circulated online about the identity of the attacker, many were quick to take the claims as truth.     

This is clear evidence that information on social media can influence large numbers of people, regardless of whether it is true.

Another example of acts of terrorism being prepared with the help of digital resources comes from Liverpool earlier this year.

Then 20-year-old Jacob Graham was convicted of numerous offences, including preparation of terrorist acts, two counts of dissemination of terrorist publications and four counts of possession of material likely to be useful to a terrorist.

Upon investigation, officers reviewed his media devices and identified that he had collected a huge number of manuals, instructions and publications focused on the manufacture of firearms, ammunition and explosives, some of which were printed out and stored in a folder in his home.

He was also found to have compiled a document entitled ‘Freedom Encyclopaedia’ and shared it with contacts over the internet. This was a manual filled with instructions on how to build weapons, including shotguns, nail bombs and explosives such as black powder (also known as gunpowder) and plastic explosive, along with ignition devices and advice on how perpetrators might evade the police.

Superintendent Andy Meeks of Counter Terrorism Policing North West (CTPNW) said: “Online extremism is a growing threat and this case sadly is a prime example, where a young man from Merseyside has become radicalised online, without ever having left his bedroom.

“He shared extreme content online recklessly and without any regard for who received his instructions or for what purpose. He even went so far as to say he intended this material to be instructional to other terrorists.”

What can be done?

It can seem that social media does more harm than good, but pretending it doesn’t exist is no solution. Instead, we need suitable protections in place so users can stay safe online and avoid dangerous content as much as possible.

One way the UK government has tried to do this is by introducing the Online Safety Act in October 2023. This piece of legislation aims to protect people online, particularly children, by placing new duties on tech firms to manage the content on their platforms.

The Act requires these companies to protect children and young adults from harmful and illegal material, including content that relates to terrorism, depicts coercive or controlling behaviour, or incites hatred and violence.

However, the Act has drawn criticism from many organisations and experts, some arguing that it does not go far enough and others voicing concerns that it limits freedom of speech.

Full Fact, for example, said that the Act “does not address health misinformation, which the Covid-19 pandemic demonstrated could be potentially harmful.”     

Full Fact also said that the Act does not “set out any new provisions to misinformation that happens during ‘information incidents’ when information spreads quickly online, such as during terror attacks or during the August 2024 riots following the Southport murders.”

Much of the pressure falls on the social media platforms themselves to keep their users safe and implement proper moderation.

Following the Southport riots, Adam Hadley, executive director of Tech Against Terrorism, said: “Telegram and X must take immediate action to improve their content moderation and proactively manage their platforms to prevent further spread of violent disinformation. Both platforms bear some moral responsibility for the risk of physical violence resulting from unchecked extremist activity.”
