Feature

Cyber Terrorism

Regulating terrorist content online: considerations and trade-offs

Regulatory strategies
The final consideration concerns the range of capabilities and resources, as well as the motivations, of the platforms, file-sharing sites and instant messenger services to counter this issue. The major platforms hire thousands of employees globally to work on safety and security [xx], bring in large advertising revenues [xxi], and claim to invest heavily in an array of artificial intelligence technologies that proactively search for and remove terrorist content [xxii], despite the finding that this content can still be easily found on their sites [xxiii]. This is not typical, however, of smaller file-sharing sites, alternative platforms, and instant messenger services. Such services often have only a handful of employees, are based in just one country, and rely on strategies other than selling advertisements for revenue.

For example, the file-sharing site JustPaste.it does not have the resources and capabilities of the larger platforms. However, it works with Tech Against Terrorism [xxiv] and the Global Internet Forum to Counter Terrorism [xxv] to try to tackle terrorist use of its site [xxvi]. Other platforms and sites are less cooperative, placing a heavy emphasis on free speech and user privacy [xxvii]. As a result of this variation, some regulatory strategies and penalties may be easily implementable for some companies but create struggles and vulnerabilities for others.

Regulatory strategies will therefore have to focus on more than just the major platforms, fast content removal, and content that is typically considered violent or inciting. If regulation focuses on the major platforms, or on platforms with a certain size of user base, it will address only one small part of the online ecosystem used by terrorist and extremist groups. If regulation focuses on violent and inciting content, groups will be able to continue posting non-violating content and URLs signposting followers elsewhere. It may also lead to a whack-a-mole effect of groups migrating to sites that are less censored and more difficult for law enforcement to monitor. If regulation focuses on fast removals, companies with an enormous volume of content and/or very few resources may feel pressured to err on the side of caution and make removal errors. Finally, regulation that focuses on fining non-compliant companies risks the major platforms treating fines as just another business cost, and may be difficult to enforce against companies registered outside the jurisdiction.

Regulation has recently moved in the direction of requiring social media platforms to proactively remove terrorist and extremist content as quickly as possible. However, this article argues that regulation should acknowledge several considerations. First, regulation needs to cover the enormous range of platforms (both major and alternative), file-sharing sites, and instant messenger services used by terrorist and extremist groups, rather than focusing on the major platforms or on platforms with a certain number of users. Second, it needs to address the fact that these platforms and sites are not used homogeneously and may therefore require different strategies from one another.

Third, it needs to consider the differing levels of resources available to the companies, as well as their motivations to comply, and whether penalties such as fines will have any success in incentivising compliance. Finally, regulation should acknowledge that although content removal has benefits, it also involves trade-offs. A one-size-fits-all approach to regulation is therefore unlikely to be effective; having a range of regulatory strategies to draw from should give more flexibility and allow for a more tailored approach depending on the important differing factors that have been raised throughout.

Amy-Louise Watkin is a PhD Candidate at Swansea University and a member of Cyber Threats Research Centre.

Joe Whittaker is a joint PhD Candidate at Swansea University and Leiden University. He is also a research fellow at the International Centre for Counter-Terrorism and a member of the Cyber Threats Research Centre.

www.swansea.ac.uk/law/cytrec/
