International Security Expo: Lessons on the Prevent duty

Ahead of the International Security Expo, Counter Terror Business talks to Erika Brady, from the Handa Centre for the Study of Terrorism and Political Violence (CSTPV) at the University of St Andrews, on the Prevent scheme, monitoring terrorism and understanding radicalisation

You are speaking at this year’s Education Security stream of the International Security Expo on the Prevent scheme. How is Prevent currently embedded in UK universities?

There has been a significant amount of controversy around Prevent, much of which precedes the implementation of the Prevent Duty in 2015. The Prevent Duty itself made it a statutory requirement for various public bodies to contribute to the effort of preventing the radicalisation of vulnerable individuals. The bodies affected by the Duty include educational institutions at all levels. To be honest, if we remove the emotive debate around Prevent, I think most people would not have a fundamental problem with stopping people from becoming terrorists.

The problem, my research tells me, lies in how Prevent, as a counter-radicalisation strategy, was developed and implemented. My research indicates that there has been a clear move by university leadership to abide by the legal requirement, and policies have been written and published. However, there is significant variation in how the Prevent Duty and its policies are understood and received by the various affected groups, such as administrative and academic staff and students. There are also challenges of implementation, and significant pushback has been seen from student bodies across the country, much of it led by the National Union of Students (NUS), which has organised the opposition to the legislation. In particular, the NUS celebrated the outcome of a High Court ruling in July 2017 which found that universities did not have to follow the Prevent Duty guidance issued by the government to assist universities in implementing the Duty. The ruling was based primarily on the clarity of Prevent itself, as opposed to the guidance presented to assist in its implementation. So, there is some legal precedent that the Prevent Duty was ill-conceived and has been open to criticism.

What inspired me to focus on this aspect for the conference was predominantly my twofold experience: first as a PhD candidate, where I am studying CONTEST broadly and am aware of many of the policies, and second as a student, where I have heard very little about the Prevent Duty. While the University of St Andrews has a robust policy in place, I was unaware of any communications in this regard. So, when I looked into it in more detail, I found that, even as recently as August, staff had been informed of Prevent training that was taking place. So it is in effect, but it never seems to be a big talking point among students, as far as I am aware at least. I think it is therefore important to look at it in a little more depth and see if more can be learned about the changes, or lack thereof, over time.

My main issue with the reporting on this, both from media and student bodies, is that it appears to be a knock-on effect from the negative reception of Prevent as a whole. The arguments are not new, but renew issues raised in the past, and much of the ‘evidence’ is anecdotal. I have no doubt that poor procedure has led to misunderstanding and misrepresentation of radicalisation among students. However, it becomes far more difficult to see, at first glance, whether these issues have been remedied. Perhaps at first there were issues, but universities improved their implementation. Or perhaps they didn’t. So, what I am attempting to understand from my research on the Prevent Duty in universities is what policies have been implemented, what procedures are in place for review, how communications are developed (both to and from students and staff), and whether the negativity is warranted, even now, three years on.

Given the growth in ‘home-grown’ terrorism in recent years, and the increase in attacks within the UK since 2015, can we deem the CONTEST strategy to have been successful?

This is a good question, and one that is, perhaps unsurprisingly, not easily answered. It all depends on the data you are looking at. For example, if you include terrorist activity in Northern Ireland, there are indications that it has been consistently high, and to some extent perhaps on the increase over the past few years. The attacks in Northern Ireland don’t often result in mass casualties, however, and so they are less likely to be on people’s radar outside of Northern Ireland. If you take the data for the rest of the UK, though, I would argue that it’s not so much an increasing trend as a series of unconnected events which stand out. In the bigger picture, there were successful attacks in 2005, 2013, 2016 and 2017. However, the killings of Fusilier Lee Rigby (2013) and MP Jo Cox (2016), although horrifying, were single incidents and did not result in mass casualties, so they need to be understood a little differently. In my view, the idea that there is an increasing trend in terror attacks throughout the UK is somewhat misleading.

That being said, I think you can tell a little more about the success or failure of a counter terrorism strategy by what doesn’t happen. By all accounts, several substantial terror plots have been prevented from taking place each year. Over the past few years, more information has been released on these numbers, which gives us some ability to see the extent of the work being undertaken by the security services. So, on some levels, yes, I do believe CONTEST has been successful.

Having said that, CONTEST is one of the biggest and most complex strategies being implemented by the government. It has many moving parts, and it would be naive to assume that all of these parts are working at the same level of effectiveness at all times. Not only are some programmes implemented poorly, but changing political, social and global contexts affect terrorist activities and the responses of the authorities. So, a very cursory look at Pursue, for example, might indicate that it is working as well as can be expected given the complexity of the threat we are facing. Conversely, a very cursory look at Prevent gives us the opposite view. Yet that negative impression is not usually informed by data, but by public perception, historical failings and negative media coverage, all of which perpetuate the notion that Prevent has completely failed. I don’t think this is necessarily the case; rather, it needs to be continually assessed and improved to meet changing needs and to learn from past mistakes.

On the whole, I think CONTEST is working, but new challenges, such as low-tech terrorism tactics, are becoming increasingly difficult to prevent. I think casting CONTEST in a success-failure dichotomy is an over-simplification of a complex problem. Whether CONTEST itself has reduced the threat from terrorism in the UK is an ongoing question, and one I am currently looking to answer through my research.
