Experts have warned that teenagers as young as 13 are coming under suspicion of involvement in terrorism after being exposed to a toxic cocktail of far-right extremism that is easily accessible online.
Insiders describe a “horrible soup of hate” on social media, where children can “pick and mix” terrorist narratives, including those of the Terrorgram network of white supremacist channels on Telegram, which was recently banned in the UK.
Experts have tracked 49 people convicted of terrorist offences since 2016, all but one of them children at the time, and this week Ken McCallum, the director general of MI5, said that “13% of all those being investigated by MI5 for involvement in UK terrorism are under 18”, a threefold increase in three years.
But the growing proportion of children under scrutiny also poses dilemmas: questions have been raised over whether teenagers should be criminalised, and MI5 and experts acknowledge that cases often involve mental health issues or grooming.
Hannah Rose, an analyst at the Institute for Strategic Dialogue (ISD) think tank, said there had been a “rise of online extremist ecosystems” over several years that remain “very easily accessible to children”, and that “offline vulnerabilities, which children are more likely to have, can make someone more likely to adopt extremist views”.
In April, Britain outlawed Terrorgram, a loosely organised online neo-fascist group. Its coordinators helped run three Telegram channels with more than 70,000 subscribers, an ISD study found in August. The channels were purportedly news outlets, one of which claimed to be associated with Steve Bannon’s War Room podcast.
The channels are not overtly linked to Terrorgram, but all three pushed subscribers around the world to join related group chats where, the ISD said, “content supporting mass violence and social collapse abounds”. Researchers consider children to be potentially more susceptible to such violent influence.
Researchers consider some of the online radicalisation to be a hangover from the Covid period, when young people and families were forced to spend more time online. But it has also led to a blurring of concepts and motivations, sometimes making it difficult for researchers to discern what ideology, if any, is involved.
“We are also seeing the incorporation of concepts like extreme online misogyny or school shooter fandoms,” Rose said. “Some of this is aestheticised online with flashy graphics or music, presented like a computer game, and one result is that people without an ideology are motivated to carry out acts of extreme harm.”
Reflecting online culture, Terrorgram is largely a loose content-sharing network. Although US authorities charged two of its alleged leaders, both Americans, in September with 15 counts including hate crimes, soliciting the murder of federal officials and conspiring to support terrorists, the network is expected to evolve and endure.
Cases of child terrorism are largely linked to far-right networks, but there has been a resurgence of Islamist activity. Last month a 15-year-old boy from Nottingham was given a three-year youth referral order for sharing violent videos linked to Islamic State and pledging allegiance to the terrorist group on Telegram.
MI5 previously found itself investigating a 13-year-old boy from Cornwall who became a leader of the British arm of the banned neo-Nazi Feuerkrieg Division, an online group created in 2018 by another 13-year-old boy, from Estonia.
The British boy was given a two-year non-custodial rehabilitation order in 2021, while one of his recruits, 17-year-old Paul Dunleavy, was sentenced in November 2020 to five and a half years for helping to prepare acts of terrorism. Although Dunleavy’s terrorist efforts were described as “inept”, he advised other members, some of whom were convicted of terrorist offences abroad, on how to use firearms.
Adam Hadley, chief executive of Tech Against Terrorism, which works to disrupt extremist groups online, said there had been “significant job cuts” at social media companies, particularly in content moderation and trust and safety teams, which has helped extremist content escape attention and censorship.
Experts say extremist material is most prevalent on X, which has scaled back content moderation since its acquisition by Elon Musk, and on Telegram. Last month Telegram’s boss, Pavel Durov, announced he would look to improve moderation on the network after being arrested in France amid an investigation into the platform’s use for sharing child sexual abuse images and for drug trafficking.
Questions have also been raised about whether terrorism prosecutions are an appropriate way to address cases of child radicalisation. Last month it emerged that MI5 had been monitoring Rhianan Rudd, a teenager who took her own life after being groomed by an American far-right extremist and charged over possessing instructions for making firearms and explosives.
At a pre-inquest hearing, the Security Service said it was not involved in the police decision to charge her with terrorism offences, a prosecution that was later dropped. A lawyer representing Rudd’s family said that “it appears that she was being monitored by MI5 while she was subjected to online exploitation”, and her family has argued that she should have been treated as “a victim and not a terrorist”.
McCallum appeared to reflect those concerns on Tuesday, when he sought to differentiate between types of terrorism cases involving children. “For those planning attacks, a criminal justice outcome is usually needed. But for some vulnerable people, alternative interventions delivered by a wider range of partners may be more effective,” the MI5 chief said.
HBh">Source link