by Ruaraidh Gilmour
09 May 2024
Harmful content: Can we really keep our children safe online?


Six years ago, Aoife downloaded the social media app Yubo on the recommendation of her neighbour. The platform champions “the true essence of being young” by allowing users to chat with others, play games and live stream. 

Around 14 or 15 years old at the time, Aoife began speaking to a man she believed to be her age. As the conversation continued, he suggested they should keep talking away from Yubo, on another social media app. 

She describes herself as “naive at the time”, and she agreed to download the app. At that point, he began asking for sexual images of her. Believing he was who he said he was, Aoife sent them. 
He then revealed to her that he was much older than he had previously said.

“He gave the impression that he was unreachable. He told me that he had a degree in cybersecurity and said if the police were contacted, they would not be able to catch him,” Aoife tells Holyrood.

“He threatened that if I didn’t complete his demands then the pictures that I had sent would be sent to my friends and family on Facebook – he had found my Facebook page.”  

Aoife recalls how scared she was. She had been told of the dangers online at school but said “you never think it’s actually going to happen to you”.  

“The thing I remember about that night was it was like a fight to get him [the abuser] to let me sleep. He didn’t believe that I had school the next day and he was very much under the impression that he had to give me permission to go to sleep.”  

Aoife says it is “the worst thing that has ever happened” in her life and she was made to feel like she was “the only person in the world”, despite her mum being just down the corridor as she was being threatened. In that moment, she says, it felt like it didn’t matter who she told.  

“I felt like they either wouldn’t believe me, or they wouldn’t understand, I felt so alone and isolated in that moment.”  

She knew she was in a bad situation, one she says she wasn’t sure how she was going to get out of. It was then that she remembered vital information she had been told at primary school – the Child Exploitation and Online Protection Command. 

“It was a button that you press which helped me report what had happened to my school. 

“It was two or three o’clock in the morning, he was still messaging me, but I had already put in all my details. The school then contacted my dad, and he then contacted the police, which I am very thankful for.”

Police Scotland kept her phone over the summer as they tried to catch the abuser, but ultimately, they were unsuccessful in tracking him down and gave the phone back. 

A year later, the National Crime Agency (NCA) showed up on Aoife’s doorstep and told her and her family that this man’s online sexual abuse “was a lot bigger than we thought”.  

Jordan Croft had been targeting girls as young as 12 online and blackmailing them into “sexual slavery”. The paedophile from Worthing, West Sussex, admitted 65 offences relating to 26 girls and women aged between 12 and 22. Police also found he was in possession of 900 indecent images of children. 

He was jailed for 18 years in November 2022. 

Worryingly, this kind of story is becoming more common. In 2021, analysts at the Internet Watch Foundation (IWF) investigated 361,000 reports of online child sexual abuse and took action against 252,000 URLs which contained images or videos of children being raped and suffering sexual abuse.  

The IWF said: “[Covid] lockdowns saw younger and younger children being targeted on an industrial scale by internet groomers.” The scale of its findings is disturbing, yet it is generally accepted that there is much more that goes unidentified. 

Recent research undertaken by the National Society for the Prevention of Cruelty to Children, covering the period between 2017 and 2023 and ending just before the implementation of the UK Government’s Online Safety Act, provides an up-to-date picture of the dangers facing children online.  

It found that at least one in 20 children has experienced online sexual risks or harm. The research also shows that children are more likely to be exposed to content risks than most types of sexual risk. However, children who encounter sexual risks are less likely to tell someone about their experience. 

Tragically, in December last year, a 16-year-old boy took his life just hours after he was targeted by criminals involved in financially motivated sexual extortion – commonly referred to as “sextortion”. Murray Dowey, from Dunblane, believed he was speaking to a young girl, but he was speaking with a scammer who convinced him to send intimate images that they then used to blackmail him. 

Sextortion is a crime on the rise. Often carried out by criminals for financial exploitation and coercion, it made up two-thirds of reports to the Revenge Porn Helpline in 2023, with reports doubling from the previous year. 

And while reporting has increased sharply, it is still not easy for children. Annabel Turner, director of CyberSafe Scotland, a social enterprise that helps protect Scottish children from online exploitation, says that while tech companies need to do “massively more” to safeguard children on their platforms, as well as communicate better in their “response and with law enforcement’s response”, we must not “underestimate the importance of the community response”. 

“Research shows that in 70 per cent of cases, children are more likely to report abuse to a teacher than to a parent or a caregiver. So, it is incredibly important that we continue to build the capacity in our education system to respond to this,” she tells Holyrood.

“That must be done carefully because the biggest challenge to child protection and rights online is the enormous gap between children and adults in the digital space. We cannot advocate for children unless we understand what they are experiencing, and the algorithms do an incredible job of further separating children and adults in the digital space. You can have a parent and their child who are both on TikTok and they are seeing exponentially different material.” 

She continues: “We have to encourage parents to have these conversations about sexual abuse online. They will not be comfortable, but we must be comfortable with the level of discomfort, and that must be handled carefully. 

“It can be really helpful to talk to your children through another child’s story. Being able to talk to your child about Murray [Dowey] and what happened to him is incredibly important.” 

Deborah Fry, director of data at Childlight, an organisation based at the University of Edinburgh that shines a light on global child sexual exploitation and abuse, says the “power” that sextortionists have comes from children being afraid that their images will be shared with others “after being groomed into these situations”. 

She talks in a similar way to Turner about the importance of community and education but adds that to further combat against the online abusers “we must disrupt their financial flows” and “attack the source of the problem – the people perpetrating it”.  

Children are trafficked from all areas of the world, but in the last decade, it has come to light that people in Scotland have played a role in sexual abuse remotely, from the comfort of their own homes.  

Matthew Bell, a convicted paedophile from Ayrshire, offered to pay accomplices in the Philippines to sexually abuse youngsters as he watched live on Skype. On one occasion, he paid 93 pence to see a young girl being abused. 

The NCA was first warned about the activities of Bell in September 2016, but he was not arrested until March 2018. Court papers show that he continued to pay to watch the abuse of Filipino children until at least April 2017. 

The NCA said there was initially not enough evidence to arrest him but that it “acted swiftly” when more evidence emerged. However, while that evidence was being gathered, he was able to continue contributing to the sexual abuse of children. 

Fry believes that the data sharing between parties must improve to prevent cyber-abusers from remaining off the radar of law enforcement. 

“We need to share data better across financial institutions, tech platforms, and law enforcement because they [abusers] will move from platform to platform, having conversations across platforms to try to hide their tracks,” she says. 

“Sharing data will help better protect children, disrupt networks, single out individuals, and hold them to account.”  

Both Fry and Turner believe the tech companies that run social media platforms have a duty to better safeguard children who use their products.  

Turner is frustrated with the lack of accountability from tech companies, particularly as children and young people are “spending a lot of money on these products”. 

“They aren’t being protected,” she says. “Take sextortion as an example, there is so much more that these companies could be doing. Algorithmically, children are being connected to adult accounts where the platform must already be aware that those adults represent a risk to those children.  

“Why are girls waking up to so many friend requests from adult men? You can show them how to turn off quick-add functions and things like that, but when you are looking at the scale of children waking up to all these requests, then it’s clear you’ve got a problem with the platform.” 

Fry says in many instances the sexual exploitation of children online is “being swept under the carpet” and we are seeing very little in the way of “proactive searching for child sexual abuse materials” on these platforms. 

“They are kind of washing their hands with it. It’s a case of ‘if we don’t see it, we don’t have to deal with it.’ That puts children continually at risk and gives offenders impunity. And we know that offenders having impunity is a driver of child sexual exploitation and abuse.  

“It has been unhelpfully framed as a privacy versus child protection issue – you can have both.” 

She adds: “I think a big part of this aversion is if you understand what is happening on your platform you may be liable, so I think there is a risk-management element.” 

With the introduction of the Online Safety Act across the UK in October last year, the UK Government has demonstrated an appetite to hold social media platforms to account. Among the measures aimed at keeping children safer online is one under which tech executives whose platforms persistently fail to protect children from online harm can face criminal charges, and companies can be fined up to £18m or 10 per cent of their annual global turnover, whichever is greater. 

But despite these tentative positive steps in legislating to make the internet safer for children, Alex Davies-Jones, Labour’s shadow minister for technology and the digital economy, argued that the bill makes “no effort” to future-proof or anticipate emerging harms as new technologies develop. 

One such emerging technology is artificial intelligence (AI). Already it is being used to generate child sexual abuse material at an alarming rate, in some cases from a single picture of a child’s face.  

The IWF found that in just one dark-web forum, over one month, 11,108 AI-generated images had been shared, and it was able to confirm that 2,978 of these depicted AI-generated child sexual abuse material. 

There are clear attempts by the Scottish Government to keep on top of new and emerging trends in online child sexual abuse and exploitation. It’s part of Police Scotland’s Multi-Agency Group on Preventing Online Sexual Abuse, which helps to recognise and respond effectively to new trends, such as the financially motivated sexual extortion of young people. 

But with rapidly emerging technologies like AI, Fry paints a worrying picture. She tells Holyrood: “We have seen this really take off in the last year and we are seeing a big maturation of that content within the timeframe. 

“At first you could tell they were AI-generated, but now we are seeing very sophisticated content where it is virtually impossible to distinguish what is a real image. 

“And there are massive concerns about the speed and ease with which these images are being generated, and there is real worry about what this could mean in terms of flooding law enforcement or creating blockages and delays for identifying victims from child sexual abuse materials.” 
