A digital tightrope: will the Online Safety Act deliver on its promises?
In today’s online era, getting a mobile phone has become a rite of passage.
But the relationship between children and technology has been linked to a string of problems, from a rise in violence and poor mental health to learning outcomes hitting record lows.
Almost seven years have passed since Molly Russell, 14, died from an act of self-harm after suffering the negative effects of online content. Her death sent shockwaves across the UK and served as a warning that things had to change. Unfortunately, since then many children have continued to suffer the worst consequences of the dangers posed by technology. In 2022, Abbie Jarvis, 12, was beaten unconscious in a park in Glasgow as her peers filmed it. And last December, Murray Dowey, 16, from Dunblane, took his own life after falling victim to sextortion.
With two months left until the Online Safety Act comes into force, it seems the act could punch well below its weight. Passed in 2023, the legislation aims to make the UK “the safest place in the world for our children to go online” by requiring social media firms to put measures in place to prevent youngsters from accessing harmful content.
Since it received royal assent, campaigners have been warning the act does not go far enough. In August the Molly Rose Foundation (MRF) urged the UK Government to “finish the job” after finding that of the more than 12 million content moderation decisions made by six of the biggest platforms, more than 95 per cent were carried out by two sites – Pinterest and TikTok.
And it is not only research that points to the act’s weaknesses; politicians have raised them too. Earlier this month, newly elected Labour MP Josh MacAlister introduced the Safer Phones Bill which, if passed, would force social media firms to exclude those under 16 from their algorithms, suggesting its predecessor may not be fit for purpose.
However, Ofcom is keen to see the act work. Speaking to Holyrood, Stefan Webster, regulatory affairs manager at Ofcom Scotland, says 2025 will mark a “step change” in making social media safer for users.
Last month, the regulator published a timeline outlining how it will ensure companies comply with the act once it comes into force in December.
Webster admits it is a “constant challenge” to regulate tech firms yet adds that Ofcom has hired several experts who know “firsthand” what is being developed to tackle this issue.
He continues: “The online world is not a safe place for children at the moment. And what the Online Safety Act does is to address that and ensure the platforms are far safer in terms of the content they're serving children, whether they're being accessed by a smartphone, a tablet or a games console.
“So, what we're talking about is targeting the root cause of harm rather than the devices that people use to access it.”
However, for Jimmy Paul, head of the Scottish Violence Reduction Unit (VRU), legislative change cannot come quickly enough.
“The use of social media changes every few months. Justin Trudeau once said, ‘change has never been this fast, but it will never be this slow again’, and that's certainly true when it comes to how social media is evolving.”
In 2023, the Scottish Government’s Behaviour in Scottish Schools research paper revealed an increase in the problematic use of social media and in the number of pupils using their mobile phones “abusively”. And in September, research by the World Health Organization showed Scottish teenagers aged 13 and 15 reported higher rates of intense social media use compared to those in England and Wales.
Paul adds: “There's action we can take now to support young people.”
In September, the VRU, alongside YouthLink and Medics Against Violence, launched the Quit Fighting for Likes campaign, an online toolkit which aims to get children to think about attitudes around the filming and sharing of violent incidents.
While speaking to six schools during the creation of the campaign, Paul came across kids who had found violent videos “deeply traumatic” yet who admitted that sharing such content had become “incredibly normal”.
“I can't imagine what it's like growing up as a child right now with social media at your fingertips and not being equipped with the knowledge that what you put out there really is there forever,” he says.
Social media platforms often operate with ‘black box’ algorithms – systems that determine what content is pushed to users without revealing their inner workings. These algorithms tend to prioritise engaging content, including harmful material inappropriate for children.
In 2022, the Centre for Countering Digital Hate set up a number of accounts across the US, UK, Canada, and Australia at the minimum age TikTok allows, which is 13 years old. It found that TikTok recommended suicide content in less than three minutes, and recommended videos about body image and mental health every 39 seconds.
Unsurprisingly, the addictive nature of these algorithms has had a ripple effect across children’s learning abilities and mental health, leading to a wave of restrictions on children’s mobile phone use worldwide.
In 2018, France introduced a total ban on phone use in schools and, in February, England followed suit, publishing guidance that schools should prohibit the use of phones on their premises.
North of the border, a similar movement was pushed by Adam Csenki, a music teacher at a school in Moray. Tired of asking children to “look up”, in May he lodged a petition with the Scottish Parliament, which remains open, urging the Scottish Government to prohibit phone use in schools.
The petition follows research showing phones are severely impacting Scottish education. Earlier this year, the Scottish Secondary Teachers’ Association annual survey found more than 90 per cent of its members had had to interrupt lessons to ask students to put their mobile phones away, and nine in 10 reported pupils having detachment problems due to phone misuse. These findings echoed concerns raised by Scotland’s Pisa scores in December, which fell to their lowest ever level.
The Scottish Government’s recent guidance “stopped short” of replicating England’s approach and instead “empowered” headteachers to make the decision to ban phones themselves as they “know better than anyone the specific approach which will work best in their school”, education secretary Jenny Gilruth said.
“Schools cannot be the panacea to address all social problems,” Anne Keenan, assistant secretary at the Educational Institute of Scotland, tells Holyrood. “Mobile phones have been used more frequently in schools because there's been a paucity of resources and a lack of investment in the infrastructure in schools. We often hear from members that they have to ask students to use their phones because there's insufficient Wi-Fi within the schools or not enough devices to go around. So, they're left in a position where that's the only course of action to support digital learning.
“They really shouldn't be in that position. The only time that we should be seeing mobile phone use in schools would be where it's underpinned by a strong educational rationale.”
However, technology has become omnipresent, and adopting an overly restrictive approach to it could be “as detrimental” in the long term for kids, Emily Beever, senior development officer at YouthLink, warns Holyrood.
“If you banned young people from using any social media up until they were 14, then when they either leave school or they get access to these technologies, they're basically at square one because they don't know how to use it, or how to get help.”
An overly cautious approach to technology use at home can also negatively impact children's health. The internet often serves as a hub of support, helping almost half of those aged eight to 17 navigate difficult times, according to a survey commissioned by the UK Safer Internet Centre.
“There is a generational gulf that exists between parents' understanding of technology and the lives that their children are living,” says Joanne Wilson, public affairs manager for the National Society for the Prevention of Cruelty to Children (NSPCC) in Scotland.
“Children don’t differentiate between the on and the offline world. They congregate in the online world in a way that their parents haven't really experienced.”
For the last few years, the NSPCC has been working with network provider O2 to deliver upskilling sessions to parents, yet she says they are “completely overwhelmed by the demand for those services all across the country”.
She continues: “A parent might think that they're doing the right thing by cutting off the source of the problem, but actually for children now, connecting online can be a place of comfort and the removal of the device can be quite terrifying and may further exacerbate their mental health difficulties. So, it's really complicated but it requires us to listen closely to children's experiences of navigating the online world.”
The Scottish Government placed online safety at the heart of the technology chapter within its Curriculum for Excellence. Yet its national action plan on internet safety for children and young people dates back to 2017 and, three first ministers, three children’s ministers and three education ministers later, the strategy still hasn’t been refreshed. Progress is not quick enough.