Online Safety Act comes into effect
The Online Safety Act has come into force, 18 months after receiving royal assent.
As part of the act, Ofcom has published its first illegal harms code of practice and guidance, kicking off “a much wider package of protections”, which will make 2025 “a year of change”, the watchdog has said.
The new guidance marks “a major milestone” in creating a safer online world, introducing 40 new safety measures “explicitly” designed to tackle online grooming and protect children from harm.
Platforms now have three months to assess the risk of their users encountering illegal content, after which they will have to implement safety measures to mitigate those risks.
Ofcom’s chief executive Melanie Dawes said: “For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.
“The safety spotlight is now firmly on tech firms and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.”
From March, children’s profiles and locations will not be visible to others, children will not appear on lists of people users might wish to add, and non-connected accounts will not be able to send them direct messages. These practices are believed to be behind the growing number of cases of online child sexual abuse.
Last week, research from the Edinburgh-based Childlight Global Child Safety Institute showed more than 150,000 children in Scotland had experienced online sexual abuse in the past year.
Under the new measures, tech firms will have to appoint a senior person accountable for their compliance with the duties on illegal content, and will have to make reporting functions easier to access.
Sites will also have to introduce better testing for their algorithms to prevent the spread of harmful content, as well as deploy automated tools known as hash-matching and URL detection, which speed up the identification of child sexual abuse material.
Dawes added: “Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”
The new measures are also intended to protect women and girls by ensuring apps take down non-consensual intimate images, also known as revenge porn, and remove posts by organised criminals coercing women into prostitution.
Sites and apps are also expected to establish a dedicated reporting channel for organisations with fraud expertise, allowing them to flag known scams to platforms in real time so that action can be taken.
Posts generated, shared, or uploaded via accounts operated on behalf of terrorist organisations proscribed by the UK Government will also amount to an offence.