Overview: The Online Safety Bill

The Online Safety Bill is now ready to become law

Last week the Online Safety Bill (OSB) was approved by the House of Lords, passing the last parliamentary stage before becoming law.

The legislation, which aims to make the internet a safer place, has drawn both support and criticism, fuelling a nationwide debate over its effectiveness and suitability.

Holyrood takes an in-depth look at the journey behind the controversial bill.

A long and arduous journey

Plans to moderate online content date back to 2019, when the UK government published the Online Harms White Paper. The document listed several measures, including the establishment of an independent regulator (with the suggestion this would be Ofcom) to ensure social media platforms and tech companies adhered to a “code of practice”. The plans addressed a range of issues, from child sexual abuse content to self-harm material, the latter having come under scrutiny following the death of Molly Russell in 2017.

Companies would have to produce risk assessments and be proactive, rather than reactive, in tackling potentially harmful content. Non-compliance could lead to fines or the website being taken down altogether.

The first draft of the OSB was published in May 2021, at half the length of the final version, a reflection of how rocky the road has been. Like its white paper predecessor, it named Ofcom as the regulator that would ensure social platforms removed “lawful but harmful” content. But this definition garnered substantial criticism, with many asking why harmful content would be legal in the first place.

Another contentious area of the document related to “democratic content”: it said exemptions could be put in place to allow campaign groups to publish content featuring graphic violence if it was intended to raise awareness of an issue.

The draft’s failure to include online scams within its remit was also criticised, given that fraud is one of the most common crimes in the UK.

Overall, the draft proved divisive. It was warmly welcomed by child welfare campaigners but attacked by civil liberties groups, who labelled it a “recipe for censorship”.

Parliamentary hurdles

In March 2022, the bill was finally introduced to parliament. The proposal included various changes, such as the right for users to appeal when posts are taken down on freedom of speech grounds, age restrictions on pornography sites, and an extension of criminal liability to senior managers who destroy evidence or prevent Ofcom from entering business premises, with a prison sentence of up to two years for those who broke the law.

Earlier that month, the UK government had also introduced rules requiring social media companies to prevent paid-for fraudulent adverts from appearing on their platforms.

Speaking on the bill, former digital secretary Nadine Dorries said tech firms had “been left to mark their own homework”, and that not passing the bill would “risk sacrificing the wellbeing and innocence of countless generations of children to the power of unchecked algorithms.”

Concerns were raised about Ofcom’s suitability as a regulator. Andy Burrows, Head of Child Safety Online Policy at the NSPCC, said the regulator needed “an up-to-date understanding of constantly evolving online harms” to protect children.

“That’s why it’s crucial there are mechanisms in place for a funded advocate that can rapidly identify new and emerging safeguarding risks,” he added.

Fast-forward to November 2022, when then culture secretary Michelle Donelan announced another amendment to the controversial bill, scrapping the “lawful but harmful” duties. Instead, firms would have to give users tools to hide harmful content they did not wish to see.

The change caused conflict within her own party, with Dorries telling The House magazine her successor had “been in the job five minutes and does not understand enough about it.”

Similarly, Labour shadow culture secretary Lucy Powell said the change “undermined” the very purpose of the bill, and would “embolden abusers, Covid deniers, hoaxers, who will feel encouraged to thrive online.”

Meanwhile, business secretary Kemi Badenoch welcomed the change, having previously stated the bill was not “fit” to become law as it was not the government’s role to legislate for “hurt feelings”.

Last January, the bill finally made its way to the House of Lords amid heated public criticism from both tech companies and the mental health charity Samaritans, the latter claiming that removing the “legal but harmful” duties would mean losing “a vital opportunity to save lives”.

One of the root causes of division was end-to-end encryption, a security measure that ensures only the intended recipients of private messages can read them. The bill set out that companies would have to use “accredited technology” to scan messages for child sexual abuse material, which led WhatsApp to threaten to leave the UK market.

Other amendment proposals came from former digital minister turned baroness Nicky Morgan, who called for a violence against women and girls code of practice to be included in the bill.

The legislation had its third and final reading on 19 September.

Where are we now?

The bill is now awaiting royal assent to become law.

If platforms do not comply with the legislation, Ofcom can fine them up to £18m or 10 per cent of their global annual revenue, whichever is higher.
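
For illustration, using hypothetical figures: a platform with a global annual revenue of £1bn could face a fine of up to £100m, since 10 per cent of its revenue far exceeds £18m, while a smaller firm with £100m in revenue would face the flat £18m maximum instead.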

After years of back-and-forth, the legislation will attempt to tackle a broad range of matters, including the sale of drugs and weapons, incitement to terrorism, sexual exploitation, hate speech, scams, and revenge porn. It also addresses issues that are lawful yet potentially harmful, such as eating disorder content.

The NSPCC welcomed the bill’s passage, having recently revealed that online child grooming cases have risen by more than 80 per cent in the last five years.

Regarding end-to-end encryption, the government amended the legislation so that platforms would not be required to scan messages until “technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content”.

However, companies have said this does not address their concerns about the law’s interaction with security and privacy, with the president of Signal tweeting: “If the choice came down to being forced to build a backdoor, or leaving, we'd leave.” Others worry this could amount to “de-facto government surveillance”.

There are also doubts over whether the technology needed to implement certain parts of the bill, such as scanning users for age verification purposes, is widely available, and whether its use would be ethical.

According to government figures, the rules will apply to more than 20,000 small businesses, as well as big tech companies like Meta and Snapchat.
