
UK's controversial online safety bill set to become law

The Online Safety Bill, now passed by Parliament, has stirred criticism over provisions that will require tech companies to monitor encrypted messages.


Four years after it started life as a white paper, the UK government’s controversial Online Safety Bill has finally passed through Parliament and is set to become law in the coming weeks.

The bill aims to keep websites and a range of internet-based services free of illegal and harmful material while defending freedom of expression. It applies to search engines; internet services that host user-generated content, such as social media platforms; online forums; some online games; and sites that publish or display pornographic content.

If companies do not comply with the bill's rules, UK regulator Ofcom could fine them up to £18 million (US$22 million) or 10% of their global annual revenue, whichever is greater.

The government has already been working closely with Ofcom to ensure changes will be implemented as quickly as possible when it becomes law, according to the Department for Science, Innovation and Technology. Ofcom is set to launch its consultation process once the bill receives Royal Assent (the formal process by which the King agrees to make the bill an Act of Parliament), taking a phased approach to bringing the Online Safety Bill into force.

“Our common-sense approach will deliver a better future for British people, by making sure that what is illegal offline is illegal online. It puts protecting children first, enabling us to catch keyboard criminals and crack down on the heinous crimes they seek to commit,” said Michelle Donelan, secretary of state for Science, Innovation and Technology, in comments published after the bill’s passing.

Why is the Online Safety Bill so controversial?

While proposals to keep internet users safe from fraudulent and other potentially harmful content, and to prevent children in particular from accessing damaging material, have been widely welcomed, people across the political spectrum have been less than thrilled about a clause inserted by the government in the summer of 2022. This amendment would force tech companies providing end-to-end encrypted messaging to scan for child sexual abuse material (CSAM) so it can be reported to authorities.

In response, around 70 UK information security and cryptography researchers signed an open letter strongly opposing the bill, raising concerns over its interaction with security and privacy technologies.

Earlier this month, the government tried to sidestep the issue by adding an amendment to the bill stating that companies will not be required to scan encrypted messages until it is “technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content.”

However, experts who campaigned on the issue have said this amounts to the government kicking the can down the road and does not address any of the privacy concerns that stem from legally requiring companies to scan encrypted messages.

In an update posted on X, the social media platform formerly known as Twitter, Meredith Whittaker, president of Signal, wrote: “Signal will never undermine our privacy promises or the encryption they rely on. Our position remains firm: we will continue to do whatever we can to ensure people in the UK can use Signal. But if the choice came down to being forced to build a backdoor, or leaving, we'd leave.”

UK gov't pressures Meta on encryption

This week, the government launched a campaign against Meta’s plan to encrypt messages sent via all the company’s social media platforms, urging the rollout to be paused until a safety plan is put in place to detect child abuse activity within the encrypted messages.

“Meta has failed to provide assurances that they will keep their platforms safe from sickening abusers,” said Suella Braverman, the UK home secretary. “They must develop appropriate safeguards to sit alongside their plans for end-to-end encryption. I have been clear time and time again, I am not willing to compromise on child safety.”

Last month, Meta published a blog stating it was “on track” to make end-to-end encryption the default setting for “one-to-one friends and family chats on Messenger” by the end of 2023.

“The overwhelming majority of Brits already rely on apps that use encryption to keep them safe from hackers, fraudsters and criminals. We don’t think people want us reading their private messages so have spent the last five years developing robust safety measures to prevent, detect and combat abuse while maintaining online security,” a Meta spokesperson said.

The company added that it would be publishing an updated report, setting out a number of safety measures, such as restricting people over 19 from messaging teens who don’t follow them and using technology to identify and take action against malicious behaviour.

“As we roll out end-to-end encryption, we expect to continue providing more reports to law enforcement than our peers due to our industry leading work on keeping people safe,” the spokesperson said.

