UK authorities will be granted power to order tech companies to redesign their platforms and impose fines if they fail to police child sexual abuse material under new online safety legislation.
The rules will target end-to-end encrypted platforms, where messages can be viewed only by the sender and recipient. Such platforms are under increasing political pressure to give governments and law enforcement access to content, including messages, photos and videos.
The Home Office announced the amendment to the online safety bill on Wednesday, allowing communications regulator Ofcom to fine tech companies £18mn or 10 percent of their annual turnover, whichever is higher, if they do not meet child protection standards that are yet to be defined.
Under the proposals, the regulator could order tech companies to install yet-to-be-developed software into encrypted platforms, or to develop their own technologies to detect inappropriate material.
The move comes as tech companies seek to strike a balance between safeguarding the privacy of their users’ data and protecting vulnerable users, while working with law enforcement and legislators who are unable to view content on encrypted platforms.
Apple has already attempted to introduce scanning software to crack down on harmful images of child sex abuse but was forced to row back after a fierce backlash from privacy campaigners last year.
Meanwhile, Meta, which owns Facebook, Instagram and WhatsApp, has committed to rolling out end-to-end encryption on Facebook Messenger, something the Home Office and charities have already lobbied against in the name of child safety.
In a public submission to the bill committee last month, the company said it had concerns about how Ofcom’s ability to require message scanning for inappropriate material would work. “It is unclear how this would be possible in an encrypted messaging service, and would have significant privacy, security and safety implications for users,” wrote Richard Earley, Meta UK’s public policy manager.
Under the legislation, Ofcom will decide whether platforms are doing enough to prevent, detect and remove explicit material, and whether it is necessary and proportionate to ask platforms to change their products.
“Privacy and security are not mutually exclusive — we need both, and we can have both and that is what this amendment delivers,” home secretary Priti Patel said.
The government has awarded five projects across the UK more than £550,000 to develop technologies to stop the spread of child abuse material, which platforms could be instructed to use in their products in the future.
These include external software that can be integrated into existing encrypted platforms, as well as age verification technology that could be used before consumers access encrypted services.
Figures released by children’s charity the NSPCC on Wednesday suggested that online grooming crimes have jumped more than 80 percent in four years in the UK, averaging around 120 offences a week.
Meta-owned platforms were used in 38 percent of cases where the means of communication was known, and Snapchat in 33 percent.