UK Parliamentary Committee Says the Regulator Is Unlikely to Meet the 2025 Deadline
The U.K. telecom regulator faces "significant challenges" in implementing the newly passed Online Safety Act, which is intended to protect children from online harm, a parliamentary committee said.
Analysis by the House of Commons Committee of Public Accounts found that the Office of Communications – Ofcom – faces challenges that could push full implementation of the law's rules at least a year past the proposed 2025 enforcement deadline.
The regulation imposes a duty of care on online platforms to shield young users from pornographic or self-harm content and creates the potential for criminal prosecution of those who send harmful or threatening communications.
The regulation empowers the regulator to order online intermediaries, including chat apps such as WhatsApp and search engines such as Google, to identify outlawed content. It became law last year despite criticism of its potential adverse impact on privacy (see: Tech Companies on Precipice of UK Online Safety Bill).
In Tuesday's report, the parliamentary committee said that Ofcom lacks clarity on how it will handle data from the nearly 100,000 service providers that fall under the scope of the regulation.
Although the regulator is tasked with developing an automated compliance monitoring system, the committee said the agency has not yet finalized the mechanism, hampering its ability to address individual complaints given their sheer volume.
“Ofcom faces significant challenges about how it will engage with, supervise and regulate providers based overseas,” the report said. “There is a risk that public confidence in the regime will be undermined if it does not quickly bring about tangible changes to people’s online experience.”
Privacy groups, as well as companies including WhatsApp and Apple, have long argued against the law's provisions requiring service providers to perform content scanning to identify harmful material (see: UK Online Safety Bill Harms Privacy & Security, Experts Say).
Because most instant messaging apps use end-to-end encryption, the companies argued that content scanning would require them to weaken existing encryption, putting their customers at increased risk of mass surveillance and hacking.
Pam Cowburn, head of communication and campaigns at Open Rights Group, said the committee report shows the Online Safety Act is an “overblown legislative mess.” Lawmakers rushed to finalize the regulation last year without adequate scrutiny, she said.
“There was very little detail of how wide-ranging powers would be enacted in practice. Parliamentarians have given Ofcom an almost impossible task in asking them to implement this bad piece of legislation,” Cowburn said.
Although British lawmakers amended the scope of the regulation to allow only the use of scanning tools that meet "minimum standards of accuracy," Cowburn said privacy remains an issue, since online intermediaries will still have to carry out client-side content scanning.
Ofcom did not immediately respond to a request for comment.