The UK government and online harms regulator Ofcom disagree about whether misinformation is covered by the UK's Online Safety Act (OSA).
On 29 April 2025, the Commons Science, Innovation and Technology Committee (SITC) questioned the UK's online harms and data regulators about the scope of the UK's Online Safety Act (OSA), as part of its inquiry into online misinformation and harmful algorithms.
As with previous sessions, much of the discussion focused on the spread of disinformation during the Southport riots in 2024. During the session, the SITC also grilled government minister Baroness Jones about the implementation of the OSA, which came into force on 17 March 2025, and the application of the legislation to online misinformation and disinformation.
Mark Bunting, the director of online safety strategy delivery at Ofcom, for example, said that while the OSA contains provisions to set up an advisory committee on disinformation to inform Ofcom's work, the act itself contains no provisions to deal with disinformation.
During the previous SITC session, in which the committee grilled X (formerly Twitter), TikTok and Meta, each of the firms contended that they already have processes and systems in place to deal with disinformation crises, and that the OSA would therefore not have made a difference.
Bunting added that while the OSA does not cover misinformation directly, it did "introduce the new offence of false communications with an intent to cause harm, where companies have reasonable grounds to believe there is intent to cause harm".
Committee chair Chi Onwurah, however, said it would be difficult to prove this intent, and highlighted that there are no duties on Ofcom to take action over misinformation, even where there are codes about misinformation risks.
Jones, however, contended that misinformation and disinformation are covered by the OSA, and that it would have made a "material difference" if its provisions around illegal harms had been in force during the Southport riots.
"Our interpretation of the act is that misinformation and disinformation are covered under the illegal harms code and the children's code," she told MPs.
Talitha Rowland, the Department for Science, Innovation and Technology's (DSIT) director for security and online harm, added that it can be challenging to determine the threshold for illegal content, because it can be so broadly defined: "It can sometimes be illegal – it can be foreign interference, it can be content that incites hate or violence that meets that clear illegal threshold – but content can nevertheless be harmful to children, and that is captured."
In the wake of the riots, Ofcom did warn that social media firms would be obliged by the OSA to deal with disinformation and content that is hateful or provokes violence, noting that the act "will put new duties on tech firms to protect their users from illegal content, which includes content involving hatred, disorder, provoking violence or certain instances of disinformation".
Bunting concluded that platforms themselves want clarity over how to deal with disinformation on their services, and that Ofcom will continue to monitor case law developments around how the OSA can be applied in the context of misinformation, and update future guidance accordingly.
Updating the SITC on the progress made since the act came into force on 17 March, Bunting said that Ofcom has received around 60 safety assessments from platforms about the risks of various harms occurring on their services. These assessments are required to demonstrate to Ofcom how platforms are tackling illegal harms and working proactively to find and remove such content.
The risk assessment is the first step to compliance with Ofcom's illegal harms codes and guidance, which were initially published on 16 December 2024.
The codes outline various safety measures providers must put in place, which include nominating a senior executive to be accountable for OSA compliance; properly funding and staffing content moderation teams; improving algorithmic testing to limit the spread of illegal content; and removing accounts that are either run by, or on behalf of, terrorist organisations.
Companies at risk of hosting such content must also proactively detect child sexual exploitation and abuse (CSEA) material using advanced tools, such as automated hash-matching.
Ofcom previously said it would hold a further consultation in spring 2025 to expand the codes, which will include looking at proposals on banning accounts, crisis response protocols for emergency events such as the August 2024 riots in England, and the use of "hash matching" to prevent the sharing of non-consensual intimate imagery and terrorist content.