Writing in The Times, Partner Ernest Aduwa examines the Law Commission’s recommendations on the reform of contempt of court laws and discusses how big tech companies must also be held to account when it comes to harmful online content.
Ernest’s article was published in The Times, 4 December 2025.
The Law Commission’s recommendations on contempt of court are a thoughtful and necessary modernisation of our justice system.
As a criminal lawyer, I regularly see the clash between fair trial rights and the public’s right to know. Current rules are often too rigid, creating a vacuum of official information that is instantly filled with damaging speculation. Defining active proceedings from the point of charge, rather than arrest, is therefore a welcome change. It sets a clearer line for the police and the media, and should help prevent the kind of dangerous information void we saw after the Southport tragedy.
The strength of the Commission’s report is its refusal to create absolute rules – recognising that context is everything. Publishing a suspect’s name and age is generally harmless, but the same cannot automatically be said for their immigration status or religion. This places a new responsibility on police forces to make careful, case-specific judgments, supported by strong legal advice. The goal is to replace misinformation with facts, not with other forms of potentially prejudicial data.
However, this legal reform only tackles one part of the problem Southport exposed: the deliberate and malicious spread of misinformation.
While the Commission’s proposals aim to starve the rumour mill of oxygen, we must also confront those who knowingly poison public discourse.
When misinformation leads to riots or threats of violence, the law’s response must be robust and meaningful. It is paramount that there are tougher punishments for this specific behaviour, with clear aggravating features. The scale of the harm must be a central factor.
Crucially, the law must also consider the status and influence of the speaker. An anonymous individual posting misinformed comments should not be treated the same as a public figure with a massive platform who deliberately incites a mob. The latter represents a far greater abuse and a more serious threat to public order.
Ultimately, creating new offences is a reactive step, dealing with the problem only after the damage is done. Instead, we must also look proactively at the platforms that enable this spread. The internet has become an unruly playground where anonymity often provides a licence for harm without consequence. People would think twice if their digital identity was linked to their real-world identity and financial footprint. There is no compelling reason why internet providers cannot implement more forceful regulation of their services. They profit from engagement, but they must be held responsible for the preventable harm that unfolds on their platforms.
The Law Commission’s work is a vital piece of the puzzle, but we must also create a system where there is a significant price to pay for those who pollute public discourse – and for the companies that give them the megaphone.