Ofcom Pressed to Clarify Online Content Rules After Palestine Action Ban Overturned

Rights Groups Demand Clarity from Ofcom Following Court Ruling on Palestine Action Ban

Human rights organizations, academic experts, and prominent writers have issued an urgent call for Ofcom to clarify the implications of a recent High Court decision that declared the ban on Palestine Action unlawful. This demand comes as the Home Secretary prepares to appeal the judgment, creating significant uncertainty about how online platforms should handle content related to the direct action group.

Confusion Over Online Safety Duties and Content Removal

The Metropolitan Police has already announced that officers will no longer arrest individuals at protests who express support for Palestine Action. However, the signatories of a formal letter to Ofcom argue that the situation remains dangerously unclear for digital platforms. These companies face legal obligations under the Online Safety Act to remove terrorist content promptly, yet the court's ruling has thrown the status of Palestine Action material into question.

Open Rights Group, Amnesty International UK, Big Brother Watch, Access Now, and several other organizations have asked the communications regulator to provide immediate guidance. Specifically, they seek clarification on whether platforms are still expected to remove content associated with Palestine Action, how new terrorist content removal duties will be implemented, and whether previously removed content should be restored if the government loses its appeal.

Vague Definitions and Censorship Concerns

Sara Chitseko, the pre-crime programme manager at Open Rights Group, emphasized the broader risks at play. "The UK's vague definition of terrorism and legal duties under the Online Safety Act already risk content being wrongly defined as illegal and removed," Chitseko stated. "Now there is additional confusion over whether tech companies are targeting and removing online content relating to Palestine Action."

Chitseko further argued that "in light of the court's judgment and commentary on freedom of expression, Ofcom need to provide immediate guidance to ensure that important public debates about Palestine are not being censored."

Legal Limbo and Platform Responsibilities

Last week, judges decided that the proscription order banning Palestine Action under anti-terrorism laws would remain in place pending the appeal by the home secretary, Shabana Mahmood, against the High Court's decision. This creates a contradictory legal position: content supportive of Palestine Action must technically be removed when platforms discover it or receive reports, even though the court found the ban itself unlawful.

The signatories to the letter, who also include Statewatch, Netpol, Article 19, and forensic computer expert Duncan Campbell, urge Ofcom to follow the Metropolitan Police's example by clarifying the situation during the appeal process. They warn that the issue will become even more urgent when new requirements to proactively scan for illegal content, restrict livestreaming, and apply algorithmic suppression come into force, as expected, later this year.

Escalating Content Removal and Algorithmic Suppression

The letter highlights that the proscription of Palestine Action "raised serious concerns about the criminalisation of political expression" and notes there has been a significant escalation in content removals across major platforms including Instagram, TikTok, and X. This includes the use of algorithms to hide Palestine solidarity posts and cases where individuals have faced police action for expressing political views online.

"The High Court ruling should be a turning point," the letter asserts. "It demonstrates how easily counter-terror powers and platform regulation can be used to silence debate and suppress dissent and how difficult it is to undo those harms once systems of censorship and surveillance are put in place."

Ofcom's Response and Ongoing Uncertainty

When contacted for comment, Ofcom did not directly address the specific situation regarding Palestine Action content pending the appeal. A spokesperson stated: "Under the Online Safety Act, tech firms must take down illegal terrorist content swiftly when they become aware of it. There's no requirement on sites and apps to restrict legal content for adult users. In fact, in carrying out their duties to keep people safe, the act requires platforms to have particular regard to the importance of protecting users' right to freedom of expression."

This response has done little to alleviate concerns among rights groups, who continue to press for specific guidance that addresses the unique circumstances created by the court's ruling and the pending government appeal.