
Cyber vigilance and the human rights threats of the UK Online Safety Bill

On 29 June 2021, Izumi Nakamitsu, head of the UN Office for Disarmament Affairs (ODA), told the Security Council, at an Estonian-led meeting focused on security in cyberspace, that “we must remain vigilant” in the face of malicious technologies that “could imperil the security of future generations”. She warned that:

“Digital technologies are increasingly straining existing legal, humanitarian and ethical norms, non-proliferation, international stability, and peace and security.”

The UN’s concerns are not unfounded: according to the ODA chief, roughly 28.5 billion networked devices will be connected to the internet by 2022, an increase of more than 10 billion since 2017. As cyberspace becomes ever more prevalent in daily life, the security of digital data and the pervasiveness of online surveillance have become global concerns. Consequently, governments have started introducing and implementing laws that aim to regulate cyberspace. However, striking the balance between necessary internet regulation and excessive online control is a difficult task.

Most recently, the UK’s draft Online Safety Bill has raised concerns about its implications for freedom of speech. Civil liberties groups, academics, and the tech industry have identified several gaps in the legislation, which would hand disproportionate powers to Oliver Dowden, the UK’s Culture Secretary, and impose a duty of care on digital service providers in relation to identifying “harmful content”.

While the bill has good intentions – such as protecting internet users from illegal and harmful online content, protecting children from sexual exploitation and abuse, and promoting media literacy – its vague wording and lack of clear legal definitions risk unduly restricting freedom of expression online.

The bill runs the risk of over-censorship by encouraging platforms to moderate user-generated content. It puts providers of regulated user-to-user services or search services at the core of deciding what content can stay online, while threatening them with heavy penalties. Since harmful content is defined as content where the provider has “reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on a child (adult) of ordinary sensibilities”, providers will most likely default to the lowest common denominator to play it safe and avoid spending time and resources on content moderation. Interpreting and applying ambiguous terms such as “indirectly harmful” content or a “significant” impact on a person would be left to the provider. This has the potential to entrench unequal standards, infringe on individual rights, and hamper technological innovation. While making these judgement calls, providers are also asked to have regard to the importance of free speech and privacy. Hence, the burden of deciding which content is journalistic, political, or harmful will fall mainly on companies driven by commercial interests rather than by the protection of digital rights.

The independent Office of Communications (OFCOM) will act as the online safety regulator and will be able to fine providers up to £18 million or 10% of their annual global turnover for failing to meet their duties. Other enforcement powers will include enforcement notices, technology warning notices, business disruption measures, and potentially criminal liability for senior managers. As the regulator, OFCOM will also have the power to block access to sites.

The Committee to Protect Journalists (CPJ) criticized this approach:  

“It is for parliament to determine what is sufficiently harmful that it should not be allowed, not for OFCOM or individual platforms to guess.”

To help providers make decisions about online content, OFCOM would be in charge of drafting a Code of Practice, which requires approval by government ministers and the UK Parliament. This means that, in effect, the bill would allow Oliver Dowden to modify the Code of Practice to ensure it reflects government policy, a move that could undermine the regulator’s independence and politicise internet regulation. Ben Greenstone, managing director of Taso Advisory and former principal advisor to the minister responsible for online harms, has pointed out that the proposed Bill would give the Secretary of State “unprecedented power to direct an independent regulator”. This creates uncertainty for both individuals and online businesses, and could undermine the democratic process.

Additionally, the Secretary of State for Digital, Culture, Media and Sport will have the power to exempt services from the new law, or to remove exemptions. Heather Burns, policy manager at the Open Rights Group, commented on the proposed Bill:

“The notion that a political appointee will have the unilateral power to alter the legal boundaries of free speech based on the political whims of the moment frankly makes the blood run cold.”

The rules for what is acceptable online cannot depend on a single political figure. A shifting definition could have alarming effects on freedom of expression.

A DCMS spokesperson has responded to the criticism:

“Our world-leading laws will place clear and robust duties on in-scope companies and Ofcom to uphold and protect people’s free speech while making sure they do not over-remove content. The bill has been designed with suitable and transparent checks and balances so that Ofcom’s implementation of the laws delivers on the policy objectives decided and scrutinised by a democratically elected parliament.”

However, a group of digital rights NGOs has identified further points of concern in the draft Bill, particularly relating to end-to-end encryption. The Bill contains clauses that could in some cases prohibit the use of end-to-end encryption, leaving UK citizens with less online security than citizens of other democracies. Furthermore, British online businesses would have less protection for their data flows in London than in the United States or the European Union. The group, which includes ARTICLE 19, Big Brother Watch, Bikal, COADEC, Blacknight, and the Association for Proper Internet Governance, called on the Home Office to explain:

“how it plans to protect the British public from criminals online when it is taking away the very tools that keep the public safe. If the draft Online Safety Bill aims to make us safer, end-to-end encryption should not be threatened or undermined by this legislation.”

Effectively, under this new bill, commercial organizations overseen by a state regulator would be in charge of deciding which content should be protected and which content needs to be removed. While amendments to the laws governing cyberspace are both necessary and overdue, any new legislation must ensure that the human rights of individuals are upheld and that internet regulation is not politicised.
