
#HRC47 DAY 10: How to stop the torrent of disinformation without cutting off the flow of freedom of expression

How can states bridge the digital divide and build an open, fair and safe online environment while stemming the flood of disinformation? That was the conundrum under consideration at the 19th meeting of the 47th session of the Human Rights Council (HRC) on 2 July 2021.

Irene Khan, Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, explored the issues in her presentation of the Report on Disinformation and Freedom of Opinion and Expression.

The report, originally published in April 2021, examined the current threats posed by disinformation to human rights, democratic institutions and development processes.

Ms Khan described disinformation as “false or misleading information disseminated intentionally to cause serious social harm” and went on to label it as “interacting with political, social and economic grievances in the real world … undermining freedom of expression and democratic institutions, polarising political debates, fuelling public distrust and endangering human rights, public health and sustainable development.”

Noting that digital technology has “enabled pathways for false and manipulated information to be created, disseminated and amplified at scale by various actors with political, ideological or commercial motives”, Ms Khan highlighted how disinformation was being used to,

 “attack women, minorities, migrant and other marginalised communities, journalists, human rights defenders and political opponents. The impact on individuals, communities and institutions is real and deeply disturbing.”

In her report, Irene Khan criticised the responses by states and companies as problematic, inadequate and detrimental to human rights. However, she was unequivocal in her findings that,

 “access to diverse and reliable information sources, free, independent and diverse media, digital literacy and smart regulation of social media are the obvious antidotes to disinformation. Multifaceted multi-stakeholder responses, grounded in international human rights law, are the most effective way of building resilience against disinformation.”

 

State Responses 

State responses vary greatly, ranging from disrupting the internet and censoring social media platforms, through deploying different types of laws, to supporting digital literacy as a counter to disinformation.

Ms Khan noted that “States must refrain from sponsoring or spreading disinformation”, a remark that enraged some delegations:

China: “We urge countries concerned to immediately stop fabricating and spreading disinformation and refrain from using human rights as a political tool. We also ask the relevant special procedure mandate holders to respect the authoritative information provided by the governance of states so as to avoid being used by those with ulterior motives … countries continue to spread false information and lies based on their ideological and political purpose to attack other countries’ social systems. They politicise and stigmatize the epidemic.”

Venezuela: “We reject those interventionist policies carried out by certain powers that promote the manipulation of information for political purposes, including the financing of communication media under their control to influence media agendas that they themselves are the drivers and the target countries of the global south.”

Syrian Arab Republic: “The attitude of the British government institutions against the Syrian Republic over years by producing false media and material, broadcasting them through various platforms and training terrorists, while projecting them as humanitarian and media activists is an example of a state policy based on the art of lying. We do not agree with a report that disinformation is a consequence of societal crisis, and a breakdown of public trust in institutions.”

Russia: “We cannot agree with a conclusion that disinformation thrives where human rights are restricted, and public information regimes are weak. We have been witnessing the introduction of excessive restrictions, which infringe the rights to expression, particularly in countries of so-called advanced democracies.”

USA in response to Russia: “In Russia, the government uses the foreign agent law to target and harass independent media and civil society. Globally, governments use network restrictions or internet shutdowns to suppress the free and open exchange of information online. How can stakeholders best work together to counter disinformation while protecting freedom of expression online and offline?”

Some states even sponsor disinformation for political and strategic aims. The report cited examples of state-led disinformation in Myanmar relating to the Rohingya crisis, in the Philippines, and more broadly in relation to the COVID-19 pandemic. Internet shutdowns are a widespread phenomenon in many countries before or during elections, but they are also imposed to prevent violence and the spread of “fake news” during demonstrations and political unrest. The HRC and the report, however, condemn internet shutdowns that intentionally and arbitrarily prevent or disrupt access to information online:

“Shutting down the Internet is an inherently disproportionate response, given the blanket nature of the act, which blocks multiple other uses of the Internet. (…) In many cases, they appear to be aimed at silencing minority voices and depriving them of access to vital information.”

Nor has the use of criminal law to curb disinformation been welcomed by the Special Rapporteur. Given that existing laws, such as defamation, consumer protection or fraud provisions, already cover much of the illegal activity involved, laws criminalising the spread of false information are often disproportionate or overly vague. They tend to fail the tests of legality, necessity and legitimate aim required for restrictions on freedom of opinion and expression.

The report also warned states against granting authorities excessive discretionary powers or putting platforms themselves in charge of regulating online content: 

“The trend that sees States delegating to online platforms ‘speech police’ functions that traditionally belong to the courts has continued. The risk with such laws is that intermediaries are likely to err on the side of caution and ‘over-remove’ content for fear of being sanctioned.”

The Special Rapporteur was also concerned that freedom of expression was being curtailed through vague and discriminatory laws:

“Some governments have used these laws against journalists, political opponents and human rights defenders. Not only are such measures incompatible with international human rights law, they do little to combat disinformation.”

IOHR, in collaboration with the International Association of Human Rights Advocacy in Geneva and the Global Alliance against FGM, submitted an oral statement to the session to remind the Human Rights Council delegates of the continued use of anti-terrorism laws to arrest journalists in Turkey. The statement called on Turkey to align with its own constitution and international human rights standards and to free all media professionals charged as terrorists simply for exercising their right to freedom of opinion and expression.

The Czech Republic delegation also spoke out against the use of draconian laws to target media professionals, stating,

“Using ambiguous, vague and broadly defined national security laws and similar legal acts against journalists and political opponents is a blatant violation of freedom of expression, guaranteed by the ICCPR.”

As did the Canadian representative,

 “We note with concern the increasing use of fake news laws to criminalise journalists for doing their jobs. We condemn the use of state sponsored trolling and disinformation to silence journalists, disproportionately affecting those with marginalised and intersecting identities. This is unacceptable.”

The report noted only a few positive emerging trends. One was regulatory measures that focus on transparency and due process, such as the European Union’s draft Digital Services Act. The other was laws that restrict the amplification of illegal or harmful content in clear and precise language, suppressing only undue reach without infringing free speech. Bills along these lines have been introduced in France, Brazil and the US.

 The delegation from Canada summed up the sentiment of several states:

“Digital technologies and internet connectivity have been transformative for how people share information. At the same time, they’ve been used to proliferate disinformation at an unprecedented speed and scale, leading to significant harms to communities and individuals that are discriminated against or marginalised.”

 

Business Responses and Obligations

As well as focusing on state actors, the Special Rapporteur acknowledged that businesses – digital platforms such as Twitter and Facebook – have taken ‘some positive action’ to reduce the dissemination of fake news on their platforms. However, Ms Khan deemed these actions,

“simply not enough to make a meaningful difference in the absence of a serious review of the business model that underpins much of the drivers of disinformation and misinformation.”

The report highlighted how many business models currently thrive off the promotion of sensational content. The Special Rapporteur noted:

“The global disinformation system is a highly lucrative business that is driven by commercial motives and that is becoming increasingly professionalized. Technology companies are also purportedly allowing spreaders of misinformation to monetize their content.”

Ms Khan continued this theme in her speech to the delegates,

“company responses to disinformation have been reactive, inadequate and opaque. Algorithms, targeted advertising and data harvesting practices of the largest social media companies appear to be driving users towards extremist content and conspiracy theories in ways that feed and amplify disinformation.”

Businesses do not have the same human rights obligations as states. However, they are expected to conduct their operations in line with the UN’s Guiding Principles on Business and Human Rights. Picking up one of the central themes of the HRC’s 47th session, the Special Rapporteur highlighted that,

“neither states nor companies have done enough to address online gender disinformation that targets women, particularly women journalists, politicians, human rights defenders and gender advocates, in order to drive them out of public life. Women’s right to be free from violence and harassment must be ensured online as well as offline.”

The report acknowledged that many of the largest social media companies have already adopted a range of policies and tools to counter misinformation, including: banning “fake news”; applying labels to content regarded as harmful or misleading; closing accounts deemed to be undermining the integrity of the platform; and establishing fact-checking programmes.

 

Advertisement-driven business models

The report identifies a number of human rights issues with advertisement-driven business models, whereby algorithms harvest user data to better target adverts. 

The issues identified include: driving users towards “extremist” content, which undermines the right to form an opinion and freedom of expression; the systematic collection of user activity and targeted advertising, which can violate the right to freedom of opinion; a lack of transparency, which may infringe on the right to privacy; and, finally, evidence suggesting that records of people’s private thoughts could be used against them by malign actors in a discriminatory manner.

 Despite all of this, it is “not clear whether social media platforms have sought to review their business model as part of their human rights due diligence efforts.”

The Special Rapporteur concluded that:

“Companies should review their advertising models to ensure that they do not adversely impact diversity of opinions and ideas and are clear on the criteria used for targeted advertising. They should provide meaningful information about advertisers in online advertisement repositories and give users the choice to opt in to be exposed to advertising.”

 

Application of rules

Social media companies rely on a wide range of content policies to tackle disinformation. However, the report notes that the “applicable rules can be hard to find, strewn as they often are across various parts of the companies’ websites in community standards, policies, leadership statements, newsrooms, product information pages and business help centres.”

As well as this, definitions are often overly broad, leading to over-censorship. This problem is exacerbated by over-reliance on automated filters that “are unable to capture nuance or understand context.” The Special Rapporteur points out that it is crucial to incorporate human involvement in content removal decisions, concluding:

“Companies should adopt clear, narrowly defined content and advertising policies on disinformation and misinformation that are in line with international human rights law and after consultation with all relevant stakeholders. (…) They should ensure that all policies are easily accessible and understandable by users and are enforced consistently, taking into account the particular contexts in which they are applied.”

 

Remedies

Where companies have taken wrongful actions to tackle disinformation, they continue to fail to provide adequate remedies. Appeals mechanisms for wrongful decisions are “imperative” to protect freedom of expression. However, in many cases, the Special Rapporteur has found that the current appeals mechanisms fall short. Moreover, “it is unclear whether appeals mechanisms are available in a range of languages.”

 The Special Rapporteur also suggested that “proposals for third-party oversight bodies can be a valuable means of strengthening remedies”. The report pointed toward the Facebook Oversight Board stating: 

 “While it is too early to assess its effectiveness, it should be evaluated in due course through a transparent, multi-stakeholder participatory process, as it could yield valuable lessons for the sector.”

 

Geographical disparities

It was also noted that there was significant disparity between policies, even within the same company, based upon geographical location. While the United States benefits from special election centres and voting information centres, civic space is undermined in developing countries. 

The report highlights the fact that, because the largest companies are heavily influenced by US and European politics and public opinion, “they do not invest sufficient resources in understanding the local factors that feed disinformation online in other parts of the world, especially developing countries.”

 

Political pressure

The report drew out a number of issues in how companies approach public figures: companies inconsistently applied their terms of service to block legitimate content while simultaneously allowing potentially misleading and harmful content to be published by other public figures.

The absence of clear policies might result in companies being vulnerable to pressure to clamp down on legitimate political speech and facilitate state-instigated disinformation.

The Special Rapporteur recommended: 

 “They should adopt clear policies relating to public figures that are consistent with international human rights standards and apply them consistently across geographical areas.”

 

Transparency, accountability and access to data 

In no uncertain terms the report asserted that the “lack of transparency and access to data continue to be the major failings of companies across almost all the concerns in relation to disinformation and misinformation.”

Whilst some social media platforms produce transparency reports, “they do not share more precise and meaningful information about action taken to address disinformation or misinformation.”

The Special Rapporteur was at pains to highlight the contributions she had received for her report from 120 civil society organisations around the world, from large global organisations to small grassroots groups. IOHR was one of 95 civil society organisations accepted to give an oral statement, and Ms Khan, in answer to the criticism of bias from some states, “urge[d] governments to listen to civil society and to give them the space to be able to have consultations with you. That will only enrich your own responses.”

She went on to remind states that

“they have to respect, to protect and to fulfil freedom of opinion and expression. Respect, by not restricting it unlawfully, whether it is through internet shutdowns or laws on fake news (…) but respect also includes responding to their own obligation to ensure public information, and to protect diverse independent pluralistic media.”

Ms Khan implored all states to

 “close the digital divide, to enhance access, but at the same time also to educate people about digital technology and digital literacy. Digital technology is the lifeblood for modern economy and modern society, and we should treat it very carefully in those terms. (…) Freedom of expression is a very valuable right, it is as valuable for sustainable development, as it is for democracy.”

Closing her session, the Special Rapporteur reflected on her mandate,

“It is important for economic, social and cultural rights, as well as civil and political rights, and it is the lubricant in the human rights system. And let us therefore respect freedom of expression and opinion, and see it as the means for addressing this dark side of the internet.”
