
Regulating Private Intermediaries

  1. Freedom of Speech Online
  2. Racism, misogyny and abuse: the internet has problems
  3. State Intervention
    1. Private Regulation and The Power of Platforms
    2. Platform Responsibility: Imposing Obligations on Private Intermediaries

Freedom of Speech Online

Overview of Online Freedom of Speech Issues by Nic Suzor

The great promise of the internet is to democratise speech. It gives ordinary people the ability to be heard by massive audiences in a way that broadcast and mass media never did. When Time Magazine named ‘You’ the person of the year in 2006, it captured a sense of great optimism about the democratic potential of the internet.

Article 19 of the ICCPR protects the freedom to seek and impart information. In many cases, the freedom that the internet provides seems almost unlimited. Stewart Brand famously said in 1984 that ‘information wants to be free’, and John Gilmore explained that regulating speech on the internet was extremely difficult: ‘the net interprets censorship as damage and routes around it’. However, there are conflicts and a series of difficult issues that we will explore below.

Racism, misogyny and abuse: the internet has problems

Lucinda Nelson on racism on social media in 2020

Social media has made spreading abusive content easier, but regulating it more difficult. Activists have long demanded action from social media platforms to address hateful content that they help distribute online. Three of the central, overarching demands are: increased transparency; more proactive efforts; and greater engagement with experts and marginalised populations.

Although there have been some positive responses to these demands, social media platforms have been reluctant to make changes that effectively address abuse.

State Intervention

There are serious problems with direct state intervention and with abuse of state-created rules. There are ongoing debates about the extent to which states should regulate speech online. In Australia, the Federal Government sought to introduce a filtering system that would restrict access to speech that was ‘offensive’, including speech that was not illegal. In many countries around the world, governments are requiring online intermediaries to censor information in ways that likely violate Article 19. Russia, for example, has leant on Twitter to block pro-Ukrainian activists; Turkey has required Twitter to block content within Turkey from anti-government protestors.

There are also serious problems with notice-and-takedown. Notice-and-takedown is a state-created response to the need for effective ways to police the internet, but it leaves open massive potential for abuse. There’s a serious procedural problem here: intermediaries are threatened with liability if they don’t remove allegedly infringing content, but they’re not in a position to know whether the material is actually unlawful or protected speech. They’re not courts, and can’t legitimately make this decision.
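To make the incentive problem concrete, here is a minimal sketch of the decision an intermediary faces when a takedown notice arrives. Everything in it is a simplifying assumption for illustration (the names, the data shapes, and the liability logic), not any statute’s actual test:

```python
from dataclasses import dataclass

@dataclass
class TakedownNotice:
    content_id: str
    claimant: str
    claim: str  # e.g. 'copyright infringement'

def handle_notice(notice: TakedownNotice, hosted_content: dict) -> str:
    """Hypothetical host logic under a notice-and-takedown regime.

    The host cannot reliably judge whether the material is actually
    unlawful or protected speech (fair dealing, parody, criticism);
    that is a question for a court. But keeping content up after a
    notice risks liability, while removing it costs the host nothing.
    """
    if notice.content_id not in hosted_content:
        return 'nothing to do'
    # Note what is missing here: no adjudication, no hearing, no appeal.
    # Receipt of the notice alone, not a finding of unlawfulness,
    # triggers removal, which is how lawful speech gets over-removed.
    del hosted_content[notice.content_id]
    return 'removed'
```

The point of the sketch is structural: as long as the rational move for the host is always ‘removed’, the regime delegates a judicial question to a private actor with no incentive to get it right.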

Private Regulation and The Power of Platforms

Private organisations are largely responsible for determining what information we can communicate and seek. Intermediaries like Google control what information turns up in search results. Social networks like Facebook make important decisions about what information we see; Facebook has even admitted to manipulating the content of users’ news feeds to drive changes in mood. Because Facebook’s goal is to sell advertising, it has a strong incentive to show us the most profitable content. This is a huge amount of power over human thought.
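A crude sketch can show why that incentive matters. If a feed ranks posts purely by predicted engagement, it will systematically favour whatever keeps people clicking, regardless of accuracy or civility. The scoring weights and field names below are invented for the illustration; they are not Facebook’s actual algorithm:

```python
# Illustrative only: a feed ranked purely by predicted engagement.
# The fields and weights are made up for the example.

def engagement_score(post: dict) -> float:
    # Sensational and outrage-driven content tends to predict clicks
    # and shares well, so a pure engagement objective quietly rewards it.
    return 2.0 * post['predicted_clicks'] + 1.5 * post['predicted_shares']

def rank_feed(posts: list[dict]) -> list[dict]:
    """Order the feed by what is most engaging (and so most profitable)
    to show, not by what is accurate, civil, or important."""
    return sorted(posts, key=engagement_score, reverse=True)
```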

All of these companies also have standards that determine what content is acceptable. So, for example, male nipples are OK on Facebook, but female nipples are not. Pictures of beheadings are OK, but pictures of mothers breastfeeding are not. Pictures of marijuana are OK, but pictures of other drugs are not.

These private organisations are increasingly important in how we access information, but they are not bound by constitutional protections of free speech.

Platform Responsibility: Imposing Obligations on Private Intermediaries

Video Overview of the UN Guiding Principles on Business and Human Rights by the Danish Institute for Human Rights

The new gatekeepers are private actors who have the power to control speech but no real responsibilities. On the one hand, we often don’t know when they’re censoring speech; on the other, there is a great deal of abusive speech, hate speech, and vilification that is difficult to respond to or control.

There are ongoing battles to try to get more transparency about how private entities make decisions. These new gatekeepers are under increasing pressure to justify the policies they make and the way those policies are enforced.

This becomes even more important when we look at the conflict between freedom of speech and other legitimate legal rights. For example, there are many who complain that private companies do not do enough to limit the spread of hate speech. In recent years, there has been a lot of publicity about gender-based hate speech in particular, but there are serious questions about how well the social networks that control speech encourage and protect minority viewpoints in general.

If we think of freedom of speech not just as a negative right to be free from overt state interference, but as a thicker substantive right to maintain and express one’s opinions, there is a real conflict here. Minority voices are being drowned out by abuse¹ or silenced by algorithms, filters, and moderators with inbuilt majoritarian biases.

This represents a key tension between the right to freedom of expression and the ability to actually enforce legal rules and social norms. Private intermediaries are increasingly being asked to do more, but they don’t have the legitimate authority of courts. If they don’t do more, though, people get hurt. Finding a way to balance these tensions is one of the key challenges for regulating the internet.

The United Nations Guiding Principles on Business and Human Rights explain that businesses have a responsibility to address human rights abuses with which they are involved. Civil society groups are increasingly seeking to get intermediaries to cooperate; using the language of ‘responsibility’, they’re trying to drive change in the policies of platforms.

This work proceeds on two fronts: seeking more transparency and accountability when platforms censor information or hand over personal details at the request of governments, and when platforms remove, or refuse to remove, content that violates their terms of service. Across many of these services, people are also increasingly looking for technical ways to achieve regulation (see, for example, Twitter’s block lists).
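As a deliberately simplified illustration of this kind of technical self-help, the sketch below shows the principle behind a shared block list: a subscriber merges community-maintained lists of abusive accounts with their own, and filters their timeline accordingly. The data shapes and function names are assumptions for the example, not Twitter’s actual implementation:

```python
# A minimal sketch of shared block-list filtering. Illustrative only;
# real platforms expose this functionality through their own APIs.

def merge_block_lists(*lists: set[str]) -> set[str]:
    """Combine several community-maintained block lists into one set."""
    merged: set[str] = set()
    for blocked in lists:
        merged |= blocked
    return merged

def filter_timeline(posts: list[dict], blocked: set[str]) -> list[dict]:
    """Drop posts authored by any blocked account before display."""
    return [post for post in posts if post['author'] not in blocked]

# Subscribing to someone else's list extends your own blocking decisions
# with theirs: regulation by private, voluntary coordination.
my_blocks = {'abusive_account_1'}
community_list = {'abusive_account_2', 'abusive_account_3'}
blocked = merge_block_lists(my_blocks, community_list)
timeline = [
    {'author': 'friend', 'text': 'hello'},
    {'author': 'abusive_account_2', 'text': 'harassment'},
]
print(filter_timeline(timeline, blocked))  # only the friendly post remains
```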

  1. For an interesting overview of hate speech in online gaming platforms, see this video by GAMBIT: http://video.mit.edu/watch/gambit-hate-speech-video-7031/