Code is Law

Video Overview by Nic Suzor

Lawrence Lessig's famous phrase 'code is law'1) has been hugely influential in debates about internet governance and regulation. Lessig argues that the rules embedded in the software that controls online communication can be as powerful as, or more powerful than, the legal rules of nation states in regulating behaviour. He identifies four key 'modalities of regulation', each of which constrains behaviour in a different way:

  1. Law;
  2. Architecture;
  3. Market; and
  4. Norms.

As law students, we are accustomed to thinking of regulation primarily as law. However, as Lessig points out, law is not the only form of regulation in everyday life. Lessig gives the example of the speed bump as a way to regulate traffic speed. A speed bump is an architectural, and arguably more cost-effective, solution to the problem of speeding than hard regulation in the form of laws.

Applying Lessig's Modalities of Regulation to the Online Environment

Video Overview by Nic Suzor: Examples of Regulation - Content and Copyright

Lessig outlines how the four types of regulation work in cyberspace:

  • Law: laws such as copyright, defamation and obscenity threaten ex post sanctions for the violation of legal rights;
  • Norms: what you can say on a particular website is influenced by the nature of that site;
  • Markets: price structures and busy signals constrain access, some areas of the web charge for access, advertisers reward popular sites, and online services drop low-population forums; and
  • Architecture: the software and hardware that make cyberspace what it is constrain how you can behave online, through password requirements, traces that link transactions to you, and encryption.

We can apply these four modalities to different regulatory issues about internet content. Say, for example, we are concerned about offensive content on the web. Surveys suggest that over a third of Australians think there is too much offensive content on the internet. We could create a law against offensive material, but it would be very difficult to enforce. In fact, we already have several such laws - we have common law obscenity offences, as well as a content classification scheme that allows people to make complaints about content online. All of these are practically useless, though, where content is hosted in foreign jurisdictions.

A code-based approach to regulating offensive content might be to introduce mandatory filtering at the ISP level.
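To make the idea concrete, the sketch below shows what ISP-level filtering might look like as code. It is a minimal illustration only: the blocklist entries and the handle_request function are invented for this example, and real filters typically operate at the DNS or packet level rather than on URLs in application code.

<code python>
# A minimal sketch of ISP-level filtering using a simple hostname blocklist.
# The blocklist entries and this interface are hypothetical, for illustration.
from urllib.parse import urlparse

BLOCKLIST = {"example-offensive-site.com", "another-blocked-site.net"}

def handle_request(url: str) -> str:
    """Return the response an ISP-level filter might give for a requested URL."""
    host = urlparse(url).hostname or ""
    if host in BLOCKLIST:
        return "403 Forbidden: this site is blocked under the filtering scheme"
    return f"200 OK: fetching {url}"

print(handle_request("http://example-offensive-site.com/page"))  # blocked
print(handle_request("http://example.org/"))                     # allowed
</code>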

A market-based approach might include subsidising voluntary filters that parents can install on their home internet connections.

Alternatively, we might investigate how to develop social norms around acceptable behaviour. This is harder to articulate, but we see it happening a lot in our society: think of the moral outrage, splashed across all of the papers and complete with outraged quotes from the PM's office, when someone defaces a memorial page on Facebook. This is the work that creates a shared social norm about what content or behaviour is permissible and what is not.

Importantly, these modalities are never really independent - they all interact in interesting ways. So, for example, the market is starting to respond to concerns about offensive content, and social network platforms like Facebook, YouTube, and Twitter are modifying their code to allow people to report or flag offensive content. This market-based initiative leverages code and social norms to regulate the massive amounts of material that are posted to these networks every day.
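The sketch below illustrates one way such a flagging mechanism might be wired together: individual users express a social norm by flagging a post, and the code escalates the post for human review once enough distinct users object. The Post class, flag function and review threshold are all hypothetical; real platforms weight flags by reporter reliability and combine them with automated classifiers.

<code python>
# A toy sketch of community flagging: users (norms) trigger review (code).
# The Post class, flag() and the threshold are invented for illustration.
from dataclasses import dataclass, field

REVIEW_THRESHOLD = 3  # distinct flags needed before a moderator sees the post

@dataclass
class Post:
    text: str
    flaggers: set = field(default_factory=set)
    under_review: bool = False

def flag(post: Post, user_id: str) -> None:
    """Record a flag; queue the post for review once enough distinct users object."""
    post.flaggers.add(user_id)  # a set stops one user flagging repeatedly
    if len(post.flaggers) >= REVIEW_THRESHOLD:
        post.under_review = True

post = Post("some arguably offensive material")
for user in ("alice", "bob", "carol"):
    flag(post, user)
print(post.under_review)  # True: enough users objected to trigger review
</code>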

Case Study: YouTube's ContentID system

Overview by Alex McKay

Another example is copyright infringement. In the late 1990s, the copyright industries' answer to the problem Napster posed was to turn to the courts. The courts eventually held that Napster was liable for copyright infringement, and the service was shut down.

When that didn't stop filesharing, the industries turned to marketing to try to create strong social norms against copying – you wouldn't steal a car, right?

Over the last decade, working with YouTube and others, rightsholders have been able to develop new technologies to detect potential copyright infringement and deal with it automatically. YouTube's ContentID, for example, automatically detects when a person uses copyrighted music in their video, and copyright owners are presented with an easy choice: block access to the video, remove the soundtrack, leave it alone, or run ads alongside it. This has been a massively important tool for rightsholders.

Finally, there have been some market innovations over the last few decades as well. iTunes eventually emerged to satisfy some of the demand from music fans for cheap and easy digital downloads. Spotify and now Apple Music have gone further, providing fans with all-you-can-eat subscriptions so that they can enjoy the abundance that Napster brought, legally.
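The sketch below gives a highly simplified picture of how a ContentID-style pipeline might work. YouTube's actual matching relies on proprietary audio fingerprinting, so the fingerprints, reference database and function names here are invented; only the menu of policies (block, mute, monetise, or leave alone) reflects the choices described above.

<code python>
# A highly simplified sketch of a ContentID-style pipeline. Fingerprints here
# are plain strings and the reference database is a dict; YouTube's real
# system uses proprietary audio fingerprinting. All identifiers are invented.

REFERENCE_DB = {
    "fp-12345": {"owner": "Example Records", "policy": "monetise"},
    "fp-67890": {"owner": "Example Films", "policy": "block"},
}

def apply_policy(video_id: str, fingerprint: str) -> str:
    """Match an upload's audio fingerprint and apply the rightsholder's chosen policy."""
    match = REFERENCE_DB.get(fingerprint)
    if match is None:
        return f"{video_id}: no match found, publish normally"
    owner, policy = match["owner"], match["policy"]
    if policy == "block":
        return f"{video_id}: access blocked at {owner}'s request"
    if policy == "mute":
        return f"{video_id}: soundtrack removed"
    if policy == "monetise":
        return f"{video_id}: published with ads, revenue to {owner}"
    return f"{video_id}: left alone, viewing stats tracked for {owner}"

print(apply_policy("vid-001", "fp-12345"))  # monetised by Example Records
print(apply_policy("vid-002", "fp-00000"))  # no match, publishes normally
</code>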

If you define 'regulation' as a concerted effort to change the way another person behaves, there are many different ways of achieving that goal. Lessig's point is that law is only one of the ways to regulate. When we are thinking about internet regulation, we need to be aware of the ways in which behaviour can be altered, and the limits of any given modality.

Lessig's work is also important in pointing out the hidden ways in which code regulates. This often goes unnoticed, but in every piece of software, in every algorithm, there are hidden assumptions about how the world works or should work. Sometimes this is accidental: for example, many websites are inaccessible to people with print disabilities, because they are not designed with this user group in mind. It takes a lot of vigilance to ensure that technologies are developed in a way that does not unintentionally exclude or limit the access of certain groups of people.

Other times, though, code acts in a much more sinister way. We have no real understanding of the algorithms that Facebook or Google use to determine which content is visible to us. The news items that pop up in our feeds, and the results of our searches, are all determined by a set of algorithms that are ultimately designed to further the interests of private corporations. These are powerful algorithms – powerful mechanisms of regulation that we do not really understand, and we certainly do not know whether or how we should regulate their design or use.
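A toy example can make the point about hidden assumptions concrete. In the sketch below, a feed-ranking function scores posts using invented weights; nothing in the output tells the user that predicted engagement and advertiser value have quietly been given far more influence than recency. The weights and field names are hypothetical, not a description of any real platform's algorithm.

<code python>
# A toy news-feed ranker illustrating how design choices hide inside code.
# The weights below are invented: by privileging predicted engagement and
# advertiser value over recency, the score quietly encodes the platform's
# commercial interests rather than any neutral notion of relevance.

def feed_score(post: dict) -> float:
    """Score a post for ranking; higher scores surface first in the feed."""
    return (
        0.6 * post["predicted_engagement"]  # keeps users on the site
        + 0.3 * post["advertiser_value"]    # furthers the platform's revenue
        + 0.1 * post["recency"]             # what users might assume dominates
    )

posts = [
    {"id": "news", "predicted_engagement": 0.2, "advertiser_value": 0.1, "recency": 0.9},
    {"id": "viral", "predicted_engagement": 0.9, "advertiser_value": 0.8, "recency": 0.2},
]
for post in sorted(posts, key=feed_score, reverse=True):
    print(post["id"], round(feed_score(post), 2))  # 'viral' outranks fresher 'news'
</code>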


1) Lawrence Lessig, Code 2.0 (2006) pp 121–26, http://codev2.cc/download+remix/Lessig-Codev2.pdf