Content Regulation and Online Classification

Video Overview of Online Content Regulation in Australia by Nicolas Suzor

Australia has a co-regulatory content regulation scheme. Telecommunications providers work together to develop industry codes of practice, which are overseen by the Australian Communications and Media Authority (ACMA) under Schedules 5 and 7 of the Broadcasting Services Act 1992 (Cth). Schedule 5 applies to material hosted outside Australia, and Schedule 7 applies to material hosted within Australia. Australia also has an Office of the eSafety Commissioner, which has powers to investigate and address complaints about internet content. The Communications Alliance is the industry body that represents ISPs and content hosts.

Video Overview of Australia's Classification Ratings by Emily Rees

The National Classification Code provides that classification decisions are to give effect, as far as possible, to the following principles: (a) adults should be able to read, hear and see what they want; (b) minors should be protected from material likely to harm or disturb them; (c) everyone should be protected from exposure to unsolicited material that they find offensive; and (d) the need to take account of community concerns about: (i) depictions that condone or incite violence, particularly sexual violence; and (ii) the portrayal of persons in a demeaning manner.

Publications, films, and computer games are rated by the Classification Board according to the Classification Guidelines. Each State and Territory determines the consequences of classification. The ratings systems differ by media:

  • Films: G, PG, M, MA15+, R18+, X18+
  • Publications: Unrestricted, Unrestricted (M), Category 1 Restricted, Category 2 Restricted
  • Games: G, PG, M, MA15+, R18+

Classification Guidelines: R18+

  • High impact violence, simulated sex, drug use, nudity
  • No restrictions on language

Classification Guidelines: X18+

  • Real depictions of sexual intercourse and sexual activity between consenting adults
  • No depiction of violence or sexual violence
  • No sexually assaultive language
  • No consensual activities that ‘demean’ one of the participants
  • No fetishes (such as body piercing, candle wax, bondage, fisting, etc.)
  • No depictions of anyone under 18, or of adults who look under 18.

Classification Guidelines: Refused Classification

“Publications that appear to purposefully debase or abuse for the enjoyment of readers/viewers, and which lack moral, artistic or other values to the extent that they offend against generally accepted standards of morality, decency and propriety will be classified ‘RC’.”

For films, anything that exceeds X18+ is Refused Classification. For games, anything that exceeds R18+ is RC (an R18+ category for games was introduced in 2012).

Classification Guidelines: RC (Films)

  • Detailed instruction in crime or violence
  • Descriptions or depictions of child sexual abuse or any other exploitative or offensive descriptions or depictions involving a person who is, or appears to be, a child under 18 years.
  • Violence: Gratuitous, exploitative or offensive depictions of:
    • violence with a very high degree of impact or which are excessively frequent, prolonged or detailed;
    • cruelty or real violence which are very detailed or which have a high impact;
    • sexual violence.
  • Sexual activity: Gratuitous, exploitative or offensive depictions of:
    • activity accompanied by fetishes or practices which are offensive or abhorrent;
    • incest fantasies or other fantasies which are offensive or abhorrent.
  • Drug use:
    • Detailed instruction in the use of proscribed drugs.
    • Material promoting or encouraging proscribed drug use.

Diagram by Sophie Murdock

The online classification scheme is complaints-based. When the eSafety Commissioner receives a complaint, the Commissioner applies the Guidelines to form a preliminary view as to whether the content is 'potential prohibited content'. If the content has not yet been classified, the Commissioner will obtain a classification from the Classification Board.

Video Overview of the complaints system under Sch 7 by Hannah Burnett

Under Schedule 7, members of the public can make complaints about prohibited or potentially prohibited content,1) which the Commissioner must investigate. Content that is prohibited in Australia is defined as material that is or would be any of the following (a simplified sketch of this logic appears after the list):2)

  • Refused Classification (content that exceeds the limits of all other categories. RC content is not necessarily unlawful to possess or access, but it is unlawful to broadcast, sell, or screen publicly)
  • Rated X18+ (films that depict sexually explicit content);
  • Rated R18+ (films or computer games with high-impact violence, sex scenes or drug use) without a restricted access system; or
  • Audiovisual content that is Rated MA15+ ('strong content' that is legally restricted to persons 15 years and over), provided over a commercial service, and not subject to a restricted access system.
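The following is a simplified, illustrative sketch (in Python) of how these categories combine to determine whether content is 'prohibited'. The function and parameter names are invented for this example; in practice the question turns on the legislation and a classification decision by the Board, not on code.

```python
# Illustrative sketch only: a simplified model of the 'prohibited content'
# categories under Schedule 7, as summarised in the list above.
# All names here are hypothetical.

def is_prohibited(classification: str,
                  has_restricted_access_system: bool,
                  is_commercial_audiovisual: bool = False) -> bool:
    """Return True if content would be 'prohibited' under the simplified rules above."""
    if classification == "RC":          # Refused Classification
        return True
    if classification == "X18+":        # sexually explicit films
        return True
    if classification == "R18+":        # prohibited unless behind a restricted access system
        return not has_restricted_access_system
    if classification == "MA15+":       # only commercial audiovisual content without a RAS
        return is_commercial_audiovisual and not has_restricted_access_system
    return False                        # G, PG, M and similar ratings are not prohibited

# Example: an R18+ film served without any restricted access system
print(is_prohibited("R18+", has_restricted_access_system=False))  # True
```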

If prohibited content is available from servers with an ‘Australian connection’, the Commissioner must order the service provider to remove or stop serving the content. This is done in different ways for different types of content:

  • For hosted content, the Commissioner must serve the hosting service a ‘final take-down notice’.3)
  • For live-streamed content, the Commissioner must give the provider a ‘final service-cessation notice’.4)
  • Where prohibited content is linked to by an Australian indexing service (or search engine), the Commissioner must give the provider a ‘final link-deletion notice’.5)

To determine the applicable rating, internet content is evaluated as if it were a ‘film’6) under the Classification Guidelines. If content is ‘potentially prohibited’, the Commissioner must issue an interim notice and apply to the Classification Board for classification.

'Restricted Access System'

A restricted access system must comply with ACMA’s Restricted Access Systems Declaration 2007. In summary, the system must:

  • require an application for access to R18+ content;
  • provide warnings and safety information for R18+ content;
  • verify the age of applicants;
  • limit access to R18+ content, or to R18+ and MA15+ content, as required;
  • include a risk analysis;
  • include quality assurance measures for R18+ content; and
  • meet the record-keeping requirements for R18+ content.
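At its core, a restricted access system is an access-control check. The sketch below is a hypothetical, minimal illustration in Python of the age-verification and access-limiting requirements only; it does not model the warnings, risk analysis, quality assurance or record-keeping requirements, and all names are invented for this example.

```python
# Hypothetical sketch of the access-control element of a restricted access
# system: verify an applicant's age before serving restricted content.
from datetime import date
from typing import Optional

def age_in_years(date_of_birth: date, today: Optional[date] = None) -> int:
    """Whole years elapsed since date_of_birth."""
    today = today or date.today()
    had_birthday = (today.month, today.day) >= (date_of_birth.month, date_of_birth.day)
    return today.year - date_of_birth.year - (0 if had_birthday else 1)

def may_serve(rating: str, date_of_birth: date) -> bool:
    """Apply minimum-age limits for restricted ratings."""
    if rating == "R18+":
        return age_in_years(date_of_birth) >= 18
    if rating == "MA15+":
        return age_in_years(date_of_birth) >= 15
    return True  # unrestricted ratings (G, PG, M) need no age gate

print(may_serve("R18+", date(1990, 5, 17)))   # True: applicant is over 18
```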

Review by the AAT (cl 113)

Review is available as of right, but only to the service provider concerned. (Who is the service provider?)

The ACMA may replace an industry code of practice with new standards

See explanation by Nicholas Cooper

Where relevant industry codes have failed or have not been adopted, the ACMA may replace them with its own standards. The ACMA's ability to replace an industry code with a standard varies depending on the legislation that governs the industry.

Generally, the ACMA must request that an association representing the industry develop a replacement code within 120 days. If no such association exists, the ACMA must publish the request in the Gazette. If the developed code is deficient, or no association comes forward, the ACMA may replace the industry code with a standard. In some industries, the appropriate Minister may direct the ACMA to establish a standard to replace a code.

Video Overview of the Operation of Sch 5 for Internationally Hosted Content by Anon

Legislation provides for the blocking of content if there is no industry code in force. However, there is currently an industry code in force, and it does not require blocking. Under the code, the URLs of prohibited content are provided to vendors of voluntary filtering software, but no further action is taken in relation to overseas-hosted content.

Gab Red Explains How s 313 Is Used by Government Agencies to Block Websites

Matt Cartwright Explains the Recommendations of the Recent Inquiry Into the Use of s 313

Video overview of image-based abuse laws by Danielle Harris

The non-consensual sharing of intimate images is often colloquially referred to as 'revenge porn'. The term 'image-based abuse' is generally considered to be a more accurate term that avoids the victim-blaming connotations of 'revenge porn'.

The National Statement of Principles Relating to the Criminalisation of the Non-consensual Sharing of Intimate Images encouraged each Australian jurisdiction to adopt nationally consistent criminal offences.

The Commonwealth has introduced a new criminal offence that prohibits posting, or threatening to post, intimate images without consent.7) It also introduced a complaints-based system under which the eSafety Commissioner may issue a removal notice or grant another civil remedy upon receipt of a victim’s complaint.8)

Queensland extended the definition of ‘intimate images’ to include original or altered (‘photoshopped’) still or moving images of a person engaged in intimate sexual activity; of a person's bare genital or anal region; or of a female, transgender or intersex person's breasts.9)

The State also introduced three new misdemeanours into its Criminal Code to broaden the scope of conduct captured by the offences. These are distributing intimate images,10) observing or recording in breach of privacy,11) and distributing prohibited visual recordings.12)

Lauren Trickey explains how to make a complaint to the eSafety Commissioner

The eSafety Commissioner is a statutory office established by the Enhancing Online Safety Act 2015 (Cth) to promote and enhance online safety. The Commissioner can receive reports of cyberbullying, image-based abuse, and offensive or illegal content.

Cyberbullying complaints can be made by an Australian child, or by a parent, guardian or other person authorised by the child. Cyberbullying refers to online material intended to seriously threaten, intimidate, harass or humiliate an Australian child.

Image-based abuse occurs where a person shares, or threatens to share, an intimate image of another person without their consent. Complaints can be made by the person shown in the image, by a person authorised to report on their behalf, or by a parent or guardian of a child or of a person who lacks capacity. The Commissioner can only assist if the image is hosted in Australia, or if the person in the image or the person who posted it resides in Australia.

Australian residents can also report offensive or illegal content, which includes abhorrent violent material or material depicting illegal acts.

For each type of material an online form can be completed on the eSafety website. Each form requests information regarding what is contained or depicted in the material and where the material has been posted. After receiving a complaint, the Commissioner assesses the material complained of to determine the appropriate course of action, which may include liaising with the relevant platform for the material to be removed.

See overview by Georgie Vine about the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019

The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 creates new offences under the Criminal Code, effective from 6 April 2019. These provisions require social media platforms and other services to remove abhorrent violent material as soon as reasonably possible, and to refer it to the Australian Federal Police. The offences target content that is reasonably capable of being accessed within Australia, regardless of where the platform operator is located. Substantial penalties apply where individuals or companies fail to remove or report such material: fines of up to 10% of annual global turnover for companies, and up to 3 years' imprisonment for individuals.

There are a few defences to the new offences, including material distributed by journalists; material used for scientific, medical, academic or historical research; and the exhibition of artistic works.

Deepfakes

See an explanation of deepfakes by Eric Briese

A deepfake is a video manipulated using artificial intelligence and deep learning so that one person's appearance is replaced with that of someone else. While fake and edited videos are nothing new, this recent phenomenon has caused concern because of how easy and accessible their creation has become.

The concern surrounding deepfakes stems from their potential for misuse, for example as false evidence, for blackmail, or for personal attacks. The potential harm deepfakes can cause is magnified by the reach of the internet in an increasingly interconnected world.

Deepfakes are not explicitly illegal, but depending on how they are used they may be captured by other laws. For instance, distributing deepfake pornography is captured by section 223 of the Criminal Code (Qld), which prohibits distributing intimate images (including altered images).

Regulating deepfakes is a difficult challenge. One possible response is to build detection algorithms that leverage the same deep learning techniques to identify deepfakes automatically, but this approach is not perfect. Companies are also working towards viable solutions, for example by releasing collections of deepfakes to help researchers and experts study the phenomenon.
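To make the detection idea above concrete, here is a minimal, hypothetical sketch of a frame-level deepfake classifier in Python using PyTorch. It is illustrative only: the model, names and shapes are invented for this example, and real detectors rely on much larger models, face cropping, temporal analysis and large labelled training sets.

```python
# Hypothetical sketch: a tiny frame-level deepfake classifier in PyTorch.
# Real systems are far more sophisticated; this only illustrates the
# general approach of training a deep model to label frames real or fake.
import torch
import torch.nn as nn

class FrameClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)   # two classes: real vs fake

    def forward(self, x):                    # x: (batch, 3, H, W) video frames
        h = self.features(x).flatten(1)
        return self.classifier(h)            # logits over {real, fake}

model = FrameClassifier()
frames = torch.randn(4, 3, 224, 224)         # a batch of four random example frames
probs = torch.softmax(model(frames), dim=1)  # per-frame probability of each class
print(probs[:, 1])                           # probability that each frame is a deepfake
```

In practice such a classifier would be trained on labelled datasets of genuine and manipulated videos, and its per-frame outputs aggregated across a whole video before any conclusion is drawn.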


1)
cl 37
2)
cl 20
3)
cl 47
4)
cl 56
5)
cl 62
6)
cl 25
7)
Enhancing Online Safety (Non-Consensual Sharing of Intimate Images) Act 2018 (Cth) sch 2 s 4; Criminal Code Act 1995 (Cth) s 474.17A.
8)
Enhancing Online Safety (Non-Consensual Sharing of Intimate Images) Act 2018 (Cth) sch 1 s 24; Enhancing Online Safety Act 2015 (Cth) ss 19A, 27, 44D–44F.
9)
Criminal Code (Non-Consensual Sharing of Intimate Images) Amendment Bill 2018 (Qld) s 4; Criminal Code Act 1899 (Qld) s 207A.
10)
Criminal Code (Non-Consensual Sharing of Intimate Images) Amendment Bill 2018 (Qld) s 5; Criminal Code Act 1899 (Qld) s 223.
11)
Criminal Code (Non-Consensual Sharing of Intimate Images) Amendment Bill 2018 (Qld) s 6; Criminal Code Act 1899 (Qld) s 227A.
12)
Criminal Code (Non-Consensual Sharing of Intimate Images) Amendment Bill 2018 (Qld) s 7; Criminal Code Act 1899 (Qld) s 227B.