The battle to change Australia's classification laws is on - now, it's time to see what everyone has to say.
In total, 82 submissions have been made public, featuring replies from public bodies like ACMA, the Australian Council on Children and the Media, SBS and the Classification Board itself; corporations like Amazon and Google; various video game developers and their representative lobby group IGEA; and telcos like Telstra.
Below you'll find a collation of what all the main players had to say. I've kept most of the replies abridged, because in some cases - like the Classification Board - the submissions run for more than 70 pages. The most common thread is a call for a new classification, PG13, to be introduced between the PG and M ratings, along with a broader push to have classification duties reassigned - either to existing organisations like ACMA, to industry itself, or to a single new independent oversight body.
The Classification Board
The Board is supportive of the harmonisation of the classification of media content regardless of media type or delivery. This includes support for:
• A single set of statutory guidelines for the classification of all media content. This would necessitate the same classification categories and classifiable elements being applied to all content and remove the current differences and anomalies that exist between the Guidelines for the Classification of Films, the Guidelines for the Classification of Computer Games and the Guidelines for the Classification of Publications.
• The clarification of the G, PG, M and MA 15+ classification categories.
• The addition of a new classification category for content that sits between the PG and M categories.
• The clarification and simplification of consumer advice.
The Board recognises the need for further industry self-classification and is broadly supportive of a move towards establishing a multi-faceted classification process. Oversight by a single, independent Government regulator is required in order to protect the Australian public from content that may harm or disturb them. The Board's current functions and future role would form part of this entity.
This would include:
• Setting the standards of Australian classification and driving consistency in decision-making by industry and digital tools, thereby maintaining public confidence across a harmonised classification system.
• Undertaking the classification of some content at first instance, for commercial and law enforcement applicants.
• Undertaking auditing and benchmarking of classification decisions made by industry classifiers and digital tools, with the power to vary or revoke those decisions and reclassify the content.
• Directing the development and refinement of digital tools in order to generate classification decisions that are applicable and consistent with Australian classification standards and the expectations of Australian consumers.
• Providing training and accreditation of industry classifiers and assessors and auditing their performance and directing remedial action.
• Undertaking the current role and functions of the Classification Review Board, as well as operating as the review body for all other decisions made under any new scheme.
Any increased use of industry self-classification will require greater emphasis to be placed on governance and risk mitigation. An independent Board would undertake auditing of classification decisions before they gain force of law. This necessitates future workflow processes providing sufficient lead time for the Board to audit and amend classifications as necessary, before the public is exposed to incorrectly classified content.
The Board supports the expectation of Australian consumers that there will be restrictions on the publishing of some content, with access to some content restricted by age. The Board supports the retention and enforcement of action for offences in relation to selling, screening, distributing or advertising certain categories of material whether classified or unclassified. Accordingly, any new Commonwealth classification legislation should provide penalties for failure to classify content or for failure to adequately apply the classification laws and Guidelines.
Australian Communications and Media Authority
The ACR paper notes that, consistent with the ACCC’s Digital Platforms Inquiry (ACCC DPI) final report, there is an opportunity for a new classification framework to enable industry to self-classify content across all platforms, overseen by an Australian Government regulator. We support this approach and propose that, within a single, federated classification framework, there is scope for more than one body to administer the scheme.
Under a new federated model, a consistent and contemporary classification code would be applied to content delivered over any platform. Industry would have greater ability to self-classify commercially provided content with government regulatory oversight of industry schemes and compliance.
To achieve this, the ACMA supports an approach that would see:
• industry manage self-classification of its own content across platforms (similar to the current broadcasting arrangements), including the ability to automate classification processes;
• the eSafety Commissioner continue to assess and take specialised enforcement action in relation to illegal and harmful online content; and
• the ACMA administer general online and offline content classification review functions, except for material considered illegal and harmful. This role would include oversight of associated industry self-classification arrangements and electronic classification tools, as appropriate.
This forms part of the work announced in the Government Response and Implementation Roadmap for the Digital Platforms Inquiry (12 December 2019), under which the Government will release an options paper, co-authored by Screen Australia and the Australian Communications and Media Authority, on how best to support Australian stories on our screens in a modern, multi-platform environment.
A federated model could also accommodate retaining the current Classification Board which has a limited role in the classification of films, games and publications. However, it would be more efficient for the classification of commercially provided content, regardless of platform, to have oversight by a single regulator, the ACMA.
This would reduce the costs to industry and government of the current arrangements while retaining the features of the Board such as statutory independence. The Classification Review Board could also be dissolved with review rights being available through the ACMA, as is currently the case for many of its other regulatory functions.
Further consideration should be given to the treatment of 'seriously harmful content' and RC, X18+ and MA15+ classified material. The OSLR paper contemplates a new 'Class 1' category of 'seriously harmful content'.
The ACMA suggests that rather than creating a new category of content, it may be preferable to integrate the regulation of that type of content into the harmonised classification scheme. The eSafety Commissioner could retain responsibility for determining whether this content meets the definition of ‘Class 1’ content and take appropriate action without referring the content to the Classification Board. This would appear to address the key concern that such material needs to be dealt with more quickly than current arrangements allow. The OSLR paper also contemplates that ‘Class 2’ content, including material classified as RC, X18+ and MA15+, would also be overseen by the eSafety Commissioner. In practice, this could lead to areas of duplication, with different agencies overseeing compliance for the same content on different platforms.
Our final comments relate to the proposal to re-define the terms "film" and "computer games" in the Classification (Publications, Films and Computer Games) Act 1995 (Classification Act), which technically cover a broad range of online content.
We agree that the definitions in the Classification Act should be made clearer in terms of the content that must be classified, and recommend that any revised definitions of "film" and "computer games" should retain the current advertisement exclusions (sections 5 and 5A(3) of the Classification Act), such that an advertisement for a publication, film or computer game is not required to be classified.
Australian Association of National Advertisers
In conclusion, the current advertising self-regulation system provides an effective, transparent and robust mechanism for consumers to raise concerns about the content of film and game advertisements.
Transferring the complaint management process to Ad Standards would effectively establish a ‘one-stop-shop’ for the community and industry, and harmonise the regulatory framework in respect of film and game advertising.
This would greatly improve clarity and consistency for consumers and would provide a single point of contact for those responsible for advertising those products and services across all media (broadcast and online). Additionally, it provides a robust, independent and fair system for assessing whether or not an advertisement meets the broader community’s standards.
We would be pleased to discuss our submission further with the Department, and how our proposal for the recognition of the AANA Codes and the referral of complaints to Ad Standards could be implemented into the classification framework.
While we do not have any substantial comments on most aspects of the Discussion Paper of the Review, we would like to record our support for an amendment or clarification of the definition of ‘film’ in the context of classification of online material. The current definition of ‘film’ covers content and videos on websites/services such as YouTube, i.e. the definition includes so-called User Generated Content (UGC).
Consequently, as all films and computer games must be classified before they are made available in Australia (unless they are exempt), such content technically may require classification under the current arrangements. Given the sheer volume of UGC, a requirement to classify such content is impractical. UGC should, therefore, not be included in the definition of ‘film’.
Collective Shout
Collective Shout is an organisation combating the objectification of women and the sexualisation of girls in media, advertising and popular culture. It recently supported Senator Stirling Griff's call for an immediate review of all manga and anime classified in Australia, to screen out the glorification of child sexual abuse; the Senator specifically cited Eromanga Sensei for depicting "wide-eyed children" in "explicit sexual activities".
We make the following recommendations:
● Replace the current system with an evidence-based and age-based classification system.
● Content that should be classified includes all professionally produced content for exhibition or distribution via all delivery formats (television, cinema, DVD, streaming, computer games).
● Broaden the provisions relating to ‘sex’ to reflect new research insights into sexual objectification.
● Any regulatory body (the existing or a new body) should be required to consult the international research, along with child and youth development experts, to ascertain the possible impact of sexualised content or messaging on this audience.
● ‘Adult magazines’ should continue to be classified, as well as being restricted to adults.
● If self-classification is to be introduced, it must be strongly regulated by the government using approved classification tools, overseen by a single regulator with powers of enforcement.
● Pornography should no longer be treated by default as ‘adult content’, but as commercialised sexual exploitation.
● Reliance on parents to control what their children access is unrealistic.
● Ensure that computer games continue to be classified taking into account evidence of harms of sexual objectification, and ensure compliance.
● An urgent investigation needs to be conducted into the Classification Board assigning M or MA15+ ratings to anime and manga genres featuring Child Sexual Abuse Material contrary to Australian law.
Australian Home Entertainment Distributors Association
As the ACCC pointed out in its report, new media and platform entrants enjoy lower regulatory costs and burdens than traditional film and TV distribution businesses, which are in decline and facing enormous cost pressures.
The costs from fees to classify content are as follows (page 15 of the Discussion Paper):
• Films and episodic series on DVD and Blu-ray, in cinemas, and on online streaming services (apart from Netflix) must be classified by the Board for a fee.
Example: 600-minute series on DVD, Blu-ray or a video on demand service: the application cost is $2530 and under statutory timeframes it can take up to 20 working days for a classification decision. An additional fee of $420 can be paid for priority processing for a classification decision to be made within five working days.
125-minute film in cinemas: the application cost is $2760 and under statutory timeframes it can take up to 20 working days for a classification decision.
• Review of a decision: if an applicant does not agree with a classification decision by the Board, a review by the Review Board costs $10,000 unless the fee is waived. One member was recently charged $900 for a 16-minute "making of" featurette for a movie from 1940.
It is increasingly the case that, without a level playing field for physical home entertainment distribution to support the retail sector, disc titles with a niche or limited market size will not be cost effective to release in the Australian market.
We look forward to the Government supporting the recommendations of previous reviews - and, we hope, of this review - and, importantly, to seeing those recommendations implemented. AHEDA strongly supports a move to self-classification by industry under the oversight of an appropriate Government body (for example, the ACMA). As an interim reform, we also seek urgent adoption of the online classification tool for film that has been developed by the Department of Communications with strong support, including testing, from industry. This will provide timely downward pressure on compliance and regulatory costs for AHEDA members.
Australian Council on Children and the Media
To begin with the most general issues, the provisions should address the question of 'accommodating' content at a particular level, ideally specifying that classification is the result of a balancing process between the matters listed in s11 of the Act, and that there is no presumption in favour of a lower category. A second helpful step would be to clarify that the intent of the item's author (designer, writer, director etc) is irrelevant to the classification decision. This would hopefully put an end to any future use of the concept of 'paying homage' in classifiers' reasoning.
Some matters should be included in the provisions so that classifiers have an explicit mandate to consider them, for example whether violent scenes are arousing; whether violence is (un)punished or rewarded or appears heroic; the amount of harm the violence appears (likely) to cause to the onscreen victim; and the use of weapons. Familiarity and genre should be expressly excluded from the provisions.
As we have seen, the former concept seems to reinforce the moralistic antecedents to the system, by implicitly casting the system as a means of protecting innocence and avoiding corruption; and it also seems to contradict the notion that repetition of exposure to violent content enhances impact and does not lessen it. The second concept, on examination, is irrelevant to any matter currently referred to in the provisions, as well as to the risk of negative impacts (as identified in the research evidence). Indeed, certain genres might enhance the risk because they tend to glamorise violence.
A number of matters need to be clarified in the provisions. For example the concept of ‘justified by context’ should be clarified to mean necessary to the plot or character development, not morally justified, and certainly not justified by the genre of the film or game (see above). ‘Realistic’ should be defined to mean ‘like real life’; rather than the opposite of ‘stylised’, it should mean the opposite of ‘implausible and unlikely’. The question might be whether one can imagine such violence happening in such a situation in real life, and not whether it is presented in a naturalistic way. Also it should be made clear that the concepts of ‘level of detail’ and ‘accentuation techniques’ are relevant where they contribute to realism. It should not be possible to read them in such a way as to give a lower classification to items that fail to show the consequences of violence.
Although this study has gathered some research evidence on the types of media violence that enhance the risk of negative impacts, the preponderance of evidence is about the types of people at greater risk of negative impacts from exposure to media violence. Even some of the research referred to here is based on the way that individuals interpreted content, so conclusions about the propensity of the content itself are a function of how viewers or players are likely to respond to it (e.g., see Anderson et al., 2003). From the point of view of supporting the development of public policy on media violence, research differentiating between types of user is of limited use. It is true that classification law has traditionally distinguished between people of different ages, saying those above a certain age can access certain content and those below cannot.
Such distinctions may have a foundation in research (depending on the age thresholds used and the way they are justified). However, this is probably the limit of the political acceptability of public policy that distinguishes between groups of people. Other distinctions drawn in the research are unlikely ever to be used for the development of public policy. For example, research shows that aggressive temperament and exposure to violence at home heighten the effects of media violence (Valkenburg & Piotrowski, 2017). While such insights might enable psychologists to help parents and carers understand the need to limit exposure to violent content generally for certain children, they are less helpful in informing a system to restrict exposure at a population level. The people who enforce such restrictions – cinema operators and the like – cannot be expected to check for such factors when deciding whether to grant access to certain content. Further research about content factors is needed to advance the development of public policy on access to violent media content.
Australian Children's Television Foundation
In some international jurisdictions, the regulator has introduced a category that indicates to consumers that the program is classified as an educational program or is ‘exempt’ such as the category E in Canada. This E classification indicates that, even though a program does not contain content that may be disturbing for young children, the program itself is still not necessarily intended as a children’s program. An E category paired with an age recommendation may be beneficial for consumers to better assess the suitability of a program.
The ACTF recommends either adding additional age brackets to the PG classification (e.g. 0+, 8+ and 12+) and implementing a rating system (e.g. G, G8+ and G12+) that more clearly indicates the suitability of content for these age groups, or abolishing the PG category altogether - as in the German system - and replacing it with a rating system that indicates the appropriate age group, as described above.
We agree with the proposed future definition of classifiable content, which essentially covers professionally produced content for exhibition or distribution via theatrical, broadcast, streaming, download-to-own, DVD or other commercial means. It would be impractical to include User Generated Content published on YouTube and other social media platforms, which is (or should be) subject to the take down provisions and regulatory regimes that apply to those platforms.
The ACTF recommends that:
• A single new classification framework should be overseen by the ACMA (the regulator);
• The regulator should be responsible for the provision of robust and consistent classification tools and training for industry;
• All platforms (broadcasters, distributors, streamers) should be able to self-classify where they have trained classifiers to do so;
• If an entity releasing a title does not have trained classifiers they would need to have that content classified by an ACMA approved classifier using the same classification tools and overseen by the regulator;
• Material should not need to be classified twice. If a trained classifier at a broadcaster classifies content, that classification should be sufficient on all other forms of release of that content;
• The reasons for classification decisions should be transparent and readily available and there should be a process whereby a classification can be disputed.
Google
The majority of Google's submission clarified what Google does on YouTube and the Play Store, although there were some broader comments regarding user generated content and the current classification framework.
The Australian Law Reform Commission (ALRC) review into Classification in 2012 acknowledged limited community expectation that user generated content should be classified and proposed that any new Act should only require classification if it is both: made and distributed on a commercial basis, and likely to have a significant Australian audience. We think this observation holds true today and urge the review team to adopt the ALRC’s recommendation.
Given how freely content is moving across State and Territory borders, both physically and digitally, we question the practicality of a cooperative scheme and suggest that one single framework administered at a Federal level is more appropriate for the modern content environment.
Amazon
Accurate classifications - and ensuring our customers have the best possible information about the content they are considering watching - are central to our corporate philosophy. Amazon is committed to creating safe and reliable online viewing spaces for families and subscribers. We support the need for a harmonised classification system.
The content available to Australian consumers continues to grow through the expansion of streaming services as well as diversified content offerings from broadcasters and telecommunications providers. While no single body can keep up with the classification demands created by this volume of content, timely classification is essential to ensure that Australians are able to watch globally popular original content at the same time as consumers around the globe.
Yet, the current regulatory framework has led to a high degree of uncertainty regarding the manner in which certain aspects of the system may or may not apply to streaming video and other ‘over the top’ services.
Self-classification is consistent with ensuring Australians get the information necessary to make informed decisions about the content that is appropriate for their household. The Department has already seen the potential for success of efficient self-classification approaches. We believe that there are significant benefits to consumers that flow from enabling content providers to self-rate content in accordance with local requirements for classification ratings and consumer advice. Industry can and should partner with the Department on how to ensure consistency in application, including coregulatory open lines of communication on ratings’ revisions and education on locally relevant considerations.
Interactive Games & Entertainment Association
The full submission is fairly detailed, and while I've included the finer points below, it's worth giving the paper a longer read.
We generally support the existing classification categories for video games with the exception of MA15+, although we also recognise that there are problems with PG and M. We recommend that MA15+ be merged with M into a non-restricted category, which we discuss in detail elsewhere in our submission. We are aware that some stakeholders support a new category between PG and M, such as PG-12 or PG-13. While this is not a priority for our industry, we are happy to consider this further if needed.
We do not see the need for changes in any of the provisions in the Code or the Guidelines relating to the treatment of the classifiable element of themes. From our perspective, themes are currently being applied effectively and consistently with Australian community standards. Themes are one of the more challenging classifiable elements to define, but the fact that the Guidelines are not overly prescriptive on themes, unlike how some of the other elements are addressed in the Guidelines, is positive. 'Themes' has a very broad scope and we believe that the flexibility of this category is its strength. Unlike other parts of the Guidelines, the treatment of themes is approached maturely and does not automatically assume that games are dangerous. On that, we also note the Department's research finding that parents consider the portrayal of strong themes in media to have benefits for young people.
Violence is treated more harshly in video games than in films, even when interactivity has no impact. Violence is also treated more harshly in Australia than in most comparable jurisdictions around the world. We do not believe that the Guidelines’ treatment of violence reflects Australian community standards. We recommend that the Guidelines be amended so that, interactivity aside, similar or equivalent violent content in films and video games are treated equally.
We recommend that the Guidelines at the R18+ level be amended so that the same level of sexual activity that is permitted in films is also permitted in computer games. Any activity that is legal in the real world should be able to be legally depicted. We also recommend removing the specific rules around games with sex linked to incentives and rewards. This kind of content is more flexibly addressed through the overall consideration of interactivity in video games.
We recommend that the Guidelines at the M and MA15+ level be amended so that the same level of language that is permitted in films is also permitted in computer games.
We recommend that the Guidelines at the PG level be amended so that the same level of drug use that is permitted in films is also permitted in video games. The Guidelines should also clarify what is meant by ‘drugs’ and that the definition should exclude fictional drugs and medicines. We also call for the softening of the rule regarding interactive drug use, especially at the R18+ level, and the outright removal of the rule that causes drug use linked to incentives and rewards to be RC. Both of these aspects of drug use in video games are already being addressed through consideration of context and interactivity.
We recommend that the Guidelines at the G and PG level be amended so that the same standard of nudity is applied to both video games and films. We also recommend removing the specific rules around games with nudity linked to incentives and rewards. Video games will already be assessed more critically due to the Board’s requirement to consider the impact of interactivity and the context of the nudity.
We also recommend a move away from the Board's current 'free text' approach to Consumer Advice (CA) towards a more consistent and standardised version of CA that will be easier for classification tools and trained industry classifiers to apply.
We support the ALRC Inquiry’s recommendation that video games likely to be classified G, PG or M could instead be subject to voluntary classification through an industry code. IGEA would be well placed to implement and administer this code.
We recommend that any new Scheme be an entirely advisory system without legal access restrictions on any categories. In particular, we support removing legal access restrictions on MA15+, which the ALRC supported, and ask whether it is now also time to remove the problematic MA15+ category altogether, merging it into a non-restricted M category.
We believe it is also worth considering whether the access restrictions for R18+ content are still effective or helpful, particularly in the digital environment. While it may seem like a radical step, we note that legal age restrictions on accessing or buying films and computer games, both offline and online, are already highly uncommon around the world. Discussions and policy consideration around online age verification, both globally and in Australia, have so far been limited to the context of access to adult online content, and the few attempts at implementation have generally not gone well.
Netflix
Netflix is supportive of a flexible, harmonised approach to classification in Australia, which continues to support the capacity of organisations to self-rate content. Other providers may wish to use a different system, such as trained 'in-house' classifiers who watch content and provide an appropriate Australian classification and relevant consumer advice. It is important that the classification system is flexible enough to enable content providers to develop an approach to self-classification that works best for their business model, provided that such a system produces classifications and consumer advice that are reflective of Australian consumer standards and useful to Australian viewers.
Any such system should be supported by a complaints and oversight mechanism to ensure that Australians continue to receive accurate and localised information to inform the viewing choices of themselves and their families.
Netflix agrees that content providers should be required to provide Australians with a minimum standard of consumer advice (for example, to identify and advise consumers about the existence of a classifiable element present in a film or tv show). However, we believe the classification system should provide flexibility in the way consumer advice is provided, and not impose disincentives to provide the most accurate information to Australian consumers.
Telstra
Telstra agrees that a harmonised, platform-neutral regulatory framework governing content production and delivery in Australia should be developed. It has long been recognised that Australia's legislative, regulatory and policy frameworks in these areas are no longer fit for purpose, focused as they are on platform-specific regulation, with limited recognition of the shift to online services and the convergence of formerly disparate methods of content production and delivery. These outdated frameworks create market distortions through regulatory disparity; further, by seeking to regulate dynamic markets through static, prescriptive regimes, they are unlikely to be effective in achieving their original objectives.
With these considerations in mind, the consultation paper suggests that a new classification framework could be established to enable industry to self-classify content across all platforms, overseen by an Australian Government regulator. Industry self-classification could be done using trained assessors or approved classification tools, and monitored by the regulator to ensure the accurate and consistent application of classification guidelines.
As a final comment, we note our view that, in developing a new classification framework, it will be important to clearly and appropriately delineate responsibility for ensuring classifiable content is appropriately classified before it is made available in Australia. This is particularly so, given the increasing number of channels through which content is made available to consumers. In our view, obligations to classify content should rest primarily with the producers of that content, rather than third party aggregators or distributors. Content producers are likely to be most familiar with the aspects of their content bearing on classification. Further, this would help ensure consistency of classification in circumstances where the same piece of content is available on different services and across different platforms.
SBS
Are the classification categories for films and computer games still appropriate and useful? If not, how should they change? Yes; there is no compelling case for change. The current classifications are well understood by the community and SBS audiences, and any change would carry costs.
There is no need to change the current classification categories. The current categories are well understood by the Australian public. Any change to classification categories risks confusing audiences and would impose significant and unnecessary compliance costs and implementation issues.
SBS supports the Australian Communications and Media Authority’s (ACMA) existing role in handling escalated complaints relating to broadcast content. Similarly, there should be a one-stop shop for escalated complaints in any revised classification review system, whether for content delivered via broadcast or online. Duplication of regulatory oversight should be avoided, as should any framework that bypasses the content provider in dealing with complaints in the first instance. It is logical and appropriate for audiences, and efficient for content providers, that complaints go first to the provider before being escalated to a regulator.
Spherex
Spherex is an American organisation that provides age ratings for clients in nearly 200 countries, and has rated more than 1,600 pieces of content for the Classification Board since 2016.
In our opinion, ‘themes’ as a classifiable element is not well understood by audiences. Theme is commonly misinterpreted as an indicator of genre or content. In Australia, when consumer advice cites ‘themes’ as an impacting element, it indicates ‘mature’ or ‘mild’ themes. For ‘themes’ to be an effective indicator, we recommend that the consumer advice explicitly call out the impactful themes (e.g. suicide, death, occult practices, drinking, bullying), or themes of specific concern to Australian audiences.
We further recommend that ‘horror’ be added to the list of the six classifiable elements. Horror often involves characteristics distinct from violence, such as disturbing or scary scenes, and has the capacity to create fear in younger audiences. Currently, ‘horror’ appears to be accounted for partially under ‘themes’ and partially under ‘violence’. Creating a new classifiable element for ‘horror’ would highlight these distinctions while capturing the nuances.
Spherex advocates for the new classification framework to support self-classification by approved entities distributing content across digital platforms. Self-classification can be achieved either through a) Government-approved classification tools or b) trained industry personnel classifying content via a tool offered by the Classification Board. Oversight can be achieved by a single government regulator.
Disney
Disney supports the Film Industry Association’s submission to the Department, including the introduction of a PG13 category in between PG and M. Disney supports the move to self-classification under a co-regulatory model for all delivery means and platforms, as further described in the body of this submission. Disney is fully supportive of industry self-classification, by trained staff classifiers or other means, and for all delivery methods.
Most of the submissions were fairly serious, submitted by corporate bodies, companies that deal with the classification of content, or organisations with some skin in the game.
However, one submission took a much lighter approach to the review: