Australian Government to introduce new disinformation and misinformation laws

On 21 March 2022, the Australian Government announced its plan to introduce new legislation to combat harmful disinformation and misinformation online, and released a June 2021 report prepared by the Australian Communications and Media Authority (ACMA) on existing disinformation and misinformation regulation.

Reform Announcement

In the second half of 2022, the Australian Government is expected to introduce legislation into the Federal Parliament to expand ACMA’s powers so as to hold big tech companies accountable for harmful content on their platforms.

This announcement follows:

  • the introduction, in February 2021, of the voluntary, industry-led Australian Code of Practice on Disinformation and Misinformation (Code), which has since been adopted by eight digital platforms: Google, Facebook, Microsoft, Twitter, TikTok, Redbubble, Apple and Adobe;

  • the presentation to the Government, in June 2021, of ACMA’s report on the adequacy of digital platforms’ disinformation and news quality measures (including under the Code), which was released to the public on 21 March 2022; and

  • the consideration by the Government of the additional measures put in place by industry to combat harmful misinformation and disinformation in relation to COVID-19 and the recent Russian invasion of Ukraine.

Although the Government acknowledged the “positive steps taken by industry”, it has stated that “more protections must be provided to Australians online.” Consultation on these protections is expected in the coming weeks; however, at this stage they are likely to comprise:

  • empowering ACMA with new information-gathering powers (including powers to make record keeping rules) to incentivise greater platform transparency and improve access to Australia-specific data on the effectiveness of measures to address disinformation and misinformation;

  • empowering ACMA with reserve powers to register and enforce industry codes or make industry standards; and 

  • establishing a Misinformation and Disinformation Action Group (including participants from both the public and private sector) designed to collaborate and share information on emerging issues and best practice responses to disinformation and misinformation.

Key concerns from the ACMA Report

Some of the key concerns regarding misinformation and disinformation generally, which ACMA highlighted in its report, include:

  • the fact that 82% of Australian adults have experienced misinformation about COVID-19 in the last 18 months, with 22% of these individuals experiencing ‘a lot’ or ‘a great deal’ of misinformation online; and

  • the finding that Australians are most likely to see misinformation on larger digital platforms (e.g. Facebook and Twitter), but that smaller private messaging apps and alternative social media services (e.g. Telegram, Gab, Parler, Rumble and MeWe) are also increasingly used to spread misinformation or conspiracies due to their less restrictive content moderation policies.

The ACMA also expressed concerns about the Code, including:

  • that its effectiveness is limited by an excessively narrow definition or interpretation of harm. Currently, signatories to the Code are only required to act against content if it is reasonably likely to result in ‘serious’ and ‘imminent’ harm. Such a requirement could result in a narrow interpretation that “would likely exclude a range of chronic harms that can result from the cumulative effect of misinformation over time, such as reductions in community cohesion and a lessening of trust in public institutions.” ACMA therefore recommended that “imminent” be removed from the definition of harm in the Code;

  • that it could be strengthened through an opt-out rather than an opt-in model, as signatories “should only be permitted to opt out of outcomes where that outcome is not relevant to their service and be required to provide justification for the decision”;

  • that private messaging services fall outside the scope of the Code, with ACMA recommending that they be included (subject to appropriate caveats to protect user privacy) in order to provide important consumer protections; and

  • that the Code does not oblige individual signatories to have robust internal complaints processes.

Other Potential Reform

Separately, the Senate Select Committee on Foreign Interference through Social Media is inquiring into the use of social media for purposes that undermine Australian democracy and values, including the spread of misinformation. In its Interim Report released in December 2021, the Committee raised concerns about the spread of COVID-19 misinformation and disinformation. 

The Committee’s final report is due to be published by May 2022 and will likely also make significant recommendations for the regulation of online disinformation and misinformation.
