QUAY INSIGHTS

AUGUST 2023

Are Australia’s intellectual property laws up to the task in a world of generative AI?

In a recent speech, the Minister for Communications, the Hon Michelle Rowland MP, spoke of how governments, industry and civil society continue to grapple with the question of how to effectively regulate digital platforms. The Minister recognised the challenge of ensuring that regulation achieves its stated aims without stifling diversity, choice and innovation. These considerations are equally relevant to the regulation of artificial intelligence (AI), including the way Australia addresses the intellectual property issues raised by rapid innovation in generative AI.

In her speech to the International Institute of Communications Telecommunications and Media Forum (IIC Forum) on 15 August 2023,¹ the Minister for Communications drew attention to the Australian Government’s current consultation in relation to the Safe and Responsible Use of AI. That consultation, which focusses on the proposed implementation of an appropriate governance framework, is complementary to other work the Government is undertaking in relation to AI regulation.

One of the complementary initiatives mentioned by the Minister for Communications is being undertaken by the Attorney-General, the Hon Mark Dreyfus KC MP. The Attorney-General is currently consulting in relation to the intellectual property issues that arise from AI, particularly generative AI, through a Ministerial Roundtable.

Generative AI produces output, typically content but also recommendations or decisions, in response to user questions or prompts. As set out in the Australian Chief Scientist’s Rapid Response Information Report Generative AI: Language models and multimodal foundation models,² which supports the consultation on the Safe and Responsible Use of AI, generative AI is based on large language models (LLMs) and multimodal foundation models (MFMs). LLMs are trained on vast quantities of text, while MFMs are trained on a wider range of content, including speech and other audio content. Intellectual property rights are therefore a key issue in the context of generative AI.

The first meeting of the Ministerial Roundtable was held in February 2023 and was attended by 30 organisations from a wide range of sectors with an interest in copyright, including publishing, broadcasting, screen, education, research, music, gaming, technology and cultural collections. It was agreed at that meeting that a priority issue for consideration would be the implications of AI for copyright law, particularly in the context of text and data mining, database protection and authorship of AI-created works.³ Intellectual property rights were not discussed at the second meeting of the Ministerial Roundtable, held in June 2023, though the Attorney-General has promised the issue will be considered at a further meeting later in the year.

There are two related issues relevant to copyright in the context of generative AI. The first, as indicated above, arises from the use of copyright content to train generative AI. While it is known that LLMs and MFMs have been trained on content sourced from the internet, there is very little transparency as to what content has been used by any particular model. It may be assumed that the content of Australian media companies, as well as that of many Australian artists, authors, photographers, musicians and other creatives, has been used, but there is simply no way to definitively determine this.

The second issue relates to the use of copyright content in the responses that generative AI, including generative AI chatbots such as ChatGPT, provides to user questions. Not only is there no transparency as to the content used to train LLMs and MFMs, there is also very limited transparency as to the source of content that is directly used to respond to user questions. Evidence from ChatGPT, for example, indicates that it does use journalistic content to respond to users of that service. This second issue raises similar questions to those raised in the context of the use of Australian news media content by digital platforms, which resulted in the implementation of the groundbreaking News Media and Digital Platforms Mandatory Bargaining Code.

At the IIC Forum, Shadow Communications Minister the Hon David Coleman MP pointed out that Europe is already engaging in efforts to protect copyright in the context of AI, including to address the issues outlined above. The Shadow Minister noted that media industry leaders had highlighted to him the lack of compensation for creators of, and investors in, content that is harvested by generative AI. He called for the Government to put in place intellectual property protections not just for the Australian media sector but for other creative sectors as well. While he did not offer a definitive solution, he suggested that the communications regulator, the Australian Communications and Media Authority, may have a role to play, and that, as proposed by former ACCC Chair Rod Sims, the News Media and Digital Platforms Mandatory Bargaining Code could be expanded in scope to address generative AI.

While there may not yet be a clear answer as to how to ensure that content creators, including Australian media companies, as well as Australian artists, authors, photographers, musicians and other creatives, are adequately compensated for the use of their content by generative AI, it seems clear that Australia’s Copyright Act 1968 will be the subject of scrutiny as to whether it is providing adequate protection.

[1] The Hon Michelle Rowland MP, speech to the International Institute of Communications Telecommunications and Media Forum, 15 August 2023, available here.

[2] Australian Chief Scientist, Rapid Response Information Report, Generative AI: Language models and multimodal foundation models, available here.

[3] Minutes of the February 2023 meeting of the Ministerial Roundtable, available here.

Dave Poddar

Partner

Quay Law Partners
Level 32, 180 George Street,
Sydney NSW 2000
T +61 422 800 415
E [email protected]
www.quaylaw.com

Angela Flannery

Partner

Quay Law Partners
Level 32, 180 George Street,
Sydney NSW 2000
T +61 419 489 093
E [email protected]
www.quaylaw.com