AI-Generated Content and Copyright: Who Owns What Your Startup Creates with AI?

If your startup is using ChatGPT to draft marketing copy, Midjourney to create launch graphics, GitHub Copilot to write code, or any of the dozens of generative AI tools that have become standard parts of the modern startup toolkit, you have probably never stopped to ask the obvious legal question: who actually owns this output?

The instinctive answer — “we paid for the tool, so it’s ours” — is half right. The provider’s terms of service usually do assign whatever rights they have in the output to you. But under Australian copyright law, the more important question is whether there are any rights to assign in the first place. And the answer, in many cases, is no.

This article explains the current state of Australian copyright law as it applies to AI-generated content, why purely AI-generated work probably has no copyright at all, and what practical steps your startup should take to protect the things that matter.

The Human Authorship Requirement

Australian copyright law is governed by the Copyright Act 1968 (Cth). Copyright protection arises automatically when an original work is created — there is no registration system in Australia — but the work must satisfy several requirements. Two of those requirements are critical for AI-generated content.

First, the work must have an author who is a “qualified person” (broadly, an Australian citizen or resident, or in certain cases a citizen or resident of a country with which Australia has reciprocal arrangements). Second, the work must be original, which under Australian law means that it must be the product of “independent intellectual effort” by a human author.

These requirements have been tested in the Australian courts in cases involving computer-generated outputs, well before the rise of generative AI. The two most important decisions are Telstra Corporation Ltd v Phone Directories Company Pty Ltd (2010) 264 ALR 617 and Acohs Pty Ltd v Ucorp Pty Ltd (2012) 201 FCR 173.

In Telstra, the Full Federal Court held that copyright did not subsist in the White Pages and Yellow Pages directories because the directories were generated by automated systems, not by identifiable human authors exercising independent intellectual effort. The data collection and entry were done by Telstra employees, but the actual compilation — the structure of the directories themselves — was the output of computer programs. That was fatal to the copyright claim.

In Acohs, the Federal Court reached a similar conclusion in respect of material safety data sheets generated by a software system. The court rejected an argument that the consultant who developed the software and the employee who used it could be considered joint authors of the output, because their respective contributions were too separated in time, expertise, and content to constitute the “collaboration” required by the Act.

The principle that emerges from these cases is straightforward: if a work is produced by a computer program with no identifiable human author who has made an independent intellectual contribution to its expression, no copyright subsists. The Act fixes on the author, and where there is no human author, there is no copyright.

Applying This to Generative AI

Generative AI is a much more powerful version of the computer programs at issue in Telstra and Acohs, but the legal analysis is the same. When you type a prompt into ChatGPT or Midjourney and accept the output that appears, the question is whether you have made an independent intellectual contribution to the expression of the work — not whether you had an idea or directed the tool to produce something.

The position taken by IP Australia, by most Australian legal commentators, and by academic analysis is that purely AI-generated content — output produced from a prompt with no further human creative intervention — almost certainly does not attract copyright protection in Australia.

A short prompt is unlikely to constitute the kind of independent intellectual effort the cases require. The user’s contribution is the idea; the AI produces the expression. Copyright protects expression, not ideas. And the expression has been generated by a system, not authored by a human.

This does not mean that everything you produce with the help of AI is necessarily uncopyrightable. The more interesting question is where the line falls.

  • AI as tool, human as author. If you use AI to generate a draft and then heavily edit, restructure, and rewrite it, the final product reflects your independent intellectual effort. That is more likely to be a copyrightable work — and you are its author. The same logic applies to a developer using GitHub Copilot to autocomplete lines of code within a larger codebase that the developer has architected and written.
  • AI as collaborator, output as raw material. If you use AI to generate dozens of image options and then select, combine, and refine them, the curatorial and combinatorial work may be enough to give you copyright in the resulting compilation — but probably not in the underlying images themselves.
  • AI as author, human as prompter. If the workflow is “type a prompt, copy the output, ship it,” you almost certainly have no copyright. That includes most marketing copy, social media posts, blog articles, and stock-style imagery generated this way.

What the Provider Terms Actually Say

Most generative AI providers — OpenAI, Anthropic, Google, Microsoft, and others — include language in their terms of service that purports to “assign” or “transfer” all rights in the output to the user. OpenAI’s terms, for example, state that as between OpenAI and the user, the user owns the output and OpenAI assigns to the user any rights it may have in it.

This sounds reassuring, but it does not solve the underlying problem. You cannot assign a right that does not exist. If no copyright subsists in the output under Australian law because it has no human author, the assignment clause has nothing to operate on. You have a contractual confirmation that the provider is not claiming the work, but you do not have an enforceable copyright that you can assert against a third party who copies it.

The practical implication is significant. If your competitor scrapes your AI-generated marketing copy and republishes it word-for-word, you may have no copyright claim against them. You may have other claims — passing off, misleading and deceptive conduct under the Australian Consumer Law, or trade mark infringement if the copying involves your branding — but the straightforward copyright claim that would normally be available for original written work may simply not exist.

The Infringement Side of the Equation

Ownership is only half the picture. Startups using generative AI also need to think about whether the output itself might infringe someone else’s copyright.

The Australian government confirmed in October 2025 that it will not introduce a text and data mining (TDM) exception to the Copyright Act. That decision means there is no statutory carve-out for using copyrighted material to train AI models in Australia, leaving open the question of whether the training data underlying widely used commercial AI models was lawfully sourced — and whether outputs that closely reproduce that training data could amount to infringement when reproduced by users.

The risk is real but manageable. The most common scenarios are:

  • Verbatim or near-verbatim reproduction. Generative models occasionally regurgitate substantial portions of training data, particularly for prompts that target well-known content. If you publish such an output and it turns out to be a copy of someone else’s work, you may be liable for infringement regardless of whether you knew the source.
  • Style mimicry. Asking an AI to generate something “in the style of” a named artist or author is generally not an infringement of that person’s copyright (style is not protected), but it can create reputational and ethical issues, and may attract claims under consumer protection or moral rights provisions.
  • Code generation. AI coding assistants can output snippets that closely match open-source code from their training data — often without surfacing the licence under which that code was made available. That can introduce hidden GPL or AGPL exposure into a proprietary codebase.

Practical Steps for Startups

The legal position will continue to evolve, and the federal government has indicated that further reforms are likely. In the meantime, here is what a sensible startup should do.

1. Assume your AI-generated content is not copyrighted unless a human meaningfully shaped the expression. If you need a piece of content to be defensible as a copyright work — a flagship marketing campaign, a published whitepaper, a software product — make sure a human author is doing real creative work on top of the AI output, and document that contribution.

2. Update your IP assignment clauses. Your employee and contractor agreements should already contain robust IP assignment provisions. Make sure those provisions are broad enough to capture not just “works” but also any contributions to AI prompts, AI workflows, and AI-assisted output. The objective is to ensure that whatever rights do exist end up with the company.

3. Set internal policies on AI use. A short, clear policy about which AI tools are approved, what data can and cannot be put into them, and how AI-assisted work should be reviewed before publication is much more valuable than a long policy that nobody reads. Pay particular attention to confidential information and personal information — many providers’ consumer tiers may retain prompts for model training, which can create privacy and trade secret issues.

4. Use enterprise tiers where it matters. Most major providers now offer business or enterprise tiers that include data protection commitments (no training on your inputs), output indemnities, and clearer contractual rights. For any production use of AI in a startup that handles customer data or generates commercial content, the enterprise tier is usually worth the premium.

5. Audit your codebase for AI-assisted code. If your engineers are using Copilot or similar tools, make sure you have mechanisms for detecting and reviewing suggested code that might carry licence obligations. This matters most if you are heading into a fundraising or M&A transaction, where IP due diligence will look closely at the provenance of your code.
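As a starting point for that kind of review, a simple script can flag source files containing copyleft licence markers that have no business appearing in a proprietary codebase. This is a minimal sketch only — the marker phrases and file extensions below are illustrative assumptions, and a crude string match is no substitute for a dedicated licence scanner such as scancode-toolkit — but it shows the shape of the check:

```python
import os
import re

# Phrases that commonly indicate copyleft-licensed code was pasted in.
# Illustrative only; a real audit should use a dedicated licence scanner.
COPYLEFT_MARKERS = [
    r"GNU General Public License",
    r"GNU Affero General Public License",
    r"SPDX-License-Identifier:\s*(GPL|AGPL|LGPL)",
]
PATTERN = re.compile("|".join(COPYLEFT_MARKERS), re.IGNORECASE)

def scan_source_tree(root, extensions=(".py", ".js", ".ts", ".go", ".java")):
    """Return (path, line_number, line) tuples for lines matching a marker."""
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(extensions):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for lineno, line in enumerate(f, start=1):
                        if PATTERN.search(line):
                            hits.append((path, lineno, line.strip()))
            except OSError:
                continue  # unreadable file: skip rather than abort the scan
    return hits

if __name__ == "__main__":
    for path, lineno, line in scan_source_tree("."):
        print(f"{path}:{lineno}: {line}")
```

Running a check like this in CI will not tell you whether a flagged snippet actually carries licence obligations, but it gives your engineers a prompt to investigate before the code ships — or before a due diligence team finds it first.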

6. Don’t rely on the provider’s IP assignment as a substitute for original work. The terms are useful but they are not magic. They give you a contractual commitment from the provider; they do not create copyright where none exists.

The Bottom Line

Generative AI is an extraordinarily useful tool for startups, but it sits awkwardly with a copyright system designed for human authorship. The result is that much of the content your startup produces with AI may not be protected by copyright at all — and the contractual assignments in the providers’ terms of service do not change that fundamental position.

That is not necessarily a disaster. Most startup content earns its value through timeliness and distribution over a few weeks or months, not through any defensive copyright claim. But if you are creating something you actually need to protect — a product, a brand asset, a piece of software — make sure there is real human authorship involved, document it, and make sure your IP assignment paperwork captures it.

If you need help reviewing your IP assignment clauses, drafting an AI use policy, or thinking through the IP implications of how your startup uses generative AI, get in touch.
