The Supreme Court Closed the Door on AI Copyright. The Hard Questions Are Still Wide Open.
In March 2026, the United States Supreme Court declined to hear Thaler v. Perlmutter, leaving in place a ruling from the D.C. Circuit that a work generated entirely by artificial intelligence — without meaningful human creative input — cannot be protected by copyright law. The Copyright Office had rejected the application. The district court agreed. The D.C. Circuit agreed. The Supreme Court declined to say otherwise.
That is, for the purposes of the legal question Stephen Thaler was asking, the end of the road.
Thaler is a computer scientist who builds AI systems he says create autonomously. One of them, DABUS, is the centerpiece of his parallel patent fights; the copyright case involved another, which he calls the Creativity Machine. He applied to register a copyright in an image the Creativity Machine generated entirely on its own, listed the machine as the author, and claimed ownership for himself as a work-made-for-hire claimant. The Copyright Office said no. The courts said no. The Supreme Court said nothing, which in this context means the same thing.
The legal basis is a straightforward reading of the Copyright Act: it protects "original works of authorship," and authorship requires a human author. Works without human creative contribution cannot be registered, regardless of how sophisticated the system that produced them.
That much is settled. What is not settled — and what courts and the Copyright Office are still actively working through — is everything else.
The Question Nobody Is Really Asking, and the Questions Everybody Actually Has
The questions that arrive at my desk are not about whether someone can list a machine as an author on a copyright registration application. Nobody is asking that. The questions sound like this:
I wrote software and used an AI coding assistant to generate significant portions of the actual code. Do I own that?
I gave an AI platform a detailed creative brief — character descriptions, plot structure, tone, thematic elements — and it produced a draft manuscript. I revised it substantially over several months. Is that protectable?
My company generates product descriptions, ad copy, and marketing content using AI tools. We write the prompts and sometimes edit the output. Who owns that content?
I used an AI image generator with very specific prompts — I described the composition, the color palette, the mood, the style reference, the lighting. Is the resulting image mine to use commercially and to protect?
These are not the same question as Thaler. They are harder questions, and the honest answer to each of them is that it depends — on the nature of the human contribution, the extent of the creative decisions the human actually made, how the work was developed, and how well any of that can be documented.
What Copyright Law Has Always Cared About
Copyright has always required originality, and originality has always required some modicum of human creative expression. The Supreme Court established in Feist Publications v. Rural Telephone Service (1991) that mere labor, what the Court called "sweat of the brow," is not enough. There has to be at least a minimal spark of creativity; in Feist itself, that meant creative selection and arrangement. That principle maps onto the AI question in ways that are both intuitive and legally meaningful.
The Copyright Office's current framework for AI-assisted works applies a spectrum analysis. Pure AI output with no human creative input is not protectable, full stop. A work that a human author created using AI as a tool, the way a photographer uses a camera, a designer uses an AI-assisted feature in Photoshop or Lightroom, or a novelist uses a word processor with an AI editing pass, may be protectable, depending on the creative choices the human actually made and expressed in the final work. Everything in between requires a case-by-case analysis of the human contribution.
The factors that matter include how specific and creative the prompt was, how many iterations the human worked through before arriving at the final result, how substantially the human edited or transformed the AI's output, and whether the final work reflects creative decisions that the human made rather than choices the AI made on its own. "I typed a vague instruction and accepted the first output" is a very different situation from "I spent three months developing a detailed creative framework, reviewed dozens of generated drafts, made substantial changes to each, and assembled the final work through my own editorial judgment."
This is not a bright line. It is a factual inquiry, and it applies across every category of AI-assisted work.
Code
Software developers and companies building software products face a version of this question constantly, often without realizing it. AI coding assistants have become standard tools in many development workflows. A developer might write the architecture, the specifications, and the function signatures while an AI assistant fills in the actual implementation. Who owns the resulting code?
For now, what is clear is that code a human wrote is protectable. What is genuinely uncertain is how courts and the Copyright Office will treat code where the human made the design decisions — the architecture, the logic flow, the requirements — but an AI tool wrote the lines that execute them. The Copyright Office has indicated it will register AI-assisted works where humans made protectable creative choices in the final result. It has also been examining applications that describe heavy AI involvement more carefully than it used to.
For companies building software products, this matters practically. Unprotected code is code a competitor can copy without legal consequence. And the value of software development investment is difficult to defend if the underlying work lacks copyright protection. Getting ahead of this question — understanding your workflow, documenting your process, and registering works where appropriate — is worth doing now.
Creative Content
The same framework applies to novels, scripts, marketing content, game narratives, song lyrics, and visual art. The relevant question is always the same: what did the human actually contribute, and is that contribution the kind of creative expression copyright was designed to protect?
An author who uses an AI tool to generate a first draft and then rewrites it substantially — making character decisions, restructuring plot, refining language, developing themes — is in a materially different position than someone who generates output, lightly edits a few sentences, and calls it finished work. Both may have a copyright claim; the strength and scope of that claim will differ considerably.
This is not just an academic concern. Publishers, streaming platforms, and content licensees are starting to ask about AI involvement in works they acquire. Investors in creative businesses are beginning to treat AI-generated content as a potential liability if copyright protection is uncertain. And creators who sell or license their work need to understand what they actually own before they sign agreements representing that they do.
The Pattern the Law Is Establishing
The direction of the law is becoming clear, even if the line is not always easy to draw in a specific case. Courts and the Copyright Office are not moving toward protecting AI output. They are moving toward requiring documentation and evidence of genuine human creative contribution as AI-assisted work becomes more common and the claims become harder to evaluate.
The Copyright Office issued guidance in 2023 establishing that it would examine AI-assisted applications on a case-by-case basis, focusing on what the human author contributed. Since then, it has continued refining its approach, and the practical effect is that applicants who want to register AI-assisted works need to be able to describe and demonstrate their creative process. The less the human contribution, the harder that becomes.
Thaler did not create this framework. The decision confirmed it. And the Supreme Court's refusal to hear the case suggests that this framework — human authorship required, AI output alone not sufficient — is not going to be dislodged anytime soon.
What This Means for Your Business
If your company uses AI to generate content, code, marketing materials, or other works that you intend to own and protect, the time to think carefully about your workflow and your documentation practices is now, before a dispute arises.
Some questions worth asking: Do your contracts with clients, partners, or investors represent that you own your content? If so, do you? Have you documented your creative process in a way that would support a copyright registration application or withstand a challenge? Do you have internal policies governing how employees use AI tools in their work, and do those policies address ownership implications?
The law is still developing at the margins. But "the law is still developing" is not a workable answer when a competitor copies your software, a licensee decides your content is not protectable, or an investor asks about your IP portfolio. The foundational principle, that human creativity is required and that AI output alone is not enough, appears settled. Building your workflows and your legal protections around that principle is the practical task in front of most businesses right now.
Jonathan Phillips counsels businesses and creators on copyright registration, intellectual property strategy, and the evolving legal questions around AI-generated and AI-assisted work. If you have questions about how these developments affect your content, your code, or your business, reach out to him.