An article from the Legal Layer newsletter has ignited a major discussion on Hacker News: over 490 comments and 536 points, and the thread is still growing. The question seems simple: when Claude Code writes an entire module for you, who actually owns that code?
The answer turns out to be anything but simple.
Anthropic is perhaps the most generous on paper. It explicitly states that you own all output Claude produces, and it grants you commercial rights to that output. The problem? Anthropic itself admits this might be a paper tiger. US copyright law requires human authorship, and an argument can easily be made that AI-generated code without significant human input is simply not copyrightable. You can "own" something that sits in legal no man's land.
GitHub Copilot takes a similar approach: it doesn't claim ownership, and you bear the responsibility. But GitHub is currently facing an active class-action lawsuit alleging that Copilot reproduces open-source code without respecting license terms. Google's Gemini terms say much the same as the others: we claim nothing, you are responsible.
What truly fuels the HN debate is the practical consequence: thousands of developers and companies are currently rolling out production code written by AI assistants, without any clear understanding of the legal ground they stand on. If a competitor copies an AI-generated library you "own," what do you do then?
Another point bubbling up in the thread: the training data these models are built on is itself under legal pressure. Anthropic recently paid $1.5 billion in a settlement related to training data, and Copilot, as noted, is already in court. The very foundation of what these models "know" is legally disputed, and that dispute carries through to the output they generate.
This is still an early signal from community sources, not settled legal fact. But that's precisely why it's worth paying attention to. Policy circles haven't taken a stance yet, and mainstream tech media has barely touched the topic. When that changes, companies will likely start asking questions that force clearer terms of service, or new legislation.
If you're currently using Claude Code, Copilot, or Gemini in a commercial project, it might be time to send an email to your legal department.
