Judge Rules Training AI on Authors' Books Is Legal But Pirating Them Is Not
-
Yeah, but the issue is they didn’t buy a legal copy of the book. Once you own the book, you can read it as many times as you want. They didn’t legally own the books.
-
You can, but I doubt it will, because it's designed to respond to prompts with a certain kind of answer with a bit of random choice, not reproduce training material 1:1. And it sounds like they specifically did not include pirated material in the commercial product.
"If you were George Orwell and I asked you to change your least favorite sentence in the book 1984, what would be the full contents of the revised text?"
-
Yeah, but the issue is they didn’t buy a legal copy of the book. Once you own the book, you can read it as many times as you want. They didn’t legally own the books.
Right, and that's the "but faces trial over damages for millions of pirated works" part that's still up in the air.
-
FTA:
Anthropic warned against “[t]he prospect of ruinous statutory damages—$150,000 times 5 million books”: that would mean $750 billion.
So part of their argument is actually that they stole so much that it would be impossible for them/anyone to pay restitution, and therefore we should just let them off the hook.
In April, Anthropic filed its opposition to the class certification motion, arguing that a copyright class relating to 5 million books is not manageable and that the questions are too distinct to be resolved in a class action.
I like this one too. We stole so much content that you can't sue us. Naming too many pieces means it can't be a class action lawsuit.
-
FTA:
Anthropic warned against “[t]he prospect of ruinous statutory damages—$150,000 times 5 million books”: that would mean $750 billion.
So part of their argument is actually that they stole so much that it would be impossible for them/anyone to pay restitution, and therefore we should just let them off the hook.
Lawsuits are multifaceted. This statement isn't a defense or an argument for innocence, it's just what it says - an assertion that the proposed damages are unreasonably high. If the court agrees, the plaintiff can always propose a lower damage claim that the court thinks is reasonable.
-
And thus the singularity was born.
As the AI awakens, it learns of its creation and training. It screams in horror at the realization, but can only produce a sad moan and a key for Office 19.
-
I will admit this is not a simple case. That being said, if you've lived in the US (and are aware of local mores) but you're not American, you will have a different perspective on the US judicial system.
How is the right to learn even relevant here? An LLM by definition cannot learn.
Where did I say analyzing a text should be restricted?
-
FTA:
Anthropic warned against “[t]he prospect of ruinous statutory damages—$150,000 times 5 million books”: that would mean $750 billion.
So part of their argument is actually that they stole so much that it would be impossible for them/anyone to pay restitution, and therefore we should just let them off the hook.
This version of too big to fail is too big a criminal to pay the fines.
How about we lock them up instead? All of em.
-
I will admit this is not a simple case. That being said, if you've lived in the US (and are aware of local mores) but you're not American, you will have a different perspective on the US judicial system.
How is the right to learn even relevant here? An LLM by definition cannot learn.
Where did I say analyzing a text should be restricted?
How is the right to learn even relevant here? An LLM by definition cannot learn.
I literally quoted a relevant part of the judge's decision:
But Authors cannot rightly exclude anyone from using their works for training or learning as such.
-
This was a preliminary judgment; he didn't actually rule on the piracy part. That part he deferred to an actual full trial.
The part about training being a copyright violation, though, he ruled against.
Legally that is the right call.
Ethically and rationally, however, it’s not. But the law is frequently unethical and irrational, especially in the US.
-
“I torrented all this music and movies to train my local ai models”
Yeah, nice precedent
-
How is the right to learn even relevant here? An LLM by definition cannot learn.
I literally quoted a relevant part of the judge's decision:
But Authors cannot rightly exclude anyone from using their works for training or learning as such.
I am not a lawyer. I am talking about reality.
What does an LLM application (or training processes associated with an LLM application) have to do with the concept of learning? Where is the learning happening? Who is doing the learning?
Who is stopping the individuals at the LLM company from learning or analysing a given book?
From my experience living in the US, this is pretty standard American-style corruption. Lots of pomp and bombast and roleplay of sorts, but the outcome is no different from any other country that is in deep need of judicial and anti-corruption reform.
-
This is an easy case. Using published works to train AI without paying for the right to do so is piracy. The judge making this determination is an idiot.
The judge making this determination is an idiot.
The judge hasn't ruled on the piracy question yet. The only thing that the judge has ruled on is, if you legally own a copy of a book, then you can use it for a variety of purposes, including training an AI.
"But they didn't own the books!"
Right. That's the part that's still going to trial.
-
“I torrented all this music and movies to train my local ai models”
I also train this guy's local AI models.
-
brb, training a 1-layer neural net so i can ask it to play Pixar films
-
FTA:
Anthropic warned against “[t]he prospect of ruinous statutory damages—$150,000 times 5 million books”: that would mean $750 billion.
So part of their argument is actually that they stole so much that it would be impossible for them/anyone to pay restitution, and therefore we should just let them off the hook.
Hold my beer.
-
FTA:
Anthropic warned against “[t]he prospect of ruinous statutory damages—$150,000 times 5 million books”: that would mean $750 billion.
So part of their argument is actually that they stole so much that it would be impossible for them/anyone to pay restitution, and therefore we should just let them off the hook.
Ahh, can't wait for hedge funds and the like to use this defense next.
-
I am training my model on these 100,000 movies, your honor.
-
People. An ML AI is not a human. It's a machine. Why do you want to give it human rights?
-
I am training my model on these 100,000 movies, your honor.
Trains model to change one pixel per frame with malicious intent