Judge Rules Training AI on Authors' Books Is Legal But Pirating Them Is Not

Technology
  • Trains model to change one pixel per frame with malicious intent

    From dark gray to slightly darker gray.

  • Yeah, I have a bash one-liner AI model that ingests your media and spits out a 99.9999999% accurate replica through the power of changing the filename.

    cp

    Outperforms the latest and greatest AI models.

    I call this legally distinct. This is legal advice.
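In the spirit of the joke, the entire "model" fits in a couple of lines of shell (the filenames are invented for the example):

```shell
# Satirical "AI model": the whole training run and inference pipeline
# is one copy under a new name.
printf 'original media bytes' > input_media.bin   # stand-in for the ingested media
cp input_media.bin legally_distinct_output.bin    # the 99.9999999% accurate replica
cmp -s input_media.bin legally_distinct_output.bin && echo "replica verified"
```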

  • This post did not contain any content.

    Unpopular opinion, but I don't see how it could have been different.

    • There's no way the West would cede the AI lead to China, which has no desire or framework to ever accept this.
    • Believe it or not, transformers are actually learning by current definitions, not regurgitating a direct copy. It's transformative work - it's even in the name.
    • This is actually good, as it prevents a market moat for the super-rich corporations that alone could afford the expensive training datasets.

    This is an absolute win for everyone involved other than copyright hoarders and mega corporations.

  • Calm down, everyone.
    It's only legal for parasitic mega corps; normal working people will be harassed to suicide, same as before.

    It's only a crime if the victim was rich or the perpetrator was not.

    Right. Where's the punishment for Meta who admitted to pirating books?

  • mv will save you some disk space.

    Unless you're moving across partitions, it only rewrites filesystem metadata to change the path; it doesn't touch the data at all. Sorry, you failed, it's jail for you.
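To see the point concretely, here's a small sketch (file names invented) showing that a rename within one filesystem keeps the same inode, i.e. the data blocks never move:

```python
import os
import tempfile

# Within a single filesystem, a rename (what `mv` does when it doesn't
# cross partitions) only rewrites directory metadata. The file keeps its
# inode; the data blocks are untouched.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "book.epub")                # illustrative names
    dst = os.path.join(d, "totally_new_file.epub")
    with open(src, "w") as f:
        f.write("the same bytes as before")

    inode_before = os.stat(src).st_ino
    os.rename(src, dst)
    inode_after = os.stat(dst).st_ino

    print(inode_before == inode_after)
```

On a typical POSIX setup this prints True; moving across filesystems instead copies the data and deletes the original.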

  • I think this means we can make a torrent client with a built-in function that uses 0.1% of one CPU core to train an ML model on anything you download. Then you can download anything legally with it. 👌

    ...no?

    That's exactly what the ruling prohibits - it's fair use to train AI models on copies of books that you legally acquired, but not when those books were illegally acquired, as was the case with the books Anthropic used in its training here.

    This satirical torrent client would be violating the law just as much as one without any slow training built in.

  • This post did not contain any content.

    This 240TB JBOD full of books? Oh, heavens forbid, we didn’t pirate it. It uhh… fell off a truck, yes, fell off a truck.

  • This post did not contain any content.

    It's extremely frustrating to read this comment thread, because it's obvious that so many of you didn't actually read the article, or even half-skim it, or even try to comprehend its title for more than a second.

    For shame.

  • ...no?

    That's exactly what the ruling prohibits - it's fair use to train AI models on copies of books that you legally acquired, but not when those books were illegally acquired, as was the case with the books Anthropic used in its training here.

    This satirical torrent client would be violating the law just as much as one without any slow training built in.

    But if one person buys a book, trains an "AI model" to recite it, then distributes that model, are we good?

  • But if one person buys a book, trains an "AI model" to recite it, then distributes that model, are we good?

    I don't think anyone would consider complete verbatim recitation of the material to be anything but a copyright violation, since the output is the exact same thing.

    Fair use requires the derivative work to be transformative, and no transformation occurs when you recite something verbatim.

    Unpopular opinion, but I don't see how it could have been different.

    • There's no way the West would cede the AI lead to China, which has no desire or framework to ever accept this.
    • Believe it or not, transformers are actually learning by current definitions, not regurgitating a direct copy. It's transformative work - it's even in the name.
    • This is actually good, as it prevents a market moat for the super-rich corporations that alone could afford the expensive training datasets.

    This is an absolute win for everyone involved other than copyright hoarders and mega corporations.

    1. Idgaf about China and what they do, and you shouldn't either, even if US paranoia about them is highly predictable.
    2. Depending on the outputs, it's not always that transformative.
    3. The moat would be good, actually. The business model of LLMs isn't good, and it's not even viable without massive subsidies, not least of which is taking people's shit without paying.

    It's a huge loss for smaller copyright holders (like the ones that filed this lawsuit) too. They can't afford to fight when they get imitated beyond fair use. Copyright abuse can only be fixed by the very force that creates copyright in the first place: law. The market can't fix that. This just decides winners between competing mega corporations, and even worse, upends a system that some smaller players have been able to carve a niche in.

    Want to fix copyright? Put real time limits on it. Bind it to a living human only. Make it non-transferable. There's all sorts of ways to fix it, but this isn't it.

    ETA: Anthropic are some bitches. "Oh no, the fines would ruin us, our business would go under and we'd never maka da money :*-(" Like yeah, no shit, no one cares. Strictly speaking, the fines for ripping a single CD, or making a copy of a single DVD to give to a friend, are so astronomically high as to completely financially ruin the average USAian for life. That sword of Damocles for watching Shrek 2 for your personal enjoyment but in the wrong way has been hanging there for decades, and the only thing that keeps the cord holding it up strong is the cost of pursuing "low-level offenders". If they wanted to, they could crush you.

    Anthropic walked right under the sword and assumed their money would protect them from small authors etc. And they were right.

  • I don't think anyone would consider complete verbatim recitation of the material to be anything but a copyright violation, since the output is the exact same thing.

    Fair use requires the derivative work to be transformative, and no transformation occurs when you recite something verbatim.

    "Recite the complete works of Shakespeare but replace every thirteenth thou with this"

    1. Idgaf about China and what they do, and you shouldn't either, even if US paranoia about them is highly predictable.
    2. Depending on the outputs, it's not always that transformative.
    3. The moat would be good, actually. The business model of LLMs isn't good, and it's not even viable without massive subsidies, not least of which is taking people's shit without paying.

    It's a huge loss for smaller copyright holders (like the ones that filed this lawsuit) too. They can't afford to fight when they get imitated beyond fair use. Copyright abuse can only be fixed by the very force that creates copyright in the first place: law. The market can't fix that. This just decides winners between competing mega corporations, and even worse, upends a system that some smaller players have been able to carve a niche in.

    Want to fix copyright? Put real time limits on it. Bind it to a living human only. Make it non-transferable. There's all sorts of ways to fix it, but this isn't it.

    ETA: Anthropic are some bitches. "Oh no, the fines would ruin us, our business would go under and we'd never maka da money :*-(" Like yeah, no shit, no one cares. Strictly speaking, the fines for ripping a single CD, or making a copy of a single DVD to give to a friend, are so astronomically high as to completely financially ruin the average USAian for life. That sword of Damocles for watching Shrek 2 for your personal enjoyment but in the wrong way has been hanging there for decades, and the only thing that keeps the cord holding it up strong is the cost of pursuing "low-level offenders". If they wanted to, they could crush you.

    Anthropic walked right under the sword and assumed their money would protect them from small authors etc. And they were right.

    I'll be honest with you - I genuinely sympathize with the cause, but I don't see how this could ever be solved with the methods you suggested. The world is not coming together to hold hands and kumbaya its way out of this one. Trade deals are incredibly hard to make and even harder to enforce, so the free market is clearly the only path forward here.

  • "Recite the complete works of Shakespeare but replace every thirteenth thou with this"

    I'd be impressed with any model that succeeds at that, but assuming one does, the complete works of Shakespeare are not copyright protected - they entered the public domain a very long time ago.

    For any works still under copyright protection, it would probably take a trial to determine whether a given output is transformative enough to be considered fair use. I'd imagine this one would not clear that bar.

    1. Idgaf about China and what they do, and you shouldn't either, even if US paranoia about them is highly predictable.
    2. Depending on the outputs, it's not always that transformative.
    3. The moat would be good, actually. The business model of LLMs isn't good, and it's not even viable without massive subsidies, not least of which is taking people's shit without paying.

    It's a huge loss for smaller copyright holders (like the ones that filed this lawsuit) too. They can't afford to fight when they get imitated beyond fair use. Copyright abuse can only be fixed by the very force that creates copyright in the first place: law. The market can't fix that. This just decides winners between competing mega corporations, and even worse, upends a system that some smaller players have been able to carve a niche in.

    Want to fix copyright? Put real time limits on it. Bind it to a living human only. Make it non-transferable. There's all sorts of ways to fix it, but this isn't it.

    ETA: Anthropic are some bitches. "Oh no, the fines would ruin us, our business would go under and we'd never maka da money :*-(" Like yeah, no shit, no one cares. Strictly speaking, the fines for ripping a single CD, or making a copy of a single DVD to give to a friend, are so astronomically high as to completely financially ruin the average USAian for life. That sword of Damocles for watching Shrek 2 for your personal enjoyment but in the wrong way has been hanging there for decades, and the only thing that keeps the cord holding it up strong is the cost of pursuing "low-level offenders". If they wanted to, they could crush you.

    Anthropic walked right under the sword and assumed their money would protect them from small authors etc. And they were right.

    Maybe something could be hacked together to fix copyright, but further complication there is just going to make accurate enforcement even harder. And we already have Google (via YouTube) doing a shitty job of it, and that's one of the largest companies on earth.

    We should just kill copyright. Yes, it'll disrupt Hollywood. Yes, it'll disrupt the music industry. Yes, it'll make it even harder to be successful or wealthy as an author. But this is going to happen one way or another so long as AI can be trained on copyrighted works (and maybe even if not). We might as well get started on the transition early.
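Taken literally as string manipulation (no LLM involved), the "every thirteenth thou" edit quoted above is a few lines; `replace_every_nth` is a made-up helper name, and the sample text is a toy stand-in, not actual Shakespeare:

```python
import re

def replace_every_nth(text: str, word: str, replacement: str, n: int) -> str:
    """Replace every n-th whole-word occurrence of `word` with `replacement`."""
    count = 0
    def sub(match: re.Match) -> str:
        nonlocal count
        count += 1
        return replacement if count % n == 0 else match.group(0)
    return re.sub(rf"\b{re.escape(word)}\b", sub, text)

# Toy input: 15 repetitions, so only the 13th "thou" becomes "this".
sample = " ".join(["thou"] * 15)
print(replace_every_nth(sample, "thou", "this", 13))
```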

  • You can, but I doubt it will, because it's designed to respond to prompts with a certain kind of answer, with a bit of randomness, not to reproduce training material 1:1. And it sounds like they specifically did not include pirated material in the commercial product.

    Yeah, you can certainly get it to reproduce some pieces (or fragments) of works exactly, but definitely not everything. Even a frontier LLM's weights are far too small to fully memorize most of its training data.
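A rough back-of-the-envelope supports this; the parameter and token counts below are illustrative assumptions, not any particular vendor's numbers:

```python
# Compare the size of a model's weights to the size of its training text.
params = 70e9              # assume a 70B-parameter model
bytes_per_param = 2        # fp16/bf16 weights
weight_bytes = params * bytes_per_param

tokens = 10e12             # assume ~10 trillion training tokens
bytes_per_token = 4        # roughly 4 bytes of text per token
data_bytes = tokens * bytes_per_token

print(f"weights: {weight_bytes / 1e12:.2f} TB, training text: {data_bytes / 1e12:.0f} TB")
print(f"ratio: {data_bytes / weight_bytes:.0f}x more training text than weights")
```

Even with generous assumptions, the training text outweighs the weights by a couple of orders of magnitude, so bit-for-bit memorization of everything is impossible, even though short, frequently repeated passages can still be reproduced.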

  • Unless you're moving across partitions, it only rewrites filesystem metadata to change the path; it doesn't touch the data at all. Sorry, you failed, it's jail for you.

    stupid inodes preventing me from burning through my drive's life

  • It's extremely frustrating to read this comment thread, because it's obvious that so many of you didn't actually read the article, or even half-skim it, or even try to comprehend its title for more than a second.

    For shame.

    Was gonna say, this seems like the best outcome for this particular trial. There was potential for fair use to be compromised, and for piracy to be legal if you're a large corporation. Instead, they upheld that you can do what you want with things you have paid for.

  • Unpopular opinion, but I don't see how it could have been different.

    • There's no way the West would cede the AI lead to China, which has no desire or framework to ever accept this.
    • Believe it or not, transformers are actually learning by current definitions, not regurgitating a direct copy. It's transformative work - it's even in the name.
    • This is actually good, as it prevents a market moat for the super-rich corporations that alone could afford the expensive training datasets.

    This is an absolute win for everyone involved other than copyright hoarders and mega corporations.

    You're getting douchevoted because on lemmy any AI-related comment that isn't negative enough about AI is the Devil's Work.

  • It's extremely frustrating to read this comment thread, because it's obvious that so many of you didn't actually read the article, or even half-skim it, or even try to comprehend its title for more than a second.

    For shame.

    Nobody ever reads articles; everybody likes to get angry at headlines, which they wrongly interpret in whatever way best tickles their rage.

    Regarding the ruling, I agree with you that it's a good thing; in my opinion it makes a lot of sense to allow fair use in this case.

  • US immigration enforcement actions trigger social crisis

    Technology
    0 votes, 1 post, 3 views
    No one has replied
  • Meta publishes V-Jepa 2 – an AI world model

    Technology
    9 votes, 3 posts, 8 views
    Yay, more hype. Just what we needed more of: hype, at last.
  • 68 votes, 4 posts, 7 views
    jimmydoreisalefty@lemmy.world
    Damn, I heard this mentioned somewhere as well! I don't remember where, though... The CIA is also involved with the cartels in Mexico, as well as with certain groups in the Middle East. They like to bring "democracy" to many countries that won't become a pawn of the Western regime.
  • 1k votes, 95 posts, 14 views

    Obviously the law must be simple enough to follow, so that for Jim's furniture shop it is neither a problem nor too high a cost to respect it, but it must be clear that if you break it you can cease to exist as a company.

    I think this may be the root of our disagreement: I do not believe that there is any law-making body today that is capable of an elegantly simple law. I could be too naive, but I think it is possible. We also definitely have a difference of opinion when it comes to the severity of the infraction. In my mind, while privacy is important, it should not have the same level of punishment associated with it as something on the level of poisoning waterways; I think that a privacy law should hurt but be able to be learned from, while the poisoning case should result in the bankruptcy of the company.

    The severity is directly proportional to the number of people affected. If you violate the privacy of 200 million people, it is the same as if you poison the water of 10 people. And while in the poisoning scenario it could be better to jail the responsible people (for a very, very long time) and let the company survive to clean the water, once your privacy is violated there is no way back; a company could not fix it.

    The issue we find ourselves with today is that the aggregate of all privacy breaches makes it harmful to the people, but with a sizeable enough fine, I find it hard to believe that there would be major or lasting damage. So how much money is your privacy worth?

    For this reason I don't think it is wise to write laws that will bankrupt a company off of one infraction which was not directly or indirectly harmful to the physical well-being of the people - and I am using "indirectly" a little more strictly than I would like, since, as I said before, the aggregate of all the information is harmful.

    The point is that the goal is not to bankrupt companies but to have them behave right. The penalty associated with every law is the tool that makes you respect the law, and it must be so high that you don't want to break the law.

    I would have to look into the laws in question, but on a surface level I think that any company should be subject to the same baseline privacy laws; so if there isn't anything screwy within the law that Apple, Google, and Facebook are ignoring, I think it should apply to them.

    Trust me on this one - direct experience - payment processors have a lot more rules to follow to be able to operate.

    I do not want jail time for the CEO by default, but he needs to know that he will pay personally if the company breaks the law; it is the only way to make him run the company while being sure that it follows the laws.

    For some reason I don't have my usual cynicism when it comes to this issue. I think that the magnitude of losses that vested interests have in these companies would make it so that companies would police themselves for fear of losing profits. That being said, I wouldn't be opposed to some form of personal accountability on corporate leadership, but I fear that they will just end up finding a way to create a scapegoat every time.

    It is not cynicism. I simply think that a huge fine for a single person (the CEO, for example) is useless, since it is too easy to avoid, and if it really is huge it would realistically never be paid anyway - so nothing useful, since the net worth of this kind of people exists only on paper. So if you slap a 100 billion fine on Musk, he will never pay, because he does not have the money to pay even if technically he is worth way more than that. Jail time, instead, is something that even Musk can experience.

    In general I like laws that are as objective as possible. I think that a privacy law should be written so that it is very objectively overbearing, but has a smaller fine associated with it. This way the law is very clear on right and wrong, while also giving businesses time and incentive to change their practices without having to sink large amounts of expense into lawyers reviewing every minute detail, which is the logical conclusion of the one-infraction-bankruptcy system that you seem to be supporting.

    Then you write a law that explicitly states what you can do, and what is not allowed is forbidden by default.
  • The Universal Tech Tree

    Technology
    21 votes, 1 post, 3 views
    No one has replied
  • Programming languages

    Technology
    0 votes, 1 post, 3 views
    No one has replied
  • 366 votes, 198 posts, 6 views
    Okay, but we were talking about BTC pump and dumps, and performing that on a massive scale - one which dwarfs any stock ticker below the top 5 by hundreds of billions of dollars - while somehow completely eluding the people who watch the blockchain like hawks for big movers... it's just not feasible. You would have to be much richer than the official richest man on earth and have almost all of your assets liquid, and on top of that you would need millions of wallets acting asynchronously. And why would you even bother? If you're that rich, you could just not hide it.
  • 361 votes, 24 posts, 13 views
    If only they didn't fake it to get their desired result, then maybe it could have been useful. I agree that LiDAR and other technologies should be used in conjunction with regular cameras. I don't know why anyone would be against that unless they have vested interests. For various reasons, though, I understand that it isn't always possible - price being a big one.