
The buried lede here is that Anthropic will need to attempt to explain to a judge that it is impossible to de-train 7M books from their models.


I'm hoping they fail, to incentivize using legal, open, and/or licensed data. Then they might have to attempt to train a Claude-class model on legal data. Then I'll have a great, legal model to use. :)


How come? They just need to delete the model and train a new one without those books.


Or they could be forced to settle on a price for access to the books.



