That's the inverse: mass surveillance is bad so it should be banned, vs. using AI to thwart proprietary lock-in is good and so shouldn't be banned.
But also, is the inverse even wrong? If some store has a local CCTV that keeps recordings for a month in case someone robs them, with no central feed or database and no way for anyone else to get the footage without a warrant, that's not really that objectionable. If Amazon pipes the feed from every Ring camera to the government, that's very different.
By "everywhere" I obviously don't mean "on your private property", I mean "everywhere" as in "on every street corner and so on".
If people are OK with their government putting CCTVs on every lamp post on the promise that they are "secure" and "not used to aggregate data and track people" and "only with warrant" then it's kind of "I told you so" when (not if) all of those things turn out to be false.
> using AI to thwart proprietary lock-in is good and so shouldn't be banned.
It's shortsighted because whoever runs the LLMs isn't doing it to help you thwart lock-in. It might help for now, but "for now" is all they care about: they steal content as fast as they can and lose billions yearly to make sure they become too big to fail. Once they are that big, they will tighten the screws, and they'll literally have the freedom to do whatever they want as long as it's legal.
And, surprise: helping people thwart lock-in is far less legal (in addition to far less profitable) than preventing people from thwarting it.
It's kind of bizarre to see people thinking these LLM operators will be somehow on the side of freedom and copyleft considering what they are doing.
> By "everywhere" I obviously don't mean "on your private property", I mean "everywhere" as in "on every street corner and so on".
If they're on each person's private property then they're on every street corner and so on. The distinction you're really after is between decentralized and centralized control/access, which is rather the point.
> It's kind of bizarre to see people thinking these LLM operators will be somehow on the side of freedom and copyleft considering what they are doing.
You're conflating the operators with the thing itself.
LLMs exist and nobody can un-exist them now because they're really just code and data. The only question is, are they a thing that does what you want because there are good published models that anybody can run on their own hardware, or are the only up-to-date ones corporate and censored and politically compromised by every clodpoll who can stir up a mob?
You're really trying hard to misunderstand this. A small shop has its own CCTV to catch intruders = one thing. A local company installing CCTV everywhere = a different thing. In practice they can both be supplied by one company, centralized, unified, and sold, and fighting ANY CCTV is ultimately the winning move.
> LLMs exist and nobody can un-exist them now because they're really just code and data
"Malware exists and nobody can unexist it now because it's just code and data"
> A small shop has its own CCTV to catch intruders = one thing. A local company installing CCTV everywhere = a different thing.
But that's the thing you were implying couldn't be distinguished. Every small shop having its own CCTV is different from one company having cameras everywhere, even if they both result in cameras all over the place.
> "Malware exists and nobody can unexist it now because it's just code and data"
Which is accurate. Even if you tried to ban malware, or LLMs, they would still be produced by China et al. And malware is by definition bad, so you're also omitting the thing that matters again, which is that we should not ban the LLMs that aren't bad.
You don't get to unilaterally make laws for the rest of us, which is what you are trying to do when you throw around terms like "stealing" in contexts where they have no legal meaning. Sorry.
If the incumbent copyright interests insist on picking an unnecessary fight with LLMs or AI in general, they will and must lose decisively. That applies to all of the incumbents, from FSF to Disney. Things are different now.
I see; the laws aren't in question or in flux, but it's the judges who are wrong. Enlightening.
I still don't understand how copyright maximalism has suddenly become so popular on a site called "Hacker News." But it's early here, and I'm sure I'm not done learning exciting new things today.
> like LLM or NFT or killer drones, malware isn't bad for somebody.
Malware isn't bad for Russian crime syndicates, but we're generally content to regard them as the adversary and not care about their satisfaction. That isn't the case for someone who wants to use an LLM to fix a bug in their printer. They're doing the good work and people trying to stop them are the adversary.
> which LLM is not made by stealing copyleft code?
Let's drive a stake through this one by going completely the other way. Suppose you train an LLM only on GPL code, and all the people distributing and using it are only distributing its output under the GPL. Regardless of whether that's required, it's allowed, right? How would you accuse any of those people of a GPL violation?
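To make the hypothetical concrete, here's a minimal sketch of what "train only on GPL code" could look like in practice: filter the corpus down to records carrying a GPL license tag before anything reaches the trainer. The file name, the "license"/"content" fields, and the dataset layout are all assumptions made up for illustration, not any real pipeline:

    # Hypothetical sketch: build a GPL-only training corpus.
    # Assumes each JSONL record carries an SPDX license identifier
    # in a "license" field; the layout is invented for illustration.
    import json

    GPL_IDS = {"GPL-2.0-only", "GPL-2.0-or-later",
               "GPL-3.0-only", "GPL-3.0-or-later"}

    def gpl_only(records):
        # Keep only records whose license tag is a GPL variant.
        for rec in records:
            if rec.get("license") in GPL_IDS:
                yield rec["content"]

    with open("corpus.jsonl") as f:
        records = (json.loads(line) for line in f)
        for text in gpl_only(records):
            pass  # hand off to the (hypothetical) training loop

If everyone downstream then distributes the model's output under the GPL, the question above stands: whose rights under the license have been infringed?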