Hacker News

Am I alone in thinking this is easy?

The human making the decision is always liable.

What if the human couldn't reasonably know better? Doesn't matter. If they would have made the same decision without AI, or with old files, it is still on them.

What if there's no single human decision? Someone is in charge and is responsible. The "I was ordered to" isn't a defense.

Does liability without power make sense? People executing have the power to execute, so they bear liability. If they're executing without authority, that is a different kind of liability, but liability nonetheless.

It may let the powerful off the hook. That is already a theme, and AI doesn't change it; in fact, AI will just be used as another scapegoat.

God told me to do it - Watertight! Right?



Let's say I start an AI program and my initial prompt is "Copy these files to this other computer", and then 100 iterations down the agentic loop the AI decides to hack into Tesla's FSD and ship an update that kills 500 people.

Who is liable?


Obviously this is up to courts and juries to hammer out but...

- Your agentic loop hacked something? You're liable.

- FSD crashes? The guy in the driver's seat is liable. He/his insurance can sue Tesla to spread the liability...

Nowhere along the line will anyone go "Oh, the AI did it... whoops"


I don’t know.

Let’s say someone sells me a shovel and markets it as a shovel. Then the shovel explodes because it was actually a bomb.

Presumably the manufacturer is liable for passing off their bomb as a shovel.

This metaphor seems reasonably accurate for current LLMs.


> Someone is in charge and is responsible.

This doesn't seem to be a given.


There's always a CEO or a president. The buck always stops somewhere. Somebody is always making the big bucks because they are in charge.


For legally incorporated companies. What if it's just a crypto wallet hooked up to a network of prompts? Are you sure a human must have created it?


Somebody is collecting those funds. Someone wrote those prompts. There's always a human to blame somewhere.


I'm not aware of any counterexample, but I also don't know of any reason why this must always be true. Furthermore, I'd expect such human-free setups to become more likely over time.


It's possible, in theory, that an AI could establish a crypto wallet, but what would it do with it? AI doesn't have desires. It doesn't do what it isn't told to do (although those instructions can be broad and vague). Even if an AI did somehow do something bad without being told, that AI would still have been set up by a human, running on some human's hardware, and using a human's internet connection.

Maybe in the distant sci-fi future we'll have actual AI (not just glorified chatbots) that can decide for itself what it wants to do with its time, and we'll be allowing AI to sign leases on property and set up accounts with utility companies. If that day comes, we're going to have a lot of problems if we're not ready for it. But until then, it's AI on a human's hardware, at a human's property, running up a human's electric bill.


I'm sure you can pay for hosting somewhere without being a human.

It won't need to sign a lease to do any of this. It doesn't even require desires. I'm not sure why this seems so far-fetched to you.


I have a copy of the weights on my HD and, to my knowledge, it hasn't spontaneously gone out, acquired web hosting, and stood up a website.

That doesn't seem to be something any AI company says is currently possible.

If things change, and AI becomes able to act on its own initiative, then it will be easy to change this law.


I think it's just a gap in definitions. The labs say models don't act on their own initiative. What counts as initiative? I guess an API call in a for loop would count.
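To make the "API call in a for loop" point concrete, here is a minimal sketch of that pattern. `call_model` is a hypothetical stub standing in for a real LLM API call, so the example is self-contained; the point is that the "agent" is nothing but a loop a human wrote and started.

```python
def call_model(prompt: str) -> str:
    """Stub for a model API call; a real agent would call an LLM here."""
    return f"next step after: {prompt}"

def agentic_loop(initial_prompt: str, max_iterations: int = 3) -> list[str]:
    """Feed each response back in as the next prompt.

    No 'initiative' is involved anywhere: the human picks the initial
    prompt, the iteration count, and when to start the loop.
    """
    history = []
    prompt = initial_prompt
    for _ in range(max_iterations):
        response = call_model(prompt)
        history.append(response)
        prompt = response  # the model's output becomes the next input
    return history

steps = agentic_loop("Copy these files to this other computer")
print(len(steps))  # prints 3
```

Whether step 100 of such a loop counts as the model "acting on its own initiative" or just the human's original instruction playing out is exactly the definitional gap in question.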

Historically, it seems like a lot of laws haven't been easy to change, especially when they regulate zillion-dollar industries.



