
When I write software, I know that users will do dumb things and it is my responsibility to write my code in such a way that they can cause no damage.

The legal system should have similar responsibilities. If citizens can report crimes, some citizens will incorrectly report non-crimes. If the legal system can't handle that, then that is entirely the fault of the legal system.
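The "users will do dumb things" principle above is ordinary defensive input handling. A minimal sketch (the function name and the allowed range are hypothetical, purely for illustration):

```python
def set_retry_count(raw: str) -> int:
    """Parse a user-supplied retry count, rejecting garbage at the
    boundary instead of letting it propagate and cause damage."""
    try:
        value = int(raw)
    except ValueError:
        raise ValueError(f"not a number: {raw!r}")
    if not 0 <= value <= 10:
        raise ValueError(f"retry count out of range: {value}")
    return value
```

The point is that bad input is the program's problem to contain, not the user's fault for supplying it.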



"Any proposal must be viewed as follows. Do not pay overly much attention to the benefits that might be delivered were the law in question to be properly enforced, rather one needs to consider the harm done by the improper enforcement of this particular piece of legislation, whatever it might be."

-Lyndon B. Johnson


I think a lot of people take this to mean "laws bad" when in reality I think LBJ was saying to be careful in how you craft your laws so you can prevent this kind of abuse.


If you adapted it for software, it's almost the same as saying "don't focus too much on the happy path, make sure you limit the blast radius for the failure cases", which just sounds like sensible advice.
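One common way to limit that blast radius is to isolate failures per unit of work, so one bad record doesn't take down the whole batch. A sketch, with hypothetical names:

```python
import logging

def process_batch(records, handle):
    """Process each record independently; a failure on one record
    is logged and collected rather than aborting the whole batch."""
    succeeded, failed = [], []
    for record in records:
        try:
            succeeded.append(handle(record))
        except Exception:
            logging.exception("record failed: %r", record)
            failed.append(record)
    return succeeded, failed
```

For example, `process_batch([1, 0, 2], lambda r: 10 // r)` processes the good records and reports the bad one instead of crashing on the division by zero.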

You really have to take the adversarial mindset when thinking about large scale systems like this, to make sense of how they can break.


Thinking about the failure modes is the default mindset of a traditional engineer. Unfortunately software engineers in general are entirely too focused on the happy path and nothing else.


Focusing on anything besides the happy path requires the approval and support to do so from other teams and their respective management and QA.

No, they'd rather not coordinate any of that and just play the blame game.

Many workplaces employing software engineers are actively hostile towards any kind of engineering. I have a very hard time believing anyone writing code is sloppy on purpose. Incentives are often very misaligned. These are pretty much always organizational problems.

Instead of immediately blaming software engineers, you should consider questioning whether it ever makes sense to have non-technical leadership in charge of them. Most places where "real" engineering is done don't have idiots at the wheel.


Not me.

When I write software, every line of code is accompanied by the thought "I'm a blackhat hacker with an execution thread. How do I leverage this line of code?"


Ironic, given the subject matter. It could do with a corollary: "when making a statement, consider the most simplistic and/or uncharitable interpretation possible".


Another way of putting this might be that it's reasonable to assume that most legislation/laws/procedures/policies are in fact created in good faith, and one can also assume that those advocating for them will do a good job covering the benefits. Where the work is needed, therefore, is in considering the unintended side-effects (of either enforcement or poor enforcement).


That's what I read from it, too.


That puts the onus on solving this problem entirely into the legislative branch.

The executive and judicial branches hold equal responsibility to this problem. Why not include them in the solution?


That's a great LBJ quote.


The legal system long ago worked around this - they allow virtually every person at every level to exercise "discretion." An officer can frequently choose how to handle an issue, a prosecutor can choose not to prosecute, etc.

Occasionally, we see well-meaning policies going astray: the "must arrest someone" policies for domestic calls in some jurisdictions are a contentious but instructive example. Sometimes calls get put in spuriously, and now someone must go to jail over a relative non-issue / non-altercation that a neighbor called in.


It is good that the agents have this wiggle room, because you cannot craft a law that could handle every case. This is why it is important to have competent people who represent the state and see that laws are followed. At the same time, people should be incentivised to constantly reflect on whether a law really serves justice. If not, the legislative branch needs to change it.

The problem here is that the success of policing in politics is measured by the number of closed cases, or by how many people got convicted after their arrest: by the crime rate and success in fighting it. This, of course, creates perverse incentives. Budget fights and politics leave justice far behind.

If there were no more crime in a country, that shouldn't mean police get pressured because they arrest fewer people. But exactly that would probably follow in a political discussion.


I’ve noticed this same similarity.

Writing code to affect millions of people’s data, or controlling millions of people’s actions they can perform with software, has a lot in common with passing legislation that will affect millions of people, in terms of the edge cases and unintended consequences that you must consider beforehand.

Not to mention the potential for malicious actors.


Considering that programming language compilers and interpreters are also written by human "software writers", with you the developer as the "user" doing dumb things, hopefully you are using a language that stops you from doing dumb things.

Dumb things like spamming others, issuing hundreds of unnecessary requests to external APIs or databases, trashing all the memory of a system, filling up the disk... A "general purpose" language that prevents all of that basically doesn't exist.
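No language will stop you from hammering an external API, but the guardrail is easy to build in application code. A minimal token-bucket rate limiter, as a sketch (class name and parameters are illustrative):

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: a guardrail against
    spamming an API that the language itself won't provide."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens refilled per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Consume one token if available; otherwise deny the request."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A caller would check `bucket.allow()` before each outbound request and back off (or queue) when it returns False.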

Anything even a bit non-trivial can be used to do dumb things. It is the responsibility of everyone in the chain not to do them, and if dumb things happen, everyone in the chain _is_ responsible.

We can debate how much blame each of them should take, but we should not ignore the blame an individual doing a dumb thing should take simply because there were people in "official" roles being even dumber.

I mean, this was already depicted in South Park, when all the parents get sent to jail by their kids.


I agree with your point in this context 100%, but I would like to push back on the vibe in *all* software contexts. Obviously in some contexts, sure; nobody wants another Therac-25. However, in general I believe modern software is overly concerned with preventing the user from doing dumb things, at the expense of extensibility and user control.

Metaphorically, it feels like we've adopted a model that has safety interlocks and authenticated access for every screwdriver and light switch in your house, some of them inaccessible to the end user without serious reverse engineering. To some extent I get why the absurd proliferation of IoT devices necessitates some of this, but I think there's a better way: a world where we have a decent lock on the front door, and you're the one who's responsible for not putting the drill through your eyeball, not the drill.


I would rephrase it as, prosecutorial immunity must die.


That’s part of the problem… not everything can be reduced to if-then-else; situational awareness, perspective, nuance and understanding are completely missing. In this case, how helpful is it to the child to remove the mother from the child, to teach her a lesson not to remove herself from the child?


My guess is that the mother was removed from the child not as a punishment to her, but in order to ensure the safety of the child. In other words, deterrence rather than retributive justice. FWIW, I do agree that this is a huge overreach on the part of the police and that "situational awareness, perspective, nuance and understanding are completely missing."


More importantly, is making an 8 year old kid walk a km by themselves in rural America a significant risk?

I feel like I would have done this 1000 times by the time I was 8 in rural Australia. Is the danger in the US that much larger?



