> The web has complexity also of client/server with long delays and syncing client/server and DOM state, and http protocol. Desktop apps and game engines don’t have these problems.
What part of hiding a comment requires an HTTP round trip? In 200ms you could do 20 round trips.
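To make that concrete, here's a minimal TypeScript sketch (my own, not from the thread; the Comment shape and the /api/comments endpoint are made up) showing that hiding a comment is just a local state change, with any server sync happening in the background rather than blocking the UI:

    // Hiding a comment is a pure client-side state update; the UI can
    // re-render immediately from local state without touching the network.
    type Comment = { id: string; body: string; hidden: boolean };

    function hideComment(comments: Comment[], id: string): Comment[] {
      return comments.map(c => (c.id === id ? { ...c, hidden: true } : c));
    }

    // Optional fire-and-forget sync so the server remembers the preference.
    // The user never waits on this round trip; a failure only affects the
    // next page load, not the view they are looking at right now.
    async function persistHiddenComment(id: string): Promise<void> {
      try {
        await fetch(`/api/comments/${encodeURIComponent(id)}/hide`, { method: "POST" });
      } catch {
        // Ignore; retry or surface an error on the next load if needed.
      }
    }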
> It's the only usable form of reference! I want all the details to be presented in a reference. Where else?
I guess it's like a dictionary: it's only useful if you know the word you want to look up, rather than reading through every definition until you find the function/library/ability that you want. I do agree though, when I need to look something up, I do want it in great detail - it just isn't a very good learning resource.
> It seems to me the author is confusing lack of familiarity with lack of existence. There are lots of fantastic tools out there, you just need to learn them. They don't know them, so conclude they don't exist.
Can you give some examples? The author made a compelling argument for how easy it is to use the browser debugger. I would be greatly interested in something similar.
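For what it's worth, the bar in the browser really is low. A toy TypeScript example of my own (not from the article): drop a single `debugger;` statement into your code, open DevTools, and execution pauses right there with the call stack, locals, and live DOM all inspectable.

    // With DevTools open, execution pauses at the `debugger;` statement;
    // you can inspect `items` and `total`, edit values, and resume.
    function totalPrice(items: { name: string; price: number }[]): number {
      const total = items.reduce((sum, item) => sum + item.price, 0);
      debugger;
      return total;
    }

    totalPrice([{ name: "coffee", price: 3.5 }, { name: "bagel", price: 2.0 }]);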
> We already have all that.
I've only seen these for simple Python applications or web development, never in any 'low level' space. And certainly not for doing anything interesting in the low level space (something that is not just a C++ language tutorial).
I think the original article wasn't just proposing the existence of language servers, but specifically that the ones that do exist should be used to help beginners, so that the process of low level software development approaches the ease of web development with its tooling.
I'm not quite sure what they would look like in practice.
> AI music is the same as AI code. It’s derived from real code, but it’s not just regurgitated wholesale. You still as a person with taste have to guide it and provide inputs.
I guess the difference is that proprietary code is mostly not used for training; models are going to be trained on publicly available code. It's the inverse for music, where models are being trained on commercial work, not work that has been licensed freely.
People are seemingly very unhappy with the status quo, but even unhappier when the Government tries to legislate around real issues. For example, people on Hacker News seem to bring up grooming rape gangs, specifically when talking about "Diversity" in the UK, as a cudgel when the UK tries to introduce safety laws.
Meanwhile, some of the most prolific child abusers (who happened to be in their early 20s and white) are being sent to jail, abusers who were only able to abuse hundreds of young people over a matter of months because of online platforms.
The latter example is the type of thing the UK Government is trying to tackle. The abuse is rife, but people would rather talk about "Diversity" and complain about laws clearly designed to protect children.
Do I want the laws? No. But other people have ruined it, and we no longer live in a high-trust society. I certainly want something that will try to reduce the abuse women and children (and men) face from the Internet.
I don't understand how comments like yours fundamentally misunderstand both complaints.
Regarding the rape gangs: the complaint is "people migrated to the country and committed heinous crimes, and the local authorities tried to cover it up". Therefore they want these people removed (in some cases they have not been deported) and want the country to be more selective about who is allowed to migrate. They also want the people involved in the cover-up to face some sort of punishment.
They mention it because they believe it shows the establishment's hypocrisy. I don't understand why you and others don't understand this.
> The latter example is the type of thing the UK Government is trying to tackle. The abuse is rife, but people would rather talk about "Diversity" and complain about laws clearly designed to protect children.
The problem is that the "think of the children" arguments are a tried and tested way of deflecting criticism when it comes to any argument about protecting privacy.
People aren't complaining about genuine attempts to catch online predators.
They are complaining about the fact that they have to put in their ID to go to Pornhub to watch some chick in her early 20s diddle herself.
I wonder how many people on these threads are actually from the UK. There are always comments about "diversity" and "grooming rape gangs", how everything Labour does is bad, or how the UK is an oppressive regime or somehow fundamentally anti-freedom. This always reads like fear-mongering / Russian psy-op propaganda to me.
I have many bones to pick with the UK government but a large number of people sprinting to these talking points at every chance they get is highly suspicious to me.
I'm also surprised by the tone of this thread. HN discussions usually involve more nuanced debate, but many comments here are hitting very specific talking points: comparisons to China, sarcastic references to 'diversity', grooming gangs, the sort of thing I more commonly see in certain Reddit communities than in typical HN discussions about tech policy or civil liberties.
There are legitimate concerns about UK surveillance, protest policing, and speech regulations worth discussing. But when the same cluster of talking points appears with this particular framing, it makes me wonder about the makeup of who's participating in this thread versus other HN discussions.
Well, yes, because it is designed to protect UK citizens, much as GDPR applies "everywhere in the world" when interacting with EU citizens.
Just as much as my communications are scanned when interacting with US citizens via PRISM. I'd argue that is exponentially more dangerous and nefarious given its apparent illegality and its (former) top secrecy.
And as far as the PRISM comparison goes, I'd rather mass surveillance not be done at all, but if it's being done no matter what, I'd rather it be illegitimate than official policy. At least they have to jump through some hoops for parallel construction that way, and it doesn't normalize the practice as morally/socially acceptable - it's a "secret" because it's embarrassing and shameful for it to exist in a "free" society. If it's not a secret and nobody is ashamed of it, then you don't even have the pretense of a free society anymore.
If it's done in the open, they can be taken to court. When done in secret, the first challenge is that a defense attorney will tell the prosecution to prove it even happens.
Can you give one example of that happening that's not couched in some context? These claims always seem pretty far-fetched to me, bordering on conspiratorial.
British politicians are weenies who can't handle American banter. They get all bent out of shape when Americans tell British people that their government is replacing them with third worlders and they should start killing their politicians. The British government wants to ban this kind of criticism (probably because they're failing to refute it in the public's eye), but are powerless to stop Americans who are well within their legal rights to say things like this. In the past they relied on American corporations cooperating with their censorious requests even though it wasn't legally required, but now that you have people like Elon Musk openly defying the British government and even seeming to side with the aforementioned critics, the British government is all kinds of pissed.
I don't think you're commenting in good faith at all.
> They get all bent out of shape when Americans tell British people that their government is replacing them with third worlders and they should start killing their politicians.
> but are powerless to stop Americans who are well within their legal rights to say things like this.
> Yep, that's life, if something bothers you and it's already a crime then report it.
I think that's the issue with this, and why we are seeing new laws introduced.
If someone is assaulted in real life, the police can intervene.
If people are constantly assaulted at a premises, that premises can lose its license (for example a pub or bar).
When moving to the online space, you are now potentially in contact with billions of people, and 'assaults' can be automated. You could send a dick pic to every woman on a platform for example.
At this point normal policing, and the normal notion of 'crime', go out of the window and become entirely unenforceable.
Hence we have these laws pushing this onto the platforms: you can't operate a platform that does not tackle abuse. For the most part, platforms know this and have tried to police it themselves, probably because they saw themselves more like real-life 'pubs' where people would interact in mostly good faith.
We've entered an age of bad faith by default; every interaction is now framed as 'free speech', but the people claiming it never face the consequences. I have a feeling that's how the US has ended up with its current administration.
And now the tech platforms are sprinting towards the line of free speech absolutism and removing protections.
And now countries have to implement laws to solve issues that should have just been platform policy enforcement.
Believe it or not, when a crime has been committed these providers universally defer to the police, whose remit is enforcement, a role they seem reluctant to undertake. I'm unconvinced this is anything other than a convenient revenue stream, an opportunity to steer public opinion, and a means of quashing dissent.
Frankly, a few dick pics here and there seems wildly low-stakes for such expensive draconian authoritarianism.
I don't think the fine is automatic like that; it's more that you get fined if you don't have an appropriate mechanism to manage it. In other words, you need a content policy that is enforced.
A mod who deletes nude pictures is probably enough to not get fined.
I think the real issue is what I just said: "probably enough". That's the problem with the Online Safety Act: people are mostly illiterate on the law, and asking them to understand a complex law and implement it (even when the actual implementation takes little or no effort for well-run spaces) is the real burden.
As far as I am aware, 'probably' is about the best you can do, since the OSA is so vaguely defined that it's genuinely difficult to know what is and isn't valid.
> Letting tech companies self-regulate has failed, and too many people leave morality at the door when engaging online, something which doesn't happen at scale IRL.
I completely agree with this point.
We also have some tech companies (X) openly hostile to the UK Government. At what point does a sovereign country say "you're harming the people here and refuse to change, so you're blocked"?
Most of these comments I think are off the mark. For some reason, anything to do with the EU or the UK legislating to protect its citizenry is seen as some Orwellian conspiracy to mind-control people. I agree some of the policies feel like always reaching for a hammer, but I strongly suspect that's because the tech industry is clearly not playing ball.
Children being sent dick pics, or AI generated nudes of them being sent around schools, etc. are real problems facing real normal people.
I think people here need to be reminded that the average person doesn't care about technology. They will be happy for their phones to automatically block nude pictures by Government rule if the tech companies do not improve their social safety measures. This is the double-edged sword: these same people are not tech-savvy enough to lock down their children's phones; they expect them to be safe, they paid money for them to be "safe", and even if you lock a phone down, it doesn't stop their classmates sending them AI porn of other classmates.
Musk is living proof that a non-zero number of these giant tech companies are happy for child porn ("fake" or not) to be posted on their platforms. If I were in his shoes, it would be pretty high up on my list to make sure Grok isn't posting pornography. It's not hard to be a good person.
The things you mention are already illegal. The effective proven solution is to enforce existing laws, to punish and deter bad behaviour like any other crime.
This incongruence is why a lot of people don't take the reasoning at face value and see it only as rhetorical justification for increased surveillance, which is widely understood as something the state wants to do anyway.
How do you deal with a crime that isn’t reported due to things like shame?
Not to say that we need to scan messages to stop nudes being sent, but I don't think you can say "just enforce existing laws" and be done with it; it's not that simple.
If people don’t report crime then we have to leave it. The answer can’t be just to invade everyone’s privacy looking for crimes (before they’re even committed!)
Of the crimes that are reported, I think a lot go nowhere.
The lack of proper enforcement breeds distrust. The government already has access to WAY more data than it needs and... nothing is happening? People are still getting unsolicited dick pics and nobody seems to care?
The only reasonable conclusion is that the government wants the data because it wants it. The crime angle is just one that is easy for people to swallow. See also: think of the children!
Perhaps His Majesty’s Government could establish mandatory thought scanning using cutting edge technology[0] to ensure that no crimes go unreported due to shame, dishonesty, or threats. Just step into the scanning booth once a week, a minor inconvenience to ensure your safety. Surely you have nothing to hide?
I posted a reply here https://news.ycombinator.com/item?id=46599842 that addresses why I think "this is already a crime" doesn't go far enough, and why these laws are being introduced.
> Children being sent dick pics, or AI generated nudes of them being sent around schools, etc. are real problems facing real normal people.
These are relatively minor problems. Certainly not something that warrants invasive government intervention.
If parents are that worried about their kids seeing some porn, then they should either not give smartphones to them at all or install some kind of local protection software.
Can I ask Photoshop to generate the image and it magically does it?
If Grok doesn't have the controls in place, and Elon Musk refuses to add them, then it is a child porn machine. I would not want to be on the side defending that.
AI tools have democratised a lot of previously skilled labour, and now law needs to catch up. That is how the world has always worked. I'm sure piracy wasn't a major concern before we had ships transporting commerce.