The one that wasted me hundreds of hours was TypeScript on Node with ESM. The most common, boring stuff that everyone is supposed to be intimately familiar with, right?
Got shanghaied into TS-land right around Node 16, when Node and TypeScript imposed mutually incompatible handling of ESM modules (that file-extensions mess).
Not only did the type checker fail to understand the kind of JS I had been shipping (and testing, and documenting, and communicating) up to that point; both the immediate toolchain, and people's whole pattern language around the toolset, were entirely broken as soon as you were doing anything different from the kind of React.js development that later coalesced around Vercel.
Not only was I down to maybe 10% of what I could do previously, and that only conditional on jury-rigging the billion-dollar stack to work; I also had a little cadre of happy campers on my ass blatantly gaslighting me that it was all, in fact, working, and suggesting the most inane "solutions" once I'd bent over backwards to demonstrate that there was, indeed, a problem of absurd dimensions, straight outta nowhere.
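To make the extensions mess concrete, here's roughly what it boiled down to (a minimal sketch; the file names and the exact tsconfig values are illustrative, not lifted from my actual project):

    // tsconfig.json: "module": "node16", "moduleResolution": "node16"
    // package.json:  "type": "module"

    // util.ts
    export const greet = (name: string): string => `hello, ${name}`;

    // index.ts
    // Node's ESM loader resolves the *emitted* file, so the specifier has
    // to say ".js" even though the file sitting on disk is "util.ts".
    import { greet } from "./util.js"; // what Node 16 ESM + TS "node16" demand
    // import { greet } from "./util"; // what years of CommonJS-era resolution
    //                                 // taught everyone to write; tsc rejects it
    //                                 // and Node throws ERR_MODULE_NOT_FOUND
    console.log(greet("node16"));

So every relative import in an existing codebase suddenly needed a .js suffix pointing at a file that doesn't exist until after compilation, which is exactly the sort of thing the toolchain, and the pattern language around it, kept tripping over.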
Later I met more such people. Same people who would insist JS runtimes are not trivially interchangeable, having committed to not examining what they're doing beyond a meager level of abstraction.
I see it as a rather perverse form of "working to spec" (I have had to pick up surreal amounts of slack after such characters), but with incentives being what they are you get a cutthroat environment (such as the author of this blog post imagines themselves to be living in), and from a cutthroat environment you get the LLMs eating everyone's breakfasts -- because no matter how yucky a word "synergy" is, synergizin' is exactly what "fake open source" is designed to preclude, throughout the JS ecosystem.
"Fake open source" is how I call MIT/BSD licensed devtools and frameworks from hyperscalers that don't need to do an opencore rugpull because they're a piece of a long-term ecosystem strategy. They benefit from immense decade-long marketing and astroturfing efforts, lending them "default status" in the mindshare; and ultimately serve to carry the vendor's preferred version of reality into unrelated downstream projects. Which is why they often spectacularly fail to respond to the community's needs: they are built to preclude, past a certain point, the empowerment of implementors as a community.
Mastering some of that shit, now there was a sunk cost for me, but in modern JS land all these churning agglomerations play the role of "pay to play" gatekeepers. Considering what that's done to the playing field, I'm happy pivoting to more niche technology just to keep away from said churning agglomerations.
I'd argue that the mainstream, lowest-common-denominator tools are the ones which waste people's time. (Especially when they're backed by an incumbent. Deno, on the other hand, clicked immediately.)
>Finally, changes need to be stratified along lines of risk rather than code modularity or other dimensions.
Why don't those other dimensions, and especially the code modularity, already reflect the lines of business risk?
Lemme guess, you cargo culted some "best practices" to offload risk awareness, so now your code is organized in "too big to fail" style and matches your vendor's risk profile instead of yours.
> Why don't those other dimensions, and especially the code modularity, already reflect the lines of business risk?
I guess the answer (if you're really asking seriously) is that previously, when the cost of producing code so far outweighed everything else, it made sense to structure everything to optimise efficiency in that dimension.
So if a change was implemented, the developer would deliver it as a functional unit that might cut across several lines of risk (low-risk changes like updating some CSS sitting alongside higher-risk ones like a database migration, all bundled together), because that was what made it fastest for the developer to implement the code.
Now if AI is doing it, screw how easy or fast it is to make the change. Deliver it in review chunks.
Was the original method cargo culted? I think most of what we do is cargo culted regardless. Virtually the entire software industry is built that way. So probably.
What is the point of the mass surveillance in the first place? Control. Over what? Over human futures. Who will be hit worst by the mass surveillance regime? Those growing up under it.
For starters, independent self-education would become impossible. Millions more young people would be forced to choose between becoming fluent in whatever maddening proprietary nonsense their schools are paid to teach them - and ostracism and starvation. They would never know the validity that disintermediated computation lends to one's interior thought process. Many more people would grow up in a world of ubiquitous multilevel gaslighting instead of a world of free thought. And that would be those children's lives now.
Here's a bit of a doomsday scenario; you can pepper your dialogues with it when talking to people who are thinking of the children too hard, and you may find their reactions enlightening.
As enmeshed as personal computing and mass media already are with personal life, it can take an organized e***s-minded outfit scant generations to literally devolve your children into a servile underclass. Simply by making access to computation a tightly controlled privilege, and using that to amplify social inequality. (While their own kids get to play out the fantasies dreamt up for them by the colonial laureates of yore, i.e. be immortal trillionaire wizard aristocrats who can work "magic" because they get to learn actual sciences and not just some ever-changing APIs to them. Which would probably fall apart in a few generations making a huge mess of things, potentially permanently bringing down the global supply chain by mass incompetence - but how could they care?)
This is a global legislative assault against the greatest novel liberty humanity has gained from technology for generations: the Internet is literally a means for anyone to project their disembodied thoughts at a distance! Whatever force is even capable of attacking that, it would not be playing for chump change. Nor is it likely to be the unimaginative sort of entity (unless, perhaps, these laws are part of an AGI bootstrapping itself throughout society?) which is why I'm being only slightly anxious about spitballing concrete patterns of defeat in view of it.
And even if we do not end up on the branch of reality where social inequality gets written into the genome and the bloody e***ists win - forcing minors to identify themselves online is sure to facilitate the global cultural conveyor belt that winds through Willy Wonka's Consent Factory Island and beyond.
Plenty of "think of the children" arguments either way if that's how they're playing it. It's a reflexive, non-rational argument, from the same firmware update as "your mom is sacred" (i.e. good luck being child or partner of abuser who had kids to become untouchable). So yeah, do think of the children. Think of their futures. You cowards.
An absolute free market would, by definition, permit the selling of the service "restrict someone's freedom for me".
Not sure if that leaves it a free market. So if we're gonna be talking holes in the cheese - seems like you're reasoning in terms of a basically self-contradictory notion.
But truly, what do you reckon about the first point, under whatever interpretation of market freedom you're using?