I presume you meant a one-way NDA; your overall point is a really good one. Contracts are very useful as a leading indicator of how the counterparty thinks about the relationship.
If you go interview at a Facebook office, you'll be asked to sign one before they let you in. Basically - "we'll tell you stuff. That's confidential. You tell us stuff. That's not confidential".
FWIW, professional liability insurance can absolutely make sense, and even be necessary, when writing software, depending on the nature of the contract and your overall responsibilities.
FWIW, in my experience building both, hardware is always finished first, because it’s cheaper to change the software later in the cycle. Much like drywallers patching over electrical/plumbing sins, software fills gaps …
> Fwiw, all (lossless) compression algorithms will increase the size of some inputs.
They rarely increase the size meaningfully, though. Typical compression algorithms, when faced with hard-to-compress sections, can wrap them up by adding just a handful of bits, for a total overhead of a fraction of a percent.
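A quick way to see that bound, assuming zlib is available: DEFLATE falls back to "stored" blocks for incompressible data, whose framing costs about 5 bytes per 64 KiB block, plus a small header and checksum. A sketch:

```cpp
#include <cstdio>
#include <random>
#include <vector>
#include <zlib.h>

int main() {
    // 1 MiB of uniform random bytes: essentially incompressible input.
    std::vector<unsigned char> input(1 << 20);
    std::mt19937 rng(42);
    for (auto& b : input) b = static_cast<unsigned char>(rng());

    // compressBound() gives the worst-case output size for this input.
    std::vector<unsigned char> output(compressBound(input.size()));
    uLongf out_len = output.size();
    compress2(output.data(), &out_len, input.data(), input.size(),
              Z_BEST_COMPRESSION);

    // For random input, DEFLATE emits stored blocks: roughly 5 bytes
    // of framing per 64 KiB, plus the zlib header and Adler-32 trailer.
    std::printf("in: %zu bytes, out: %lu bytes, overhead: %.4f%%\n",
                input.size(), static_cast<unsigned long>(out_len),
                100.0 * (static_cast<double>(out_len) - input.size())
                      / input.size());
}
```

Compile with -lz; the printed overhead should come out well under 0.1%.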
When QOI encounters noisy pixels, by contrast, it has to spend an extra byte per pixel: a pixel matching none of its run/index/diff rules is stored as a full QOI_OP_RGB chunk, 4 bytes for a 3-byte RGB pixel, inflating the file by 33%.
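For concreteness, here is roughly what that fallback path looks like (a sketch following the public QOI spec; emit_rgb_fallback is an illustrative name of mine, not the reference encoder's):

```cpp
#include <cstdint>
#include <vector>

// When a pixel matches no run/index/diff rule, the encoder emits a
// full QOI_OP_RGB chunk: a 0xFE tag byte plus the three channel bytes.
// A 3-byte RGB pixel therefore costs 4 bytes on the wire, the 33%
// worst case mentioned above.
void emit_rgb_fallback(std::vector<std::uint8_t>& out,
                       std::uint8_t r, std::uint8_t g, std::uint8_t b) {
    out.push_back(0xFE);  // QOI_OP_RGB tag (per the QOI spec)
    out.push_back(r);
    out.push_back(g);
    out.push_back(b);
}
```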
"On paper this was supposed to lead to the immanent collapse of it's service."
I don't know anyone who expected this. The typical failure mode is slow degradation and a lack of new development, not sudden collapse: services become flakier, innovation stops. There was probably some fat to cut, as you put it, but the concept of eating your seed corn is also relevant.
It may have been the wrong time, too. 1999 was chock full of companies that failed to get traction and died during the dot-com collapse, yet variants of the same ideas became much more successful 20 years later. Much of that was waiting on infrastructure, I suspect.
Well, being homoiconic and dynamic helps quite a bit... That said, if you squint a bit and get used to the syntax, C++ variadic templates are just a compile-time lisp (really, templates are just generalized functions over types), and the template mechanism is 100% pure. Add the capability of evaluating, at runtime, those pure monadic computational effects defined at compile time, and there is no boundary any more (not that it's something that should be done all the time).

The main advantage over functional languages, then, is that C++ optimizing compilers are already pretty good. Assuming you can afford to re-compile the tight inner loops or critical paths at runtime (say at "configuration time", when adding some latency might not be a big deal), a lot of otherwise impossible optimizations could probably be done better: loop invariants, polyhedral transforms, unrolling, constant propagation, aliasing, row-major to column-major layout changes, etc. The result would probably also be better than what a JIT compiler and profiler could achieve.
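To make the "compile-time lisp" claim concrete, here's a minimal sketch; list, car, cdr, and size_sum are illustrative names of mine, not standard-library facilities:

```cpp
#include <cstddef>
#include <type_traits>

// A cons-style typelist with car/cdr, plus a pure compile-time fold,
// all reduced by the template machinery before anything runs.
template <typename... Ts> struct list {};

// car: head of a non-empty list.
template <typename L> struct car;
template <typename H, typename... T>
struct car<list<H, T...>> { using type = H; };

// cdr: the rest of the list.
template <typename L> struct cdr;
template <typename H, typename... T>
struct cdr<list<H, T...>> { using type = list<T...>; };

// A compile-time fold: total sizeof() of every element type.
template <typename L> struct size_sum;
template <>
struct size_sum<list<>> : std::integral_constant<std::size_t, 0> {};
template <typename H, typename... T>
struct size_sum<list<H, T...>>
    : std::integral_constant<std::size_t,
                             sizeof(H) + size_sum<list<T...>>::value> {};

// Evaluated entirely at compile time, like a pure lisp reduction.
using regs = list<int, double, char>;
static_assert(std::is_same_v<car<regs>::type, int>, "head is int");
static_assert(size_sum<regs>::value ==
              sizeof(int) + sizeof(double) + sizeof(char), "pure fold");
```

Everything here is reduced by the compiler the way a lisp would reduce a fold over a cons list; nothing survives to runtime except the asserted constants.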