UB is usually not as obvious as a divide-by-zero constant in plain sight. The divide-by-zero constant could be the result of a complex constant-folding operation involving values injected into the build via build options or code generation, and the same goes for any other instance of undefined behaviour.
I don't think any programmer puts UB into their code on purpose, even if they could enumerate from memory all 200 or so cases of UB in the C standard alone (no idea how long the list is in the C++ standard - thousands, maybe?).
The original sin was compiler writers exploiting UB for optimisations in bizarre ways instead of working with the C and C++ committees to fix the language so that those kinds of optimisations could be enabled without requiring UB, or at least to classify UB into different 'hazard categories' (most kinds of UB in the standard are completely irrelevant to code generation and optimisation).
> I don't think any programmer puts UB on purpose into the code
I agree with you. But the problem is not people putting UB in their code, either on purpose or by mistake: we all do that, every day!
The problem is people trying to defend their code once it has been made clear to them that it contains UB, and trying to fight the compiler rather than fix their error.
Do you maintain this stance when the UB is scattered around third party libraries that aren't especially welcoming of patches intended to obfuscate the code while placating some subset of compilers?
How about when code is written correctly and then later the standards body makes previously implementation-defined behaviour undefined? I've got some code that calls realloc which WG14 declared undefined years after I wrote it, and I doubt that experience is unique.
The evangelical attitude that the standard committee knows best and your code is wrong and you should immediately down tools to work around the compiler noticing the opportunity to miscompile it is quite popular. I think it's a really compelling argument to build nothing whatsoever on the ISO C/C++ stack and replace what you do have with something less hostile, aka anything whatsoever - rust, python, raw machine code written in hex - none of them have this active hostility towards the dev baked into the language design.
ISO C only specifies minimal requirements. It exists because people sit together and agree on those; if people do not agree, we cannot standardize it. We are also not the ones making your compiler miscompile your code - the compiler people write the compiler that miscompiles your code! The same people sit on the committee and do not agree to changes that would specify other behaviour. So if you do not like what your compiler does, this would be the place to go and complain.
For realloc, different implementations did different things and clearly said they will not change. There wasn't really any other choice. If your program was written for one implementation where it works, it can continue to do so, but it was never portable to other implementations. The standard now simply reflects this reality.
ISO says it's ok to aggressively rewrite programs on the assumption that no undefined behaviour ever executes. At least some compiler developers are incentivised to do whatever makes benchmarks faster.
Put the two together and you get a fast and fragile language implementation. I know why the benchmark people push the compiler in that direction. I'm doubtful that WG21 or WG14 especially want this emergent property.
My suspicion is that this is an accident of history that has too much unwarranted inertia behind it. The moral stance that it's all lesser programmers erroneously writing wrong code is aggravating in that context as it actively opposes anyone making things better.
But ISO does not explicitly say anything like "it is ok to aggressively rewrite programs...". ISO C says "x and y are defined, z is not defined by the standard, and an implementation is then free to do whatever it wants". Note that this is generally how standards work. You put the blame on ISO for giving implementers the freedom, but not on the implementers for exploiting it aggressively. Why do programmers still rush to the compilers that exploit this the most instead of choosing other ones? Why do they not complain more?

Unfortunately, blaming ISO has exactly the opposite effect of what you may want to achieve. It deflects criticism from the ones who make the decisions onto ISO, which can only harmonize behaviour but does not really have the power to force implementers to do anything, as the realloc story shows. Even worse, this criticism weakens ISO further, so it becomes even less likely that we will be able to fix things using the limited de-facto power a standard has.
No, it is definitely ISO C's (and C++'s) fault [1]. Without guidance on what is acceptable behaviour and what isn't, DWIM and "please don't break my code" is not a spec. Consider type punning via union, which is still UB in C++ while being explicitly supported by most implementations. Most users are wary of making use of it because of possible, if only theoretical, portability problems. More generally, provenance, aliasing and the basic memory model are a mess and need a more rigorous underpinning, after having been patched over the years with sometimes conflicting wordings.
Compare to the concurrent memory model: while DRF-SC still leaves plenty of UB, it is at least possible for a competent programmer to figure out the correctness of their code.
I certainly do not believe that it is realistic to stamp out all UB from C and C++ (at least while pretending that the resulting languages have anything to do with the original ones), but there is a lot the standard could do to limit the most egregious cases, possibly by providing different levels of conformance (as is done for floats and IEEE 754).
[1] of course implementors are part of the committee so they are not blameless.
I agree about the state of many things as you say, and especially the point about more rigorous underpinning. But the flow of innovation and progress is not meant to come from ISO and flow down to the compilers; the process is designed the other way round: compilers implement things and ISO integrates them into the standard to avoid divergence. An ISO committee neither has the power nor the resources to do what you want, and it is also not meant to work this way (we try nevertheless). Ideally, compiler vendors would work on fixing all these things and then ISO could simply standardize the result. But for this, one would need to put pressure on compiler vendors to actively work on those problems, and not blame ISO for not doing things it was never designed to do.