One major gripe I have with these C++ updates is that the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero. The more esoteric this language becomes, the fewer people can actually master it.
I’ve been writing C++ for over 20 years. The language is a freak show, combining solid industrial tooling and userbase with development efforts led by a clown car full of pretentious academics and people who just want to bolt on new stuff for no good reason except to "keep the language fresh".
C++ is not supposed to be fresh. It’s supposed to be portable, and to allow fine-tuning of programs down to bare metal while still allowing a sort of high-level implementation of APIs (though often fragile and badly designed).
Some new features are excellent, others are not, and the language's history is plagued with weird feature gaps obvious to anyone at all familiar with more consistent languages.
So if something feels weird, there is always a good chance it’s not you, it’s the language (committee).
C++ certainly suffers from somewhat of a kitchen-sink nature. However, if you consider two of its design goals being:
* Support for multiple, different, programming paradigms.
* Very strong backwards compatibility, all the way back to C.
... then some "freakness" is to be expected. And I do believe some of the additions (to the library and the language) have been excessive. However, I disagree with your characterization of language development work.
1. Most people on the committee, AFAICT, are from industry rather than academia. And if you consider national bodies, I think the ratio is even higher.
2. "Keeping the language fresh" is not a goal and not what the committee does. Most of what's added to the language are things that people have been complaining about the lack of for _decades_.
3. Feature proponents are the ones who want to "bolt on new stuff". Committee members are tasked with preventing new stuff from being just bolted on.
4. Some new additions are necessary, and others are not necessary but useful, for "tuning programs to bare metal".
Finally - I agree that committee work has the drawback of less consistency, and there are definitely warts. But for an established language with huge existing codebases and many stakeholders, and with the design goals I mentioned above in mind - an international committee and consensus-building is better than appointing some benevolent dictator.
> Most of what's added to the language are things that people have been complaining about the lack of for _decades_.
And are useless now, because everybody who had that problem either already solved it (the solution may have been "use another language") or realised it wasn't worth the hassle.
I guess the best examples are `std::format` or `std::thread`.
> But for an established language with huge existing codebases and many stake-holders, and with the design goals I mentioned above in mind - an international committee and consensus-building is better than appointing some benevolent dictator.
That depends, but yes, everything is better than letting Stroustrup "decide".
I don’t think the committee / proposals process is necessarily bad. It is a good way to develop a formal specification for a portable and highly complex language with many pitfalls and serious, industrial-level legacy compatibility requirements.
It might be better if it had a true BDFL, instead of a spiritual guide, and I do worry about the committee getting too far ahead of the industry and leaving it behind, plus what will happen when Stroustrup finally retires in earnest.
But yeah, now and then it does produce a turd, and there’s only so much turd-polishing you can really do.
I guess I’m just saying it’s a development model with pros and cons. The pros are necessary. The associated cons are inevitable.
To be specific, I was not criticizing or promoting any particular governance or design model. Just that this particular authority has had its more dysfunctional moments in its output - one should not presume all features of C++ are splendid examples of software design.
You can find std::string_view (C++17) in Google's WebGPU implementation [1], static_assert (C++17) in Protobufs [2], <bit> (C++20) in React Native [3], and std::format (C++20) in Cuda Core Compute [4]. So the big names in tech aren't afraid to add -std=c++20 to their build scripts. On the other hand, C++23 features aren't as common yet, but it's still very fresh and MSVC support is WIP.
I'd venture a guess that string_view, static_assert and bit were already part of the respective codebases, just as in-house versions. These are very commonly used, so seeing them get adopted is completely unsurprising.
However, the adoption rates of the features that are genuinely new are way lower. From what I see, lots of projects still use the language as C with Classes, basically, and that ain't going to change any time soon. The GP nailed it - C++ is adding a lot of esoteric stuff that very few people actually need or want.
Imagine how widespread the use of Java 8, .NET Framework, Python 2 and C89 still is around the industry, and now apply that to C++ versions.
There is a reason why C++17 is the best we can currently hope for in what concerns portable code, given the actual support across industry compilers, and company project guidelines.
Many embedded shops might still be discussing between adopting C++11 or C++14.
I agree, but there's a big difference between saying some industries or companies are still targeting old standards and saying there's "near zero" adoption of new standards. The latter just isn't accurate from what I see.
I've been writing C++ for well over 30 years. I'm currently employed full-time maintaining the C++ toolchain, runtime, and standard libraries for a major commercial embedded OS. I see a lot of C++17 being used by my customers every day. It's there, running everything around you.
C++20 is still too fresh for my industry, especially for embedded where runtimes require certification for functional safety. Maybe in two years.
What can I tell The Committee? Stop. No, we don't need a single central ex cathedra library for networking. Or graphics. Or SIMD. Even the existing filesystem library is so broken it's dangerous (the standard specifies that using it on a filesystem being modified out from under you is undefined behaviour - which means using <filesystem> against a real, live filesystem could provoke the legendary nasal demons just by being run). Stick to generic basics and leave the specialized stuff that not everyone needs to third-party libraries. Nothing wrong with a marketplace of libraries serving an entire economy of requirements.
Standard SIMD everyone can build on top of sounds like a great idea - no unnecessary fragmentation due to using different, subtly (or not so subtly) incompatible libraries. SIMD instructions have been in desktop CPUs since the '90s. It is long overdue.
> the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero
~Nobody uses all the recent features, but some new C++20 stuff does get adopted very quickly, like three-way comparisons, constinit, abbreviated function templates, etc.
For C++23, support for it is severely lacking in MSVC at least, so that's going to severely impact users.
I'm puzzled by this statement. In all three places I've worked in the last 7 years, we actively pushed for the newest language standards. We're very eager for the C++23 switch to arrive so we can finally derive from std::variant. And we're using a good subset of C++20 currently.
In some ways, you’re not wrong. In other ways, there’s been extremely broad support for some major new features in the language in recent years, like coroutines and concepts.
>One major gripe I have with these C++ updates is that the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero
It depends what industry you're working in. A lot of HFT shops keep up to date with the latest compiler and make extensive use of new features that improve the ergonomics and compile-time performance of template metaprogramming, which is important for achieving the lowest possible latency.
> One major gripe I have with these C++ updates is that the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero.
Anecdotally, all the stuff I do has been minimum C++20 for a few years. If you're using e.g. Qt 6, released in 2020, you're using C++17 features without knowing it; the same goes for recent versions of Boost, which depend on C++17 or C++20 depending on the features / libraries.
Same with cars, buildings made of newly discovered materials, and electronics. I would argue this is a good thing, for roughly the same reasons - rewriting software to use the latest and greatest language features is usually not efficient.
Common reasons for staying on an older standard:
- Can't update the compiler (e.g., porting the codebase to the new compiler is too complicated).
- No compiler support for the new standard on a specific platform that one still wants to support.
- Too much work to update the whole code base to work with the new standard.
- A 3rd party library is not supporting new standard yet.
- The team is reluctant to have to learn new technologies.
Some are somewhat valid reasons, some less so, and some are indications of deeper problems.
(P.S: My C++ code base is using C++20. Didn't move to C++23 yet because I think some customers might not be ready for it yet for one of these reasons, but I'm going to push for it at some point.)
Compiler support for the platform is the general limit. C++ is very good about not breaking old code so old codebases are easy enough to port and anyone who refuses to learn can keep using the old ways.
> One major gripe I have with these C++ updates...
"One major gripe I have with cars is the number of people that know how to drive one is very close to zero."
Where I work (big tech) everything is C++17. I don't know what the schedule is, but in a couple of years every Bazel and CMake file will get bumped to C++20. And so on.
Because ISO languages are usually designed on paper, with some features prototyped on whatever compiler the respective paper author feels like, compilers only rush to fully support a new standard once it's officially published.
With Python, there is no standard per se, it is whatever CPython does, and everyone else has to try to mimic CPython.
I don’t really get this argument. Large C++ codebases are generally divided into libraries. The internal libraries and vendor libraries should both be of high quality. I’m not familiar with industrial use cases where C++ users would not also be library writers.
> The internal libraries and vendor libraries should both be of high quality.
From my limited experience - high-quality internal libraries are simply not the reality; less likely to be achieved than winning the lottery. Companies typically:
* are not able to identify candidates capable of writing high-quality C++
* do not try to attract SW engineers by committing to high-quality code.
* don't believe they should invest developer time in making a library more robust and bringing it to the level of polish of popular publicly available FOSS libraries.
* do not have a culture of acquiring, honing and sharing coding skills and expertise, with the help of actual experts. Again, time and effort is mostly not invested in this.
Either you’ve worked with rookie developers (which is fine, but not ’expected industry baseline’) or in an engineering core lacking years of C++ development. Doing stuff ’the right way’ does not generally need extra resourcing - you simply do it the right way.
Quality gaps like those described above - I think this happens when you try to develop C++ without actual experience in C++. C++ is so weird that anyone trying to ”do the right thing in the language they are most familiar with” generally gets it wrong for the first few years. And then you end up with a quagmire nobody wants to volunteer to clean up.
This is not a skill issue as such, or a lack of talent. C++ simply is so weird, and there is so much bad ”professional advice”, that you are expected to lose a few limbs before being able to navigate a design landscape full of mines.
> And then you end up with a quagmire nobody wants to volunteer to clean up.
Not only that, but the rookie developers coming in get inculcated into that. That's what they're used to, and they have all the motivation to continue writing poor code, because they need to avoid their better code clashing with what's already written - clashing compilation-wise and style-wise.
Of course, it's not 100% all bad, there are gradual improvements in some aspects by some developers.
No, and they probably shouldn’t. Most internal libraries don’t have, and don’t need to implement, novel complex template-based specialization - not in their API at least. And stuff that’s internal to a library only needs to implement what the API contract requires - which usually does not require the rigmarole of a fully generic ’modern’ template-based implementation.
Really excited to see that the battery is entering production. Feels like every year we hear about new batteries in the lab that are "2x-5x as efficient as Lithium ion" but they never seem to actually hit production.
Na batteries are going to be a huge deal. Assuming everything pans out with manufacturing, this will be a game changer for grid storage. Natural gas likely won't be cheaper than a battery plant, which will be a big deal for peaker plants.
Vercel's website (vercel.com) is just MIND-BLOWING. If y'all have seen their develop.preview.ship animation, it's nuts. How do people even think of such things!
We use Vercel and like it, but there’s a steady stream of regressions daily, and they’ve broken our builds by rolling out beta features before (we never agreed to be beta testers).
Also don’t really like how I have to go in and update my node version every time a new LTS comes along.
Can you detail what you're experiencing? What are you seeing regress daily, and what beta features?
I'm not with Vercel, but my company does use Vercel extensively for many major companies. We have thousands of different Vercel projects for a variety of different companies, none of whom are small fish, and your experience seems to be the complete opposite of ours.
It’s not me necessarily; it’s Vercel itself stating that they have regressions. For instance, today I received two email notifications from them: “Elevated build queue times for Hobby customers”. Yesterday I also received two about “Vercel dashboard errors”, and on the 8th I received 5 detailing DNS issues. I receive an average of 3 a day, and in the last two weeks I’ve gotten close to a hundred in total.