Being no expert at c++ (far from it..) and knowing nothing about carbon I'd expect that you'd usually need to do at least some cleanup amongst those "~4 languages" (nice way to put it!) on the c++ side before enjoying interop.
But I'd also assume that much (all?) of that cleanup would be stuff that would be worth doing anyway, even if you stuck with c++, but that you keep postponing...
The issue is that it's very hard to "clean C++ up" because backwards compatibility must be upheld (there are obviously good reasons for that). C and C++ have terrible defaults that mostly cannot be changed anymore. You can try to add warnings for everything (people dislike and disable them), static analysis, valgrind (not trivial to set up and use correctly), etc. etc. You can add new stuff (e.g. smart pointers) and mandate (e.g. via the C++ Core Guidelines) that the old ways (new/delete) should not be used. I could continue forever describing measures aimed at making C++ safer.
In my opinion all these things have failed. The language is old, has insane amounts of legacy bureaucracy and process tied to it, has terrible unchangeable defaults, an unchangeable ABI, and is insanely complicated to "get right" (write a function that adds two signed integers and returns the result, with no undefined behavior) to the point where there is an established tradition of not even trying to get it right. Yes, smart pointers are nice, concepts are also nice, but this patchwork does not compose well with the rest of the language, and what this "improvement" process actually does is make it more complicated. Almost no one really understands it anymore.
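To make the parenthetical concrete: signed integer overflow is undefined behavior in C++, so a fully UB-free addition has to check for overflow *before* adding. A minimal sketch (the name `checked_add` is mine, not a standard facility):

```cpp
#include <cassert>
#include <limits>
#include <optional>

// Returns a + b, or nullopt if the addition would overflow.
// The checks must happen before the '+', because performing a
// signed addition that overflows is itself undefined behavior.
std::optional<int> checked_add(int a, int b) {
    if (b > 0 && a > std::numeric_limits<int>::max() - b)
        return std::nullopt;  // a + b would exceed INT_MAX
    if (b < 0 && a < std::numeric_limits<int>::min() - b)
        return std::nullopt;  // a + b would go below INT_MIN
    return a + b;
}
```

(C++26's `std::add_sat` and compiler builtins like `__builtin_add_overflow` address the same problem, which rather proves the point about how much machinery a "simple" addition needs.)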
It's time to retire it and start over completely, as well as reconsider if all the things that C++ has been traditionally used for warrant such a language. Things like GC languages or Rust/Zig should fill that space. They have decent interop where needed. Meanwhile C++ is not going away for a very long time. I see this project as building another language into C++ that makes it even more complicated.
Yes, c++ (the language) can't be properly cleaned up, because support for all legacy codebases, not just some of them, isn't optional at all. But I wasn't talking about cleaning up c++ the language, I was talking about cleaning up c++ codebases to align them with whatever subset of c++ will play nice with carbon. Surely that won't be everything, whatever obscure leftover from 1984 you dig up, but it will likely align well with good, somewhat modern (or modernized) codebases. That cleanup process will likely end up restricting itself to a certain subset of c++ that might in fact be a good candidate for a "somewhat cleaned up c++ language", but I agree, that wouldn't be worth it: a properly different thing designed for interoperability with a large subset of c++ will be much easier to learn than a literal subset.
I just don't believe people will do that. How do you justify investment in "cleaning up" a huge legacy codebase that no one fully understands anymore, written in a patchwork of programming languages that almost no one really understands anymore, by gradually converting it into a new experimental dialect of said languages that might die a few years down the line?
Javascript -> Typescript is just a different domain, one where things move fast, break, and are suddenly replaced or thrown away. This C/C++ stuff is slow. The industry is too big, backwards, and fragmented to do anything other than start over with a completely new language where possible, and, where it's not possible, just keep cleaning the dust off the antiques and hoping.
There are huge C++ projects that should be in e.g. Java if started now (and I really dislike Java). On the other hand, there's an ongoing effort (it's been there for decades!) by a cult-like group of C++ programmers, convincing themselves that they can write memory safe code, and trying to convert embedded C programmers and convince them to stop dereferencing volatile hex addresses copied from decades-old pdf manuals (e.g. because that's UB). These people still don't use compiler optimizations, because who knows what those might do. I've used "modern" embedded C++ and mostly gotten reactions along the lines of "C is better, this new stuff is no good, abstractions obscure stuff, how are we supposed to debug which of those hex addresses don't match the pdf?" What do you do with that? It's not unreasonable: the old way worked well enough, and changing things is costly and dangerous.
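For anyone unfamiliar with the idiom being described, "dereferencing volatile hex addresses from a pdf manual" looks roughly like this. The address below is made up, and the helper name `reg32` is mine, not any vendor's API:

```cpp
#include <cassert>
#include <cstdint>

// The classic embedded pattern: treat an integer register address
// from the datasheet as a pointer to a volatile 32-bit word.
// Round-tripping an arbitrary integer through reinterpret_cast like
// this is outside what the C++ standard guarantees for addresses
// that don't come from a real object -- which is exactly the UB
// tension the comment above describes.
inline volatile std::uint32_t& reg32(std::uintptr_t addr) {
    return *reinterpret_cast<volatile std::uint32_t*>(addr);
}

// Traditional usage on real hardware (0x40021000 is a fake address):
//   reg32(0x40021000) |= (1u << 0);  // set an enable bit
```

On a hosted system you can only exercise this safely with the address of a real object, which is the one case the standard does bless.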
I believe this mess is not a technology problem that can be solved by adding more language features or dialects to C++.
It certainly depends on how much cleanup a given codebase really needs. But if your only other choices are reimplement everything in rust/java/whatever, never ever touch the core and stick to some FFI scripting layer and never hire anyone who doesn't have at least two decades of c++ on their resume, does carbon really look that unattractive?
What impressed me about carbon (apparently so much that I sound like an embarrassing fanboy even to myself, heh) is how determined they seem in trying to avoid scope creep. Usually it's quite the opposite: every field of programming that's not entirely ruled out gets declared home turf that will eventually be revolutionized by the language (because a prototype exists, and it looks pretty because it's still in that cute puppy stage where all the gnarly bits are conveniently left out).