The main issues are thought to be
Tampering with airbag timer ignition
Falsely recording test results
Falsification of test speeds
Falsely recording tire pressure
Replacing the acceleration data of the upper portion of passenger seat
The article goes into more depth on each item; DeepL is your friend if you don't read Japanese
I looked into it a few years ago and used it for a couple of hobby projects. I really liked it but I was never going to be able to persuade my company to use it unless it became more popular.
Rust is trying to solve the same problem of being both safe and as performant as C/C++. Although it came along later than D, it seems to have been much more successful in gaining traction, most likely because it had the backing of Mozilla.
I would argue that Rust's success compared to D is largely due to its solution to a major problem in C++: memory safety. Beyond that, Rust and D both offer several improvements over C++ in various ways. However, these incremental enhancements are often insufficient to motivate a transition to new languages in an industrial context. It's challenging to justify such a shift based solely on minor improvements that mostly enhance developer quality of life, especially considering the risks associated with interoperability with existing codebases, the availability of skilled developers, and the general preference for conservative approaches. Even with a major improvement like memory safety, which significantly ups the value proposition, it took nearly a decade after Rust's 1.0 release to really gain traction. The power of established practices is formidable.
People who are at home in a memory managed language tend to have little interest in a language that is slightly more low-level, but still memory managed. I believe that this is the main contributor to Rust's rise. Even if I suspect that most production Rust code is from people coming from the memory managed side, looking for a safer way to avoid any performance compromise.
The quest to improve Java's and .NET's low-level capabilities, along with Swift, Nim, Chapel, Linear Haskell, and OCaml effects, shows otherwise.
Rust's sweet spot is the OS layer and bare-metal workloads, where, as in high-integrity computing, heap allocation is either forbidden or allowed only in very controlled scenarios.
I didn't say that there's no desire for low level capabilities: without that, nobody from managed environments would care about Rust. But to overcome the skillset inertia that keeps people in the language they are already good at, the gap needs to be bigger than "it's still gc, but the runtime is slightly more lightweight". I'd rather consider those projects as evidence of how high that "different enough" threshold needs to be.
Back in 2003 I loved the idea of D, even if I never used it. But then I also loved the idea of C++/CLI, so I would not put too much weight on my judgement. My opinion about D has changed far less: I still have a soft spot for it, just not enough to make the jump.
I used D as a hobby back when the D1/D2 split was not yet fully resolved, probably around 2008--2009. I recall that I was never sure whether I should switch to D2 right away, as I had written quite a bit of code in D1. My original goal was to port some of my games from C to D, as I was already familiar with PARSEC47, written in D1. But the ecosystem was still at an early stage, so my project quickly derailed and I ended up writing an alternative standard library for personal use [1].
Rust hit 0.1 around the same time, and my attention ultimately turned to Rust when it hit 0.5, the version that cemented the concepts of lifetimes and the borrow checker. The same thing happened for the same reason; that's why I built Chrono and other well-known libraries in the first place. But I think I stuck with Rust probably because it had a very crude but working package manager in the earliest release. (While it was initially named Cargo, it was renamed to rustpkg and then replaced with a new Cargo shortly before 1.0.) So I had some reason to continue working on my libraries, because people were actively looking for the features they provided, whereas I hadn't seen any such movement around D.
I still don't know whether Mozilla was crucial to the observed differences between D and Rust. I do believe that Rust needed Mozilla to succeed, but that looks orthogonal to my anecdotes. My current guess is that Rust was the first major programming language that was entirely hosted on GitHub from the beginning. [2] That arguably made it much easier for people to search for Rust libraries and collaborate on missing pieces. That's probably what allowed Rust to evolve through multiple breaking changes before 1.0, and the timely introduction of the current Cargo also played a role. [3]
[1] Fun trivia: Some of my (in)famous Rust libraries originated from those experiences!
[3] Of course it took a lot more time for Rust to become a language that can never go away. That point is generally thought to be the introduction of `async` in 2019, because I've been told multiple times that it was the last major requirement shared by many stakeholders.
D has a bigger scope than Rust. It is Rust's scope plus use cases where a garbage collector is appreciated, and it provides more metaprogramming mechanisms.
It might be the most comprehensive programming language ever.
Although I haven't used Nim, something I miss while using Rust is `static if`. You can sort of approximate a subset of this with `#[cfg(...)]` attributes in Rust.
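A minimal sketch of that approximation, assuming a hypothetical Cargo feature named `fast_path` (unlike `static if`, `#[cfg]` can only branch on build-time configuration, not on arbitrary constant expressions):

```rust
// Two versions of the same function; exactly one is compiled in,
// selected at build time by a (hypothetical) Cargo feature flag.
#[cfg(feature = "fast_path")]
fn algo() -> &'static str {
    "fast"
}

#[cfg(not(feature = "fast_path"))]
fn algo() -> &'static str {
    "portable"
}

fn main() {
    // Built without `--features fast_path`, only the portable branch exists.
    println!("{}", algo());
}
```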
`when myCondition():` instead of `if myCondition:` is evaluated at compile time.
Alternatively you can use a `static:` code block to force compile-time evaluation, or tag a function with `{.compileTime.}`, or tag function inputs with the `static` modifier.
It is possible to create a compiler or an assembler running fully in Nim macros as well:
Rust put the focus in the right place, perhaps not entirely on purpose at first, but having learned that this works, they've doubled down.
Culture comes first. If you have a safety culture, it supports and enhances safety technology, and the resulting software has better safety properties than you'd get even if your technology had been just as good without the culture. If you start with the technology instead, a culture which isn't interested just undoes all your good work.
Look at C++ span. This is a slice type, roughly equivalent to Rust's `&[T]`. As originally proposed it had good safety properties: its index operators were bounds-checked and it provided safe but fast iteration. WG21 got hold of it, and `std::span`, the resulting standardized feature in C++20, doesn't have any bounds checks, destroying the safety properties. That's the product of a culture which doesn't value safety.
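For contrast, a small sketch of what bounds-checked slice access looks like on the Rust side:

```rust
fn main() {
    let xs = [10, 20, 30];
    let slice: &[i32] = &xs;

    // Indexing is bounds-checked: an out-of-range index panics
    // instead of silently reading out-of-bounds memory.
    assert_eq!(slice[1], 20);

    // The non-panicking API makes the check explicit in the return type:
    assert_eq!(slice.get(1), Some(&20));
    assert_eq!(slice.get(5), None);

    // Iteration needs no per-element check at all, so it stays fast.
    let sum: i32 = slice.iter().sum();
    assert_eq!(sum, 60);
}
```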
I like D. I just wish it didn't try to chase every trend. Rust has memory safety? Oh, we'll add @live and @safe, which will kind of implement some of that. Oh, no one wants to put 5 annotations on every function, and most already-written code doesn't use them anyway? Too bad.
I always viewed D as a new language that tries to offer the convenience of C#, but as a native language. Sometimes, though, I feel like it's drifting in a different direction: toward a nicer C++. I would prefer C/C++ support to be a thin abstraction layer for legacy code, but instead there seems to be more and more integration with C++.
Even language features in D like copy constructors are based on matching the behavior of C++. If I wanted to use C++, I would just use C++ instead of bothering with bridging C++/D code. I think that's one of the appealing things about languages like Rust and Zig: they don't even try to appeal to C++ programmers. Let C++ programmers enjoy C++, but let everyone else move on.
My parents were deaf, so there wasn't much singing to me when I was a baby. But I managed to pick up spoken language anyhow so I doubt that singing is vital as the headline suggests.
Generally as long as babies aren't deprived or abused when they are young, they'll grow up fine.
I'd well believe it. I used to write computer vision applications for semiconductor manufacturing equipment, where we were able to strictly control the distance from camera to object, the lighting, etc., and even then getting the necessary reliability was not simple. When a failure could lead to damaging a whole wafer, i.e. hundreds of thousands of dollars, 99% accurate is not good enough.
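To put rough numbers on that claim (every figure below is an illustrative assumption, not from the original comment; it also pessimistically assumes every wrong result scraps a wafer):

```rust
// Back-of-envelope expected-loss estimate; all inputs are assumed numbers.
fn expected_daily_loss(
    inspections_per_day: u64,
    error_rate_percent: u64,
    cost_per_wafer: u64,
) -> u64 {
    // Worst case: every erroneous inspection result damages a wafer.
    (inspections_per_day * error_rate_percent / 100) * cost_per_wafer
}

fn main() {
    // 99% accuracy => 1% wrong results; throughput and wafer cost
    // are hypothetical order-of-magnitude figures.
    let loss = expected_daily_loss(1_000, 1, 300_000);
    println!("expected daily loss: ${loss}");
}
```

Even with these crude assumptions, a 1% failure rate compounds into millions of dollars of exposure per day, which is why "99% accurate" falls short in this setting.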
It would be very questionable to use this in production without an error (confusion) matrix being provided, especially as pearls are very glossy and costly. It would seem more reliable to use the camera as a blocked / non-blocked sensor with what would amount to a cheap coin sorter.