> Was literally just transcoding some video, playing a podcast, and browsing the web.
Yeah that's the perfect use case for current system design. Nobody sane wants to turn that case into an embedded system running a single process with hard deadline guarantees. Your laptop may not be ideal for controlling a couple of tonnes of steel at high speed, for example. Start thinking about how you would design for that and you'll see the point (whether you want to agree or not).
Apologies, almost missed that you had commented here.
Apologies if I'm missing something, but I had assumed that controllers for a couple of tonnes of steel at high speed would not use the same system design as a general-purpose computer does? In particular, I would not expect most embedded applications to use virtual memory? Is that no longer the case?
This isn't really answering my question. Have they started using virtual memory in hard real-time applications? Just searching the term confirms that the two are still seen as incompatible.
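For what it's worth, the closest thing I've found is the pattern where virtual memory technically exists but is effectively frozen: lock every page into RAM and run under a real-time scheduling policy so that neither a page fault nor an ordinary time-shared task can preempt the control loop. Here's a rough sketch of that pattern, assuming a Linux/PREEMPT_RT-style target; the POSIX calls are real, but the 1 ms period and priority 80 are made-up values for illustration:

    /* Sketch: periodic control loop that avoids page-fault latency.
     * mlockall/sched_setscheduler/clock_nanosleep are real POSIX APIs;
     * the loop body, period, and priority are illustrative only. */
    #include <sched.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/mman.h>
    #include <time.h>

    #define PERIOD_NS 1000000L  /* 1 ms control period (illustrative) */

    int main(void) {
        /* Pin all current and future pages into RAM so the loop
         * below can never take a page fault. */
        if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0) {
            perror("mlockall");
            return EXIT_FAILURE;
        }

        /* Run under the FIFO real-time scheduler so ordinary
         * time-shared tasks cannot preempt us. */
        struct sched_param sp = { .sched_priority = 80 };
        if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0) {
            perror("sched_setscheduler (needs root or CAP_SYS_NICE)");
            return EXIT_FAILURE;
        }

        struct timespec next;
        clock_gettime(CLOCK_MONOTONIC, &next);

        for (;;) {
            /* ... read sensors, compute, write actuators ... */

            /* Sleep until the next absolute deadline rather than
             * for a relative interval, so jitter doesn't accumulate. */
            next.tv_nsec += PERIOD_NS;
            if (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec += 1;
            }
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        }
    }

But as I understand it that only gets you soft/firm real-time; the hard real-time systems I've read about still run with the MMU off or with a fixed static mapping, which is exactly why I'm asking.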
In addition to search engines you can learn a great deal about all sorts of things using an LLM. This works well enough if you don't want to pay. They are very patient and you can go as deep as you want. https://duckduckgo.com/?q=DuckDuckGo+AI+Chat&ia=chat&duckai=...