I would love to hear counterpoints -- The Sun Ray thin client experience seems interesting, but the modern version of that seems to be the web/app/cloud ecosystem we have now (where the load and storage of your interaction are resident on some other system, potentially freeing up your local device from resource needs). Specifically, a self-hosted collaborative model with Nextcloud + Collabora or similar. I do wonder what workloads or designs would be fit for a more "time-sharing" approach.
I used a Sun Ray for two summers when interning at Sun Labs in 2002 and 2003. They were kind of awesome and kind of sucked at the same time. First, the display wasn't that great for the time, and they were expensive for what you got. Second, we had ours hooked up to (one of) the labs' E10ks. Because it was shared amongst many users, some things could get janky under load. One thing in particular is that certain image formats were heavier than others, and the crappy Firefox at the time could cause jank for multiple users when processing image-heavy webpages.
It was a neat party trick to take your ID card out of your terminal, walk down the hall, put it into someone else's terminal and boom, have your session, but that happened more rarely than you might think.
All in all I think I would prefer a workstation.
Admins, on the other hand, probably preferred these. If the thin clients had been more like $100-200, this would have taken over the world. But they were more like $1000+, and Sun considered that a bargain. Which shows you what Sun thought of consumers.
I do wonder if a lot of the stuff that Google has worked on (Google Docs, Chromebooks) was inspired by Sun. Eric Schmidt was an executive at Sun and Novell before joining Google.
Before 1913, state legislatures elected their US Senators. Since 1913, Senators have been directly elected, but to longer terms than their peers in the House, as a way to make them less beholden to the whims of the zeitgeist and more stable in their consideration of "what serves the state": they do not face elections immediately, and the results of their work are meant to be evaluated over a longer period. -- this is the intent; reality may bear out differently
I fully expected this to be their entry into Kubernetes automation (brass instruments, pianos of all kinds, motorcycles, outboard motors -- why not?)
The magic here is that all of the components beg for this sort of thing as an obvious next step, but this seems to be done really well. The value is always in the integration of the component systems.
This is a huge factor, and heavily influenced by the purveyors of the technologies involved. A factor I hadn't considered, and one implicated in this transition, is the shift from the teacher-led classroom to the device-led classroom. When the laptop or tablet is the delivery tool, the teacher is no longer seen as the expert, the interpreter, the model figure of the subject. Students learn that the teacher is a facilitator, likely not up to date on the latest changes to the app interface, and not an authority on the subject.
Device-delivery instead of teacher-delivery puts the student first, even when the student knows nothing, and has zero impulse control.
So instead of modelling a productive and enriching data accessing environment, we're actually just tearing down the walls of the school and asking teachers to babysit the mayhem.
My 34-year-old base-spec Chevrolet has digital controls for timing advance and fuel trim, plus integrated engine and transmission control units. But my dash has some analog components (fuel level is variable voltage instead of PWM). The mechanics would all say that my truck is very simple and "old school".
The lay use of 'analog' is far removed from function. As long as there isn't a screen, it isn't seen to be digital. I studied photography in college and loved shooting film. I have a processing machine that is based on a 6502, and when people would talk about non-digital things as "analog" it would bug me (one is chemical, and the other is a computer).
The last real analog stuff would be either carb'd bikes / cars or mechanical fuel injection, which is the worst of both worlds.
However, those ECUs are more closely related to embedded programming than to the digital dial-outs and SIM-card-equipped cars with an internal CAN bus network we have these days: analog and digital inputs and outputs driving a closed-loop controller.
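A toy sketch of what that closed loop amounts to, in Python rather than anything resembling real ECU firmware -- the voltage thresholds, step size, and function name here are all invented for illustration:

```python
def fuel_trim_step(o2_voltage, trim, step=0.01):
    """One pass of a bang-bang closed-loop fuel controller: a narrowband
    O2 sensor reads high (~0.8 V) when rich and low (~0.2 V) when lean,
    and the controller nudges the fuel trim the other way."""
    STOICH_THRESHOLD = 0.45  # volts; invented rich/lean crossover point
    if o2_voltage > STOICH_THRESHOLD:
        return trim - step  # running rich: pull fuel out
    return trim + step      # running lean: add fuel

# The trim hunts around stoichiometric as the sensor flips back and forth:
trim = 1.0
for v in [0.8, 0.8, 0.2, 0.8, 0.2]:
    trim = fuel_trim_step(v, trim)
```

The point is how little "computing" this takes: read a voltage, compare it to a threshold, adjust an output. That is the whole closed loop.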
For the first 20 years of automotive computers, they weren't really talking to each other, and when they were, it wasn't really bidirectional, and it wasn't typically on a bus -- unless you wanna call a dedicated wire a bus.
That was after the first 20 years of automotive computers, though, wasn't it, if only barely?
WP says, "In the early 1970s, the Japanese electronics industry began producing integrated circuits and microcontrollers used for controlling engines.[6] The Ford EEC (Electronic Engine Control) system, which used the Toshiba TLCS-12 microprocessor, entered mass production in 1975.[7]" Reference [6] says, "First half of 1970s: Japan starts developing ICs for automobiles ahead of the U.S.: Development of ICs for automobiles started with analog ICs for in-car entertainment, and was followed by 4-bit microcontrollers and other digital ICs for use with the wipers, electronic locks, and dashboard, and then by microcontrollers with 8-bit and wider bits for engine control."
But I don't know any more details. Was Toyota controlling its windshield wipers with a 4004 in 01974? Was Nissan controlling a speedometer with an RCA 1801 in 01973?
Anyway, if we date it from 01975, then 01995 would be year #21.
EEC-I exists because regulatory uncertainty forced OEMs to build in a place where they could cheaply change the logic around emissions-affecting systems. It is a computer in the same way a modern toaster is.
Figure five years between "developing" and "fielding". And if you ignore Plymouth Prowler-style exercises in putting cutting-edge tech into low-volume models to get practice doing so, it adds another few years, depending on what the item is and how badly the OEM wants it in the field.
In any case, by the 1990s these are computers comparable in complexity to the bare minimum it takes to run an oven from 2010 that has a digital timer and some automatic functions. They read inputs and implement simple if-then and timer logic. They don't communicate with other systems, and when they do, they pretend to be a simple sensor or actuator (depending on which end of the connection they're on). The closest thing you're gonna get to a "bus" is a shared ground or a shared reference-voltage circuit.
Take for example a hypothetical 1995ish Ford (EEC-IV, which was pretty advanced for its time) that combines every possible module you can have across the whole lineup. The best you're gonna get is five computers. The ABS module reads vehicle speed (VR sensor) and does its thing. It then sends out approximately the same voltage/frequency signal it got in to the transmission controller or ECU, which does the same thing and sends the signal to the digital odometer. The ECU directs the ignition module, but the ignition module isn't really a computer; it's a bunch of solid-state circuitry that implements essentially one function, turning high current on and off in response to a low-current signal, with some automatic loops in there (it's broken out from the ECU for cost reasons; on some models it's integrated). You also have a body control module, but once again, it's just simple dumb logic that people the world over implement with analog controls every day: "if engine off, then on door close wait 30 seconds before headlights off."
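That quoted BCM rule really is just an if/then plus a timer. A toy sketch, with the 30-second delay taken from the rule above and everything else (function name, inputs) invented:

```python
def headlights_state(engine_on, door_open, seconds_since_door_closed):
    """One evaluation of the rule 'if engine off, then on door close
    wait 30 s before headlights off' -- plain if/then plus one timer,
    nothing an era-appropriate microcontroller couldn't manage."""
    if engine_on or door_open:
        return "on"                       # normal driving / door open
    if seconds_since_door_closed < 30:
        return "on"                       # courtesy delay still running
    return "off"                          # delay expired: lights out
```

No messages, no state shared with any other module: the BCM just samples a couple of switch inputs and runs a timer.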
At no point is there bidirectional communication, nor is there any sort of bus anywhere. When there is complex feedback, where A tells B to do something and then cares about the result, it's architected such that B isn't implementing any logic: it's pretending to be a sensor and an actuator, taking a "do things" signal and returning an "I did things" signal (usually a voltage change).
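That "do things" / "I did things" handshake can be sketched like so -- the class name, voltages, and threshold are all hypothetical, chosen only to show the shape of the interface:

```python
class DumbActuatorModule:
    """Module B pretends to be a sensor plus an actuator: it takes a
    'do things' voltage on one wire and answers with an 'I did things'
    voltage on another. No messages, no bus, no logic worth the name."""
    IDLE_V, DONE_V = 0.0, 5.0

    def __init__(self):
        self.output_v = self.IDLE_V

    def drive(self, command_v):
        # Any input voltage above the threshold means "do the thing".
        if command_v > 2.5:
            self._do_the_thing()
            self.output_v = self.DONE_V   # "I did things"
        else:
            self.output_v = self.IDLE_V

    def _do_the_thing(self):
        pass  # e.g. engage a solenoid

# Module A reads B's output wire like it would any other sensor:
b = DumbActuatorModule()
b.drive(5.0)
ack = b.output_v  # high voltage back means the thing got done
```

From A's side, B is indistinguishable from a sensor wired next to an actuator, which is exactly the point.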
These are not in any way comparable to a modern car where everything shits messages onto various buses and things actually listen for messages, ignore what they don't need, output messages when they've done things, etc, etc.
The only people using CAN in the 90s were Ze Germans, because they're the ones who invented it.
>Was Toyota controlling its windshield wipers with a 4004 in 01974? Was Nissan controlling a speedometer with an RCA 1801 in 01973?
No. Toyota was using analog circuitry for that then and Nissan kept a physical cable at least into the 90s.
> 1981: General Motors introduced its "Computer Command Control" system on all US passenger vehicles for model year 1981. Included in this system is a proprietary 5-pin ALDL that interfaces with the Engine Control Module (ECM) to initiate a diagnostic request and provide a serial data stream. The protocol communicates at 160 baud with Pulse-width modulation (PWM) signaling and monitors all engine management functions. It reports real-time sensor data, component overrides, and Diagnostic Trouble Codes. The specification for this link is as defined by GM's Emissions Control System Project Center document XDE-5024B.[4][5]
This is still a far cry from modern cars using CAN buses for all kinds of things, but it's at least digital communication over a bidirectional data link. And then OBD-I is from 01988.
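The PWM-signaling idea in that quote, that a bit's value rides on how long the line is held high within a fixed bit window, can be sketched like this. The real ALDL framing and timing differ in detail; the sample rate, thresholds, and function name here are invented:

```python
def decode_pwm_bits(samples, samples_per_bit=10, high_threshold=0.5):
    """Decode a PWM-signaled bit stream: each bit occupies a fixed
    window, and its value is read off the duty cycle (a mostly-high
    window is a 1, a mostly-low window is a 0)."""
    bits = []
    for i in range(0, len(samples) - samples_per_bit + 1, samples_per_bit):
        window = samples[i:i + samples_per_bit]
        duty = sum(window) / samples_per_bit  # fraction of window held high
        bits.append(1 if duty > high_threshold else 0)
    return bits

# A '1' sent as 7 high samples out of 10, a '0' as 3 out of 10:
stream = [1] * 7 + [0] * 3 + [1] * 3 + [0] * 7
decode_pwm_bits(stream)  # -> [1, 0]
```

At 160 baud you have milliseconds per bit, so even a very modest microcontroller can sample and threshold a stream like this in software.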
I've seen a lot of hand-waving explanations of how boot-strapping happens (and granted it is very implementation-specific) but this is delightfully complete without veering off into the rough.
Interface schematic
Architecture reference and AVR binary output reference for programming
Achievable
Fun!
I used to use that clip to inspire people to laugh instead of groan while troubleshooting, and I honestly understand that, for a product manager, it's the wrong answer. But for a huge section of the population who don't have a ground truth against which to evaluate LLM slop, or where the definitions are not clear enough, I often see that the only measurable thing is time, and therefore it becomes the benchmark of good.
It reminds me of the Pentium 4 vs. Athlon XP days, when people compared raw clock speeds.
Time is measurable, but they're measuring the wrong interval. Speed to first draft means nothing if every subsequent change takes longer because the code is unmaintainable. The technical debt from slop compounds: you're not saving time, you're borrowing it at a terrible interest rate.
In the long run, the LLM that produces less slop per request wins.