Seems like the thousands of people who bought it care, myself included. My role has significant on-call responsibility. It's nice to be able to take it anywhere fully charged and know that for several hours it can reliably handle anything I throw at it.
From a security perspective, I prefer the separation from the x86 architecture if for no other reason than it imposes extra cost for exploit development.
That wasn't the point. On a long enough timeline, the security of all systems drops to zero. If the entire world is running x86, then the threat model for attackers revolves around abusing x86. If things aren't homogeneous, it raises the bar and the resources required to mount attacks.
As always with security, everything is a tradeoff.
Your argument is basically security by obscurity. You're better off in an ecosystem where a lot of attention is paid to exploits and patches than in one where it might be a long time before a zero-day becomes known and fixes are issued.
None of what I said is security by obscurity (which can also be an effective tactic, but obviously not the only one).
There are only so many human hours and minds interested or allocated to exploitation and offensive security. If everyone used the same architecture for everything, the economies of scale on the offensive side (due to state funded actors) would blow everyone else out of the water.
From a software perspective, Windows has an incredible number of skilled eyes on each patch release, yet we still see new exploits. Same for Linux. Likely the same for macOS.
All I'm advocating is that having separate hardware architectures is good because it raises the barrier to entry, even if only by the next marginal step.
Security by obscurity isn’t even bad. It’s only bad if it’s your sole defence.
I am confident that my non-default SSH ports, which only accept connections after a sequence of port knocking, add a slight bit of security at next to no cost. Case in point: the xz backdoor.
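For anyone unfamiliar with the mechanic, here's a minimal sketch of a port-knocking client in Python. The host, knock ports, and SSH port are made-up placeholders, not my actual setup, and the firewall-side rules that watch for the sequence are assumed to exist elsewhere:

    # Sketch of a port-knocking client: fire a TCP SYN at each knock port
    # in order, then connect to the (non-default) SSH port. All values here
    # are hypothetical placeholders.
    import socket

    HOST = "example.com"
    KNOCK_SEQUENCE = [7000, 8000, 9000]   # hypothetical knock ports
    SSH_PORT = 2222                        # hypothetical non-default SSH port

    for port in KNOCK_SEQUENCE:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(0.5)                  # knocks are expected to be dropped
        try:
            s.connect((HOST, port))        # the SYN itself is the "knock"
        except OSError:
            pass                           # timeouts/refusals are normal here
        finally:
            s.close()

    # Only after the full sequence does the firewall open SSH_PORT to our IP.
    ssh = socket.create_connection((HOST, SSH_PORT), timeout=5)
    print("SSH port reachable:", ssh.getpeername())
    ssh.close()

The point relative to xz: a backdoored sshd can only be triggered by whoever can reach it, and a mass scanner that never sends the knock sequence never gets a SYN-ACK to begin with.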
> My role has significant on-call responsibility. It's nice to be able to take it charged anywhere I could need it and know that for several hours it can handle anything I will throw at it, reliably.
You could do that for years with an extended battery in ThinkPads (and Dells, I think; I never used them). Better yet, you could just bring additional charged batteries with you in case you really needed them.
But yes, it took a little more space and weight than M1.
So the M1 didn't enable on-call people "to throw anything at it for several hours"; the tech was already there. The only thing the M1 brought is a small saving in weight and space for people who need a glorified terminal with them.
The M-series brought an absolutely insane jump on the performance/energy-efficiency curve, without a doubt. Nothing came even remotely close to that before.
Those ultrabooks were insanely expensive and had like 3 hours of battery life.
Ok, 4 hours vs 10. That is still the difference between a machine that barely survives the trip from one power outlet to the next and an actually portable use case.
I sold it because I am not on call anymore, but my GPD Pocket 2 was much smaller (it would fit in the pocket of my pants) and it would charge on a 20W smartphone USB-C charger.
My wild guess is that if you are taking a laptop with you, you are probably carrying some kind of bag, so having a charger around is no big deal, especially if you are in someone's home. And if you are somewhere social, you won't stay connected for hours anyway, as that's pretty antisocial: you are likely to take several phone calls and be asked to leave. So you usually either fix the problem quickly or, if that's not possible, pay the bill and go back home.
I like how you ignored the price, build and battery capacity.
And how you ignored where this thread started: "on-call".
There are a lot of good things in the M-series, but it's definitely not the price, not the base model's performance [for anything better than being a terminal], and definitely not "the battery-life breakthrough".
[1] says "AMD 5850U beats Apple M1 in multicore performance by 29% having the same power consumption (15 Watt)". "Nothing came even remotely close to that" sounds like an exaggeration if that is to be believed.
I don't know whether you like the aesthetics of those laptops or whether the price fits your budget, but I kind of feel that getting 3 or 4 hours of battery is the price you pay for falling for way-too-thin flashy white brushed-aluminum stuff. Hardcore road warriors like ThinkPad X1 Nano users seem to report 7-10 hours when new.
I'm not saying that I think M-series CPUs are all fad - every datapoint available suggests they're just -fine- - I'm just saying they're probably not decades ahead of everything else.
Why even bother linking to a comment that was thoroughly debunked at the time? Making a claim about power consumption (i.e. a measured quantity) while using a number that represents TDP (a value assigned by the marketing department with no objective connection to reality) may have been an honest mistake by the original poster, but it's quite dishonest on your part.
I wonder what makes you choose expressions like "thoroughly debunked", but that aside, it's not like a 15W processor actually runs at 95W. It's still close enough to the real power draw, especially when a laptop with a "15W" CPU comparable to the M1 runs for the same 10 hours anyway.
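If anyone wants a measured number instead of arguing over TDP labels, Linux exposes a cumulative CPU package energy counter through the RAPL powercap sysfs interface. A rough sketch, assuming an Intel machine (AMD exposes a similar zone on recent kernels; the exact path varies per system, and newer kernels restrict the counter to root):

    # Estimate average CPU package power by sampling the RAPL energy
    # counter twice. The counter is cumulative microjoules; wrap-around
    # is rare and ignored in this sketch.
    import time

    ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # may require root

    def read_uj() -> int:
        with open(ENERGY) as f:
            return int(f.read())

    start = read_uj()
    time.sleep(10)                   # sample under whatever load you care about
    delta_uj = read_uj() - start
    print(f"avg package power: {delta_uj / 10 / 1e6:.1f} W")

Run that under a sustained benchmark and you get actual package draw rather than the sticker number, which is what this comparison should really be about.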