I tried it for the first time last night and I was not too impressed. It felt like a solution in search of a problem.
For the past decade Apple’s product lineup has pretty much remained the same (faster, lighter, etc), while they invested heavily in spatial computing (flop) and autonomous vehicles (scrapped). All while totally neglecting the industry they pioneered (Siri) and should have been perfecting. Instead, OpenAI came along and ate their lunch and Apple was caught completely flat-footed. And now they’re way behind.
It’s shocking to me how bad the above bets and non-investments have played out for Apple. Tim Cook needs to ride off into the sunset. It’s been a good ride. He ran the business well. They need an innovative CEO at the helm that can make better bets.
> For the past decade Apple’s product lineup has pretty much remained the same
You seem to have forgotten the M1 Mac, which was released in 2020. It was followed by the complete overhaul and revitalization of the Mac lineup, all to critical and popular acclaim.
If you ask me, I'd have preferred they keep using Intel. The M1 has brought me nothing but issues stemming from the different architecture; so many that I had to switch to Linux because I couldn't take it anymore.
It doesn't really feel faster than Intel, TBH. Maybe it's better on battery, but who cares? Definitely not me, working from an AC outlet 24/7.
Seems like the thousands of people who bought it care, myself included. My role has significant on-call responsibility. It's nice to be able to take it charged anywhere I could need it and know that for several hours it can handle anything I will throw at it, reliably.
From a security perspective, I prefer the separation from the x86 architecture if for no other reason than it imposes extra cost for exploit development.
That wasn't the point. On a long enough timeline, the security of all systems drops to zero. If the entire world is running x86, then the threat model for attackers revolves around abusing x86. If things are heterogeneous, it raises the bar and the resourcing required to mount attacks.
As always with security, everything is a tradeoff.
Your argument is basically security by obscurity. You're better off in an ecosystem where a lot of attention is paid to exploits and patches than in another where it might be a long time before a zero day becomes known and fixes are issued.
None of what I said is security by obscurity (which can also be an effective tactic, but obviously not the only one).
There are only so many human hours and minds interested or allocated to exploitation and offensive security. If everyone used the same architecture for everything, the economies of scale on the offensive side (due to state funded actors) would blow everyone else out of the water.
From a software perspective, Windows has an incredible amount of skilled eyes on each patch release, but we still see new exploits. Same for Linux. Likely same for MacOS.
All I'm advocating for is that having separate hardware architectures is good because it raises the barrier to entry, even if only by the next marginal step.
Security by obscurity isn’t even bad. It’s only bad if it’s your sole defence.
I am confident that my non-default SSH ports, which only accept connections after a port-knocking sequence, add a slight bit of security, which is better than nothing. Case in point: the xz backdoor.
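For the curious, a setup like that can be sketched with knockd; the ports, knock sequence, and SSH port below are made up for illustration:

```
# /etc/knockd.conf -- illustrative values only
[options]
    UseSyslog

[openSSH]
    sequence    = 7000,8000,9000
    seq_timeout = 5
    command     = /sbin/iptables -I INPUT -s %IP% -p tcp --dport 2222 -j ACCEPT
    tcpflags    = syn

[closeSSH]
    sequence    = 9000,8000,7000
    seq_timeout = 5
    command     = /sbin/iptables -D INPUT -s %IP% -p tcp --dport 2222 -j ACCEPT
    tcpflags    = syn
```

With sshd listening on the non-default port 2222, the firewall only opens it for an IP that has sent the right knock sequence first.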
> My role has significant on-call responsibility. It's nice to be able to take it charged anywhere I could need it and know that for several hours it can handle anything I will throw at it, reliably.
You could do that for years with an extended battery in ThinkPads (and Dells I think, though I never used them). Better yet, you could just bring additional charged batteries with you, in case you really needed them.
But yes, it took a little more space and weight than M1.
So the M1 didn't enable on-call people "to throw anything at it for several hours"; the tech was already there. The only thing the M1 brought is a small saving in weight and space for people who need a glorified terminal with them.
The M-series brought an absolutely insane jump on the performance-versus-energy-efficiency graph, without a doubt. Nothing came even remotely close to that before.
Those ultrabooks were insanely expensive and had like 3 hours of battery life.
Ok, 4 hours vs 10. It's still the difference between a machine that merely stays powered on while you carry it between outlets and an actually portable use case.
I sold it because I'm not on call anymore, but my GPD Pocket 2 was much smaller (it would fit in the pocket of my pants) and it would charge on a 20W smartphone USB-C charger.
My wild guess is that if you're taking a laptop with you, you're likely using some kind of bag, so having a charger along is no big deal, especially if you're in someone's home. And if you're in a social place, you won't stay connected for hours, as that's pretty antisocial; you'll likely take several phone calls and be asked to leave anyway, so you usually fix the problem quickly, and if that's not possible you just pay the bill and go back home.
I like how you ignored the price, build and battery capacity.
And how you ignored where this thread started: being on call.
There are a lot of things the M-series does well, but price is not one of them, nor is base-model performance [for anything better than being a terminal], and it's definitely not "the autonomy breakthrough".
[1] says "the AMD 5850U beats the Apple M1 in multicore performance by 29% at the same power consumption (15 watts)". "Nothing came even remotely close to that" sounds like an exaggeration, if that claim is to be believed.
I don't know whether you like the aesthetics of those laptops or whether the price fits your budget, but I kind of feel that getting 3 or 4 hours of battery is the price you pay for falling for way-too-thin flashy white brushed-aluminum stuff. Hardcore road warriors like ThinkPad X1 Nano users seem to report 7-10 hours on their machines when new.
I'm not saying I think M-series CPUs are all fad: every datapoint available suggests they're just fine. I'm just saying they're probably not decades ahead of everything else.
Why even bother linking to a comment that was thoroughly debunked at the time? Making a claim about power consumption (ie. a measured quantity) while using a number that represents TDP (a value assigned by the marketing department with no objective connection to reality) may have been an honest mistake for the original poster, but is quite dishonest on your part.
I wonder what makes you choose expressions like "thoroughly debunked", but that aside, it's not like a 15W processor actually runs at 95W. TDP is still a close enough number to real power draw, especially when a laptop with a "15W" CPU comparable to the M1 runs for the same 10 hours anyway.
That's great if all you do with a computer is browse Facebook (or Hacker News). Pros of all kinds have specific platform/software requirements, and Apple chucked a lot of that out the window (again) with the move to a new platform. And it isn't optional: Intel Macs will be extinct soon, and so will software support for them. They don't care about pros; they care about money, and the money is in selling to people who only browse Facebook.
Apple Macintosh started on Motorola processors. Then they moved to PowerPC processors. Then they moved to Intel processors. And now they've moved to ARM processors. That's three architecture transitions in the same product line.
2006 wasn't the first such transition Apple did, just the previous one. Their transitions from 68k to PPC and from classic MacOS to OS X were similar to the PPC to x86 and x86 to ARM transitions from an application compatibility standpoint.
I was in the same situation with the switch to Apple silicon. I have some obscure industry software that I use only rarely and that only runs on x86, so I just bought a cheap ThinkPad (an X220, I think) and run XP on there.
Apple silicon (the M2) is worth its weight in gold to me for the battery life and functionality.
I think that depends on how you're defining product lineup here. Apple doesn't sell the M1 chip itself, so really the product lineup remained unchanged other than shipping with a new processor.
The watch is the most recent new product I can think of from Apple and that was 9 years ago.
If by "complete overhaul and revitalization of the Mac lineup" you mean they took their chip production in-house, I agree. Most revolutionary thing Apple has done since the iPhone.
I’d say “way” is a stretch, and it could be argued that the technologies largely making it so were popularized by ggml, which was originally written for the M1 platform.
You seem to have forgotten that "Dodge swapped out the Chrysler engine for a Mitsubishi engine" isn't much of an innovation claim. Tim Cook is a bean counter. He's done nothing at Apple except add a couple of iPhone accessories and optimize the supply chain, which is all the M1 Mac was, a slightly optimized Steve Jobs creation.
Perhaps, but after a spike in 2020-21 their Mac sales are about where they were back in the Intel days.
If you take inflation since ~2017 into account, the M1 hasn’t really been that extraordinarily successful, and their past couple of quarters were the worst in the last 10 years or so.
The Age of Diminishing Return for Digital Innovations began right after the smartphone and it has only become more pronounced since.
My layperson’s prediction for the next big thing is cheaper and cheaper techs. I believe that there will be a race to the bottom. Both hardware and software will become commodity. Even all these new AI technologies will become commodities. The true Cyberpunk take is that technology is boring and commonplace.
Lots of amazing hardware is still way too expensive to make it into ordinary consumer hands, or hasn't even been invented yet.
What's keeping the future bleak is a low business appetite for risk. The wringing out of current tech for every last cent will eventually stop when there's no other way to profit but actual R&D. Until then, yeah things will be boring nonsense like AI and ridiculous marketing. We're in the doldrums until some heads start to roll.
What would be examples of such hardware? Any exciting tech gets commoditized fast. The Vision Pro would go down to $500 in a few years if there were demand.
I can see that happening. My vision for the future is that voice assistants and LLMs will converge into something as useful and powerful as “Computer” in Star Trek. And that this assistant that will know you and all of your accounts/online profiles, will only need a microphone and an internet connection to function.
So the Apple Watch, for example, could be everything you need to carry around, because the voice assistant can literally do everything you can possibly think of (besides consume social media and YouTube content).
> So the Apple Watch, for example, could be everything you need to carry around, because the voice assistant can literally do everything you can possibly think of (besides consume social media and YouTube content).
An all-knowing, ever-listening assistant strapped to your wrist, completely connected to all of your accounts (financial, medical, social media), monitoring your vitals, measuring your acceleration, deceleration, steps, altitude, and GPS position, bundling it all up, and then selling it to whoever wants it for pennies on the dollar.
This is going to be an absolutely enormous market, and it would be really good if an open source option wins it, instead of another big tech walled garden.
The good news is that it’s a software solution, not hardware. Open source can only win in software. So it has a chance.
Open source, self-hosted, all-knowing assistant connected to the Internet, that doesn’t leak data to companies, and that filters out all their ads, messaging, and CTA’s.
Tech-savvy people will be able to have one of these pretty soon even if big tech wins the broader market.
Open source can never "win" a consumer market because a team of engineers is never going to be able to live off of giving away software for free, nor can you sell free software to the public at large. Open source works in B2B contexts because businesses either collaborate on building infrastructure they need to sell their core product, or because you can sometimes sell support and commercial licenses to businesses.
This is precisely why Linux has overwhelmingly won the server market for at least a decade, but it is still a bit player in consumer devices (except Android, which is barely open source and is controlled by a huge corporation selling our data).
Perhaps open source doesn't devalue consumers in a way that they are a thing to be won - not even in aggregate.
The network devices in my home are open source, as is most of the OS. With my customers it's a mix - but where there is open source, it pretty much just works.
Meanwhile, any complaining is typically due to the ongoing poor treatment by Microsoft, Intuit, et al.
> Perhaps open source doesn't devalue consumers in a way that they are a thing to be won - not even in aggregate.
You can win a market, that doesn't mean you're winning the participants. You could at best be winning them over, which is a positive for anyone.
> The network devices in my home are open source, as is most of the OS.
The software on those devices is open source, or in other words, the companies selling your devices are using OSS to power them (or you yourself installed software mostly created for this purpose by similar companies).
> This is going to be an absolutely enormous market, and it would be really good if an open source option wins it, instead of another big tech walled garden.
That’s a nice thought, but IBM would probably just buy it then kill it.
I think it's worse than that. It's an age when technological innovation is not only diminishingly useful, it's actively, primarily, intended to hurt our interests. Surveillance and enshittification and advertising have become the reason for innovation, and actual benefit to consumers is secondary and shallow.
My prediction is that the combination of AI and robotics will conquer specific business applications, and in time will make its way into consumer markets. Think iRobot & robot lawnmowers, but for way broader applications.
That's the current playbook isn't it? OpenAI has basically said they're trying to bootstrap robotic intelligence. It remains to be seen if they can manage that before the cost of running huge gpu farms brings them down.
Within the next couple decades there will be another disruptive innovation in a new hardware form factor. Maybe not AR/VR goggles but something else that no one beyond a few visionaries is even thinking about today.
OpenAI isn’t really a competitor to Siri. It’s a competitor to Google Search. And Apple has never been in the search market.
OpenAI doesn’t work without internet. Siri does. OpenAI can’t do anything on your phone. Siri can. They’re fundamentally different services.
Yes, whatever Apple does with Siri next will be LLM based and will compete with other on-device LLM assistants. But I expect that to be a fairly different market from the cloud-based LLM assistants.
Hell, the next Siri might just forward questions to a third party cloud based LLM assistant for answers to complex questions. Like it does with google now and did with wolfram alpha at one point
Though I expect Apple is also waiting for the quality of LLM AIs to be good enough. Even ChatGPT 4 isn’t really good enough to be trusted. They’re way too confident when they should be answering “I don’t know” or “I’m unsure, but I’m guessing the answer is: …”
Like, what Microsoft is doing is absolutely reckless. It’s insane to integrate an AI assistant that constantly and confidently lies to you into the OS.
It’s not a competitor to Siri…yet. But it’s pretty easy to see the two converging in the near future. The best way to make sure that your voice assistant is used and useful is to back it with an LLM that literally does and knows everything. That’s the future OpenAI is chasing, and the one Apple should be chasing. I’m hopeful that this year brings some quick advancements in this area. But judging by the fact that it’s a year behind OpenAI in the timing, it’s hard to believe that Apple isn’t behind right now.
I’m confused. Which lunch of Apple’s is OpenAI eating? OpenAI does not offer a voice assistant that competes with Siri. Google and Amazon do, and their usage is way down as well; voice assistants in general have lost consumer preference.
You must not have tried the voice feature in the ChatGPT app yet. Try it out, it’s amazing. It pairs the power and functionality of a conversational voice assistant with all the power of ChatGPT and LLMs. Obviously the integration isn’t as convenient as Siri’s yet, but the utility is infinitely higher.
My point is that it seems pretty clear that the future is in the space that OpenAI is right now. And that isn’t a bet that Apple was investing in very heavily.
Apple is also not in the business of losing money. It's much easier to make something new and shiny when you can run it at a loss and your investors are happy for you to light piles of money on fire year after year.
That's also probably why most AI research published by Apple is about on-device inference. It's expensive to run inference servers at scale. Apple is a hardware company, so it makes sense they want to focus on what you can do on a local device (or more accurately, how they can sell you a new piece of hardware).
Interestingly, for my elderly Russian-speaking mom’s benefit, I had an English conversation with it and asked it to reply only in Russian. It complied, and spoke and reasoned no worse than usual.
Not yet. Voice assistants will massively benefit from LLM integration which will benefit OpenAI for sure.
It's just that nobody has built one yet, which surprises me because it's a very suitable application. But I think the cost is much higher than for the current scripted models, which means there must be a payment model attached. Right now all the major voice assistants are free, and I have a feeling they're all waiting to see who makes the first paid LLM-based product and how the market reacts.
I think there is an important distinction to make. Siri is actually two things — there are voice informers (“Please explain how photosynthesis works”) and voice assistants (“Remind me to call Dave when I reach my work location”).
The former is really straightforward to implement with an LLM — it’s basically what an LLM is.
I mean, the latter is basically OpenAI’s assistant with some APIs it can autonomously call.
The open problem here is making it on-device, and privacy preserving. Though I’m optimistic about this, as Apple has bought up a huge number of AI startups in the last couple of years, so they are probably onto something.
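The informer/assistant split above can be sketched in a few lines. Everything here is hypothetical (the device API, the routing): a real system would let the LLM choose a tool via function calling, with a regex standing in for that routing decision here.

```python
# Minimal sketch of the informer/assistant split. All names are made up;
# a real system would let the LLM pick a tool via function calling
# instead of the regex routing below.
import re

REMINDERS = []  # stands in for an on-device reminders database

def set_reminder(text, when):
    """Hypothetical device API the assistant is allowed to call."""
    REMINDERS.append({"text": text, "when": when})
    return f"Reminder set: {text} ({when})"

def handle(query):
    # "Assistant" path: route the request to a local device API.
    m = re.match(r"remind me to (.+) at (.+)", query, re.IGNORECASE)
    if m:
        return set_reminder(m.group(1), m.group(2))
    # "Informer" path: would be a plain LLM completion.
    return f"[LLM answer to: {query!r}]"

print(handle("Remind me to call Dave at 5pm"))  # Reminder set: call Dave (5pm)
print(handle("Please explain how photosynthesis works"))
```

The privacy argument then reduces to where `handle` runs: the assistant path touches only local state, while the informer path is the part that would need an on-device model to stay private.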
> All while totally neglecting the industry they pioneered (Siri) and should have been perfecting.
Going by the rumors, there should be some interesting announcements at WWDC that may explain where all their effort has been going instead of improving the current version of Siri.
> They need an innovative CEO at the helm that can make better bets.
ARM chips, maybe? How the hell do you top something like that? That's about the only thing of consequence any of these companies have achieved since 2020.
The gushing of the Apple faithful was eye rolling.
> Om Malik, who has been writing about tech since tech reporters used to write about calculators, he was even more effusive. “It’s amazing! It’s incredible!” he enthused. “You can feel a vibration in the universe!”