>This doesn't change the fact that there are a large magnitude more iPhones that exist that will never be used for their full potential.
Depends I guess? Having a lot of power on tap can show up in transparent ways as well as in obvious apps. Apple is definitely leaning heavily on its SoC for a lot of its computational photography. Pros may of course prefer to just shoot RAW, for good reason, but having the system make "good shots" essentially point-and-shoot at the cost of a ton of compute is a real value add for lots of regular people. A lot of people also do real gaming on their iPhones these days, and those games can absolutely push the GPU. Maybe that doesn't fall into your definition of serious, but I don't think it's fair to dismiss any use case that customers value. There are also very practical energy-saving effects like race-to-sleep: often the faster a chip can finish a job and drop back to a low-power idle state, the better the overall energy efficiency. A lot of normal usage is very bursty, and like it or not, on the modern web there are tons of sites that can hammer an SoC pretty hard. Everyone cares to some degree about battery life and responsiveness in handhelds.
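To make the race-to-sleep point concrete, here's a rough back-of-the-envelope sketch. The numbers are made up purely for illustration, not measured from any real chip: a core that burns more watts but finishes fast and then idles can still come out ahead on total energy over the same window.

```swift
// Illustrative race-to-sleep arithmetic with hypothetical numbers.
// Total energy over a fixed window = active power * active time + idle power * idle time.

let window = 1.0 // seconds we're accounting over

// Hypothetical chip A: bursts hard, finishes quickly, then sleeps.
let fastActivePower = 5.0   // watts while working
let fastActiveTime  = 0.2   // seconds to finish the job
let idlePower       = 0.05  // watts while idle
let fastEnergy = fastActivePower * fastActiveTime
               + idlePower * (window - fastActiveTime)

// Hypothetical chip B: draws less power but stays busy the whole window.
let slowActivePower = 2.0   // watts while working
let slowActiveTime  = 1.0   // seconds to finish the same job
let slowEnergy = slowActivePower * slowActiveTime

print("Fast-then-sleep: \(fastEnergy) J")  // 1.04 J
print("Slow-and-steady: \(slowEnergy) J")  // 2.0 J
```

With these (again, invented) figures the "hungrier" chip uses roughly half the energy, which is why bursty workloads reward a fast chip paired with aggressive idling.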
If there were some big cost for this that'd be one thing, but there really isn't given the economics of modern silicon design and fabrication. Apple amortizes R&D heavily across the iPhone, iPad, Mac, and even things like the Apple TV, so relentlessly pushing the phone silicon forward feeds directly into everything else. And even if most people only use the full potential a fraction of the time, that may be a very valuable fraction, and it's never obvious in advance which future use cases will end up needing it.
Though one thing that is clear is that wearable displays and serious AR/VR are the next big disruption/extension event for computing, and Apple, like every other player, needs to be ready. That's going to take a ton of compute power along with highly evolved environmental sensing and sensor fusion. They have to be building toward that years in advance, and clearly are.