IBM initially leads with the more salient point (current architecture designs are hindering frontier computing concepts), then just kinda…relents into iterative improvement.
Which is fine! I am all for iterative improvements; it's how we got to where we are today. I just wish more folks would openly admit that our current architecture designs are broadly based on the "low-hanging fruit" of early electronics and microprocessors, followed by a century of iterative improvements. With the easy improvements already done and universally integrated, we're stuck at a crossroads:
* Improve our existing technologies iteratively and hope we break through some barrier to achieve rapid scaling again
OR
* Accept that we cannot achieve new civilizational uplifts with existing technologies, and invest more capital into frontier R&D (quantum processing, new compute substrates, etc)
I feel like our current addiction to the AI CAPEX bubble is a desperate Hail Mary to validate our current tech as the only way forward, when in fact we haven’t really sufficiently explored alternatives in the modern era. I could very well be wrong, but that’s the read I get from the hardware side of things and watching us backslide into the 90s era of custom chips to achieve basic efficiency gains again.
Custom architecture, yes, but that's not what we're seeing. Companies aren't inventing new computing paradigms; they're just grabbing stuff off the shelf and shoehorning desired accelerators onto the package for a spiffier product targeting their demographic.