I'm not sure what you're getting at here but I'll try to respond. My argument is that "this is just" is meaningless as a way to assess the impact of a technology.
If AI researchers say, "this is just X and it can do Y!" then fine, that's just framing for "look: Y". When stochastic parrot guys say "this is just X, what's impressive about that?" it throws me for a loop coz they are refusing to engage with Y.
I think we disagree about what Y is. My point is that Y is not materially different from what was already possible with a Slack bot circa 2015. Essentially, ChatGPT is a less efficient way to get to the same outcomes that were already possible. The trick is that it appears to be something it's not - AGI.
I like your bronze sword analogy. From my point of view ChatGPT is not a bronze sword; it's a Stone Age sword that someone has painted bronze. It has value because people realise the advantage a true bronze sword would have in battle. However, when you actually put it through its paces, you quickly realise it offers no actual value over what came before.