You seem to be seriously understating AGI. By most definitions, AGI will likely do everything a human can do, plus many things a human cannot, and do all of it vastly faster and better.
Your statement reinforces my point. Every human being should be concerned about their place in a system that doesn't need them, at least to the extent that thinking about it is constructive.