> Which means there's less workers being paid, less taxes, less money to be spent on the economy, which means less money to pay workers, which means... the logical conclusion is "no economy at all".
Except that's not how the economy works.
Suppose you automate web development. Fewer people get paid to do it. Does that increase long-term unemployment? Not really, because it creates a surplus. Now everybody else has a little extra money they didn't have to spend on web development, and they'll want to buy something with it, so you get new jobs making whatever they want to spend that money on instead.
The only way this actually breaks down is if people stop having anything more they want to buy. But that a) seems pretty unlikely and b) implies that we've now fully automated the production of necessities, because otherwise there would be jobs providing healthcare, growing food, building houses, etc.
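The reallocation argument above can be sketched with some toy arithmetic (all numbers are made up, and it assumes the same wage in both sectors, which is exactly what the replies below dispute):

```python
# Toy illustration of "spending shifts from X to Y, so jobs shift too".
# Hypothetical numbers; a sketch of the mechanism, not an economic model.
# Assumes identical wages in both sectors, so jobs = spending / wage.

WAGE = 50_000  # hypothetical annual wage in either sector

def jobs(spending_by_sector):
    """Jobs supported in each sector at the assumed wage."""
    return {s: spend // WAGE for s, spend in spending_by_sector.items()}

# Before automation: households spend 10M on web dev, 40M on everything else.
before = {"web_dev": 10_000_000, "other": 40_000_000}

# Automation cuts web dev costs by 80%. Total household spending is assumed
# unchanged, so the freed 8M gets spent in the "other" sector instead.
freed = int(before["web_dev"] * 0.8)
after = {"web_dev": before["web_dev"] - freed,
         "other": before["other"] + freed}

print(jobs(before))  # {'web_dev': 200, 'other': 800}
print(jobs(after))   # {'web_dev': 40, 'other': 960}
# Total stays at 1000 jobs: they move between sectors rather than vanish.
```

The whole conclusion hangs on the assumption that total spending is held constant and flows into sectors that hire at comparable wages.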
The flaw is assuming that lower costs “free up” money.
Money isn’t “freed”. Money is created. Banks create it when they lend against future income. If automation removes wage income, banks don’t create replacement demand: they redirect credit into assets.
That’s why you can have rising productivity, stagnant wages, booming asset prices, and weak consumption at the same time. The missing variable is where credit is created, not how efficient production is. (Think Japan in the 90s)
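The "money is created, not freed" point can be shown as a minimal double-entry sketch (hypothetical numbers; it ignores reserves, capital requirements, and interest):

```python
# Minimal double-entry sketch of "loans create deposits".
# Hypothetical figures; not a model of any real banking system.
class Bank:
    def __init__(self):
        self.loans = {}     # assets: borrower -> amount owed
        self.deposits = {}  # liabilities: customer -> account balance

    def lend(self, borrower, amount):
        # The bank doesn't hand over pre-existing money: it writes up both
        # sides of its balance sheet at once. A new deposit (new money)
        # comes into existence together with the loan.
        self.loans[borrower] = self.loans.get(borrower, 0) + amount
        self.deposits[borrower] = self.deposits.get(borrower, 0) + amount

    def money_supply(self):
        # Broad money here is just the sum of deposit balances.
        return sum(self.deposits.values())

bank = Bank()
bank.lend("worker", 200_000)  # e.g. a mortgage granted against future wages
print(bank.money_supply())    # 200000 -- created by the loan, not "freed up"
```

If wage income disappears, the same mechanism keeps running against collateral instead: credit flows into existing assets, bidding up their prices without adding consumption demand.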
If you think the AI threat is real, buy real assets now (not financial IOUs in computer systems).
>Now everybody else has a little extra money they didn't have to spend on web development, and they'll want to buy something with it, so you get new jobs making whatever it is they want to spend the money on instead.
Why assume a business that just boosted profits by reducing headcount would want to spend that surplus on hiring more workers elsewhere? It seems more likely to go toward stock buybacks and higher executive pay packages. There might be some leakage into new hiring, but I reckon the overall effect will be to intensify the funneling of money to the top and further hollow out the labor market.
But that implicitly assumes all jobs are comparable financially. Sure, there'll always be jobs to do, but x number of web devs or whatever is not the same as x number of nursing home care workers.
The extra-money-and-spending logic also breaks down a bit because we know that, by age cohort, older people hold more money but tend to spend less on consumption than the 25-40 cohort.
It's not a matter of scale. If people don't have to spend as much on X then they end up with extra money and will spend it on Y. Jobs then shift from X to Y.
This has been happening for centuries. The large majority of people used to work in agriculture. Now we can produce food with a low single digit percentage of the population. Textiles, transportation, etc. are all much less labor intensive than they were in the days of cobblers and ox carts, yet the 20th century was not marked with a 90% unemployment rate.
It's one of two things. Either post-scarcity is possible, because machines that can collect and assemble resources into whatever anybody wants at no cost are possible, and then nobody needs to work because everything is free. Or it isn't, there are still things machines can't do, and then people have jobs doing those things.