What the surface-level take about calculators misses is that the average person can't do arithmetic in their head because they don't need to, but they also don't pull out a calculator the many times a day it would be useful, like at the grocery store. People make horrible decisions with everyday home-economics math and get taken advantage of.
The lesson isn't that we survived calculators, it's that they did dull us, and our general thinking and creativity are about to get likewise dulled. Which is much scarier.
Actually the experience with calculators portends a dismal future.
Before calculators (i.e. slide rules, log tables, hand arithmetic): by the time engineers completed their university education, most could estimate relevant parameters in their work to within ±5% of the actual value. A slide rule would give you a result to 3 (rarely 4) significant figures, but you needed to know the expected result to within half an order of magnitude.
After calculators, many graduate engineers will accept erroneous results from their calculations without noticing order-of-magnitude discrepancies.
We constantly hear of spreadsheet errors making their way into critical projects.
With AI, the risk is that even current levels of critical thinking will be eroded.
The number of college-educated people who do not know how to calculate a tip in their head is terrifying.
I can understand not being able to get 17.5% down to the penny. But 10%, 15% or 20% can be calculated in your head faster than I can get my phone out. This level of math is pretty basic.
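The head-math above is just decimal shifting: 10% is moving the decimal point one place left, 20% is double that, and 15% is 10% plus half of it again. A minimal sketch (the `tip` helper is my own illustration, not anything from the thread):

```python
def tip(bill: float, percent: int) -> float:
    """Mirror the mental shortcut: everything derives from 10%."""
    ten = bill / 10           # 10%: shift the decimal point one place left
    if percent == 10:
        return ten
    if percent == 15:
        return ten + ten / 2  # 10% plus half of 10%
    if percent == 20:
        return ten * 2        # double the 10%
    raise ValueError("only the head-math cases are sketched here")

print(tip(40.00, 15))  # 6.0
```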
It's also worth saying that I was never described as a "math person". The number of people who will blindly accept whatever the calculator tells them is too fucking high.
I have already noticed far too many people using chatGPT as a source. I have a tax attorney friend who got in an argument with an agent at the CRA (Canada Revenue) over whether her interpretation of a rule was correct or whether the chatGPT interpretation was correct. Mind you, she works as a prosecuting attorney so it wasn't adversarial, it was just her saying, "sorry, I'm the legal expert, this interpretation is incorrect, and we will lose if we use this interpretation".
Working with computers, I mentally use a hybrid of BCD and binary arithmetic. E.g. for the 17.5%: I mentally move the decimal point one place to the left, drop the cents, add half, then half of that again, i.e. 10% + 5% + 2.5%.
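That shift-and-halve decomposition can be written out directly (a sketch of the mental procedure described above, not anything more):

```python
def tip_17_5(amount: float) -> float:
    """17.5% via repeated halving: 10% + 5% + 2.5%."""
    ten = amount / 10         # move the decimal point one place left -> 10%
    five = ten / 2            # half of that -> 5%
    two_point_five = five / 2 # and half again -> 2.5%
    return ten + five + two_point_five

print(tip_17_5(80.00))  # 14.0
```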
Totally agree. The pessimist in me says that part of this is unavoidable. Our tools specialize so that we can direct our limited resources elsewhere; as a consequence of delegating, those particular abilities atrophy in us.
Not being able to organize information, create a synthesis, or express yourself in less-likely-than-an-LLM terms is going to have detrimental effects. I think it will not only lead to insane, horse-blinder-level hyper-specialization, but flatten entire segments of the human experience.
> We constantly hear of spreadsheet errors making their way into critical projects.
Are there any examples, i.e. spreadsheet mistakes in engineering projects that wouldn’t have happened if a slide rule was used? This sounds interesting.
I only know about spreadsheet errors in general, e.g. gene symbols being converted to dates[1]. Unless you meant that?