accidentallfact's comments | Hacker News

It doesn't need aliens. The people would only have had to encounter things like the ruins of Jericho (destroyed at the beginning of the New Kingdom period), or later the cities burned down during the Late Bronze Age collapse. Either could easily represent an extent of destruction incomprehensible to unsophisticated herdsmen. Later it was Greece, or even Rome itself, before the area became part of the empire. It's pretty clear that angelos originally meant something like a courier or mailman, for example, and only later did it acquire the mystical meaning.

There are documented cases where aliens weren't needed for that, either:

https://en.wikipedia.org/wiki/Pomio_Kivung


Pretty much all such claims can be easily dismissed by pointing out that such advances:

1. Can obviously be made

2. Can be made very fast

There is simply no reason why major advancements in metallurgy couldn't have been made between 4453 and 4382 BC, completely unknown to us and later forgotten.

In fact, it's a mystery why we don't see more such ancient artifacts, if anything.

The article doesn't even go far enough: it blames the oiling on some accidental, dumb ritual, when it used to be common knowledge that iron can be protected from rust by oiling it, and it was done entirely on purpose.


The reason better toolboxes have felt inside the drawers is that you put a drop of oil on the felt, and it will keep the tools rust-free.

The thing is, you could probably create something FAR more horrific by simply mixing spent nuclear fuel with TNT...

Not that I want to give anyone any ideas.


> The thing is, you could probably create something FAR more horrific by simply mixing spent nuclear fuel with TNT...

Nope. It turns out you need astronomical amounts of spent waste to noticeably impact a large population.

The trial of Jose Padilla (aka "the dirty bomber") has the best data on this. He went to Al Qaeda, offering to build and detonate a dirty bomb. Al Qaeda wasn't at all interested. They had run the actual numbers from an engineering standpoint (unlike everyone else, who had just said "ooh, scary, bad!"), and had clearly demonstrated that dirty bombs aren't actually a viable mass-casualty weapon.

Before the Jose Padilla trial, we used to hear a lot about dirty bombs. Since then, not at all. It's not that people forgot about them. They just aren't a credible engineering threat: it's too hard to get enough material distributed over a large enough area to measurably impact health outcomes for the affected population. That was a surprise that came out of the trial.

There are lots of attack types to worry about. Dirty bombs are very far down that list.


Not that I necessarily want to be contrarian, but just a few months ago I decided to stop worrying about using my phone, and it honestly feels like the most liberating decision of my life.

There is nothing wrong with it.

I think that many people feel like their lives suck in some way that they can't define or explain, and they want something to blame it on, and their phone is an excellent target. It's relatively new, so of course it must be the source of recent problems. It's CONVENIENT. You can do something about it by simply not looking at it.

Your phone is not the source of any of your problems.


>I think that many people feel like their lives suck in some way that they can't define or explain, and they want something to blame it on, and their phone is an excellent target.

They will blame anything but the billionaires.

But to play devil's advocate: I think most phone issues arise from children's use of them. They don't have the discipline to put a phone down, and that entrenches habits that last into adulthood. Gen Z is the testing ground for this phenomenon.

Sadly, working adults who need to chat with work, take calls for interviews, schedule and get updates on appointments, and check on family do need to have their phone at the ready. I don't think anyone is condemning the people here. Just the system.


The turn-off-the-phone crowd tend to be in situations where letting parents, young kids, doctors, interviewers, etc. get in touch isn't a priority. Yes, voicemail is a partial answer, but an imperfect one in this day and age. I didn't even used to have one and just got messages on a voicemail device (after the mid-80s or so), but there's a much greater expectation of being able to reach people easily today.

You can do all of these things outside the hour you spent at a cafe. Constant availability is partly self-imposed.

It varies a lot with your job or family situation. As expected, minimum-wage jobs can have the most abusive behavior in terms of respecting your time.

And of course, culture. 30 years ago, if your kid got hurt, you wouldn't be considered a negligent parent if the school took an hour to get hold of you on a phone. They might even call your workplace and have the message relayed to you. Now, good luck even getting to speak to a human who can take the message, let alone have it relayed through the right branch and team to you.


It doesn't talk about neurodivergent people, but about non-brain-damaged people, whose brains can still do dimensionality reduction, which allows them to deal with problems efficiently.

The so-called neurotypical people are people brain-damaged by something, who slowly took over society over the 20th century. Without dimensionality reduction, virtually everything they encounter is an overwhelming, highly dimensional problem, so they can't deal with anything except a narrow set of problems they have specifically trained to deal with.


This is a large claim to make without any evidence, and it is eerily reminiscent of historical arguments in favor of eugenics programs. Society thrives on diversity and variation in thinking patterns. Dividing all people into either neurodivergent vs. neurotypical, or in your preferred terms "brain-damaged" vs. "non-brain-damaged", is a vast oversimplification of reality.

Your claim about neurotypical people taking over society in the 20th century isn't supported by the evidence, which suggests that genetic markers in humans for things like Autism Spectrum Condition are downsampled relative to other species, implying that human evolution has selected for some of the traits of what we classify as neurodivergence while balancing out the effects of others [1].

I'm not trying to say there's anything wrong with neurodivergence (that's how I've been classified), but this dichotomy is dumb. Everybody is neurodivergent, and what we define as neurotypical is societally defined. You're just trying to flip the idea of what's assumed to be a "good" person, and that need to declare one group better than the other is the actual problem. Please read more about the wide variability in cognition among humans before claiming that anyone in your preferred definition of "neurodivergence" is superior to others.

[1] https://academic.oup.com/mbe/article/42/9/msaf189/8245036


I'm not talking about genetics. There must be something that causes the brain damage (and it would probably be a good idea to find it and stop it). The change was too fast for genetics: it was essentially one generation mostly normal, the next one brain-damaged.

Can you provide evidence for what you’re saying? How are people “brain-damaged” at greater rates than before?

I'm talking about all the mid-20th-century talk about the "generation gap", "teenage rebellion", and so on. Essentially nothing about how society worked was left untouched between the "conformist" 50s and the 70s. Something happened then, or a bit earlier, and it's still doing damage.

This checks out

What?

I think it isn't a mixing issue, it's an acting issue.

It's the obsession with accents, mixed with the native speakers' conviction that vowels are the most important part.

Older movies tended to use some kind of unplaceable ("Mid-Atlantic") accent that could be easily understood.

But modern actors try to imitate accents and almost always focus on the vowels. Most native speakers seem to be convinced that vowels are the most important part of English, but I don't think that's true. Sure, English has a huge number of vowels, but they are almost completely redundant. It's hard to find cases where vowels really matter for comprehension, which is why they can vary so much across accents without impeding communication. So actors focus on the vowels but slur the consonants, and you are pretty much completely lost without the consonants.
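
A toy way to see the asymmetry (just an illustration of the redundancy claim, nothing more): strip each class of letters from a sentence and compare. The sentence and variable names here are made up for the example.

    # Drop the vowels and the sentence stays mostly readable;
    # drop the consonants and almost nothing survives.
    sentence = "modern actors slur their consonants and you are completely lost"
    vowels = set("aeiou")

    consonants_only = "".join(c for c in sentence if c not in vowels)
    vowels_only = "".join(c for c in sentence if c in vowels or c == " ")

    print(consonants_only)  # still largely decipherable
    print(vowels_only)      # essentially meaningless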


The Mid-Atlantic accent has been out of favor since at least the latter part of the 50s. The issue with hard-to-understand dialog is a much more recent phenomenon.

I have a 5.1 surround setup, and by default I have to give the center channel a boost in volume. But you still get movies where the surround channels (sound effects) are loud and the center (dialog) is low.

>Most native speakers seem to be convinced that vowels are the most important part of English

As a native English speaker studying Spanish, my impression is that English cares about the consonants and Spanish is way more about the vowels. YMMV


It isn't only about the degree of wrongness, but its type:

1. Vacuous, which provide no useful insight beyond what is obviously deducible. Such as "nearsightedness is caused by the wrong shape of the eye".

2. Vanity, which provide useless elaboration of something that is very well understood in a much simpler form, with no realistic hope of any future insight. Such as most of linguistics.

3. Pointless. They explain something that is difficult to find out but matters very little. While technically correct, the actual facts matter so little that they result in no realistic improvement of any kind, and no decisions are changed as a result of the new knowledge. Such as the age of the Earth.

4. Theoretically wrong, those that the article is talking about. Even though theoretically wrong, they are so nearly equivalent to the actual truth that the difference doesn't matter in practice.

5. Practically wrong. Those that "sound good", so people stick to them in spite of massive evidence to the contrary. Such as the claim that obesity is caused by overeating, in spite of its near-universal failure in practice, most recently with Ozempic making people look like walking corpses rather than anything like healthy bodies. These are the kinds of errors meant by those who write to people like Asimov.


I firmly believe, more and more each day, that the human mind does not seek truth or correctness, it primarily seeks satisfaction.

The context for satisfaction is different for every individual human. Some parts of the context are shared (to various degrees). These 'shared contexts' we might call rationality, or science, or society, or religion.

Another part of the problem is that satisfaction is recursive.

We may evaluate something based on:

    1. Correctness
    2. Completeness
    3. Satisfaction
This is obviously self-referential because if something is incorrect or incomplete, then it is also unsatisfying.

For instance, if you are only aware of Electromagnetism, then Maxwell's equations are correct, complete, and satisfying. And then some jerk discovers neutrons.

Anyway, this whole comment may fit into your first three points; or it may help someone understand a failure to communicate.


Obesity is caused by overeating though, at the very least in the same sense that nearsightedness is caused by the wrong shape of the eye.


Energy intake > energy expenditure, but both intake and expenditure can vary for a variety of reasons uncorrelated with eating habits.

Sorry for being pedantic but I think that's what the above commentator meant.
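
A back-of-the-envelope sketch of that energy-balance arithmetic, purely as illustration: the ~7700 kcal per kg of body fat figure is only a commonly cited rule of thumb, real metabolism adapts, and the function name and numbers here are made up for the example.

    # Rough energy-balance arithmetic, not medical advice.
    KCAL_PER_KG_FAT = 7700  # commonly cited rule-of-thumb conversion factor

    def weight_change_kg(intake_kcal_per_day, expenditure_kcal_per_day, days):
        """Estimate fat-mass change from a sustained daily surplus or deficit."""
        surplus = intake_kcal_per_day - expenditure_kcal_per_day
        return surplus * days / KCAL_PER_KG_FAT

    # A 150 kcal/day surplus, e.g. from expenditure dropping while eating habits
    # stay exactly the same, adds roughly 7 kg over a year.
    print(round(weight_change_kg(2500, 2350, 365), 1))  # ~7.1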


> Energy intake > energy expenditure

Isn't that technically overeating though?


If someone ate the same three meals for 20 years and maintained a consistent weight, and then due to some physiological change started to put on weight eating the same amount, I'd find it reasonable but perhaps incorrect to attribute their weight gain to overeating. But yeah, technically at the end of the day, they could eat less :)


Yeah, it's not very useful in practice, of course. Telling someone to "eat less" has very unimpressive real-world impact.


You have made an interesting point, but I think your arguments would have more force if you exercised some restraint in stating your opinions about what is wrong, and in what way, as categorical facts.

Anyway, while I agree that these other types of "wrong" are important, I don't know about calling 1-3 wrong, per se. Also, I'm curious what part of linguistics you consider to belong under the "vanity" label, and why it would be apt to call "pointless" facts (like the age of the Earth) wrong.


> obviously deducible. Such as "nearsightedness is caused by the wrong shape of the eye"

Not obvious at all. According to Wikipedia, it was discovered in the 17th century, only about half a century earlier than the discovery of bacteria.


Meanwhile, far-sightedness can be caused by shape, or by loss of elasticity in the lens as one ages.


> 3. Pointless. They explain something that is difficult to find out but matters very little. While technically correct, the actual facts matter so little that they result in no realistic improvement of any kind, and no decisions are changed as a result of the new knowledge. Such as the age of the Earth.

It's bizarre to even consider that investigating the age of the Earth is "pointless". Finding out the age of our planet and other celestial bodies matters a lot in astronomy! Understanding the universe is the opposite of pointless; it's fascinating.

Or did you mean something else?


It's the other way round. Let's say you have a thousand inputs, and together they carry 1 bit/s of information, in such a way that you can't pick one or a few inputs that would let you correctly decode that one bit.

Or let's say you have a million inputs, or pixels, and you need to determine if there is a cat in the picture. Selective attention won't work for that either. You can't pick six pixels that allow you to reliably answer this question.

You need dimensionality reduction, which reduces the data to a manageable number of abstract features, from which you can pick the features that matter to you.
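
A loose sketch of that two-stage idea in code, purely as an analogy: PCA and a linear classifier standing in for whatever the brain actually does, with made-up data and numbers.

    # Label depends on hidden factors spread across all pixels, so no small
    # subset of raw inputs ("selective attention" alone) can recover it.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    latents = rng.normal(size=(500, 20))               # 20 hidden factors per "image"
    mixing = rng.normal(size=(20, 10_000))
    pixels = latents @ mixing + 0.5 * rng.normal(size=(500, 10_000))
    y = (latents[:, 0] > 0).astype(int)                # "is there a cat?"

    # Dimensionality reduction: 10,000 raw pixels -> 20 abstract features.
    features = PCA(n_components=20).fit_transform(pixels)

    # "Selective attention" (a simple classifier) now works on the abstract features.
    clf = LogisticRegression(max_iter=1000).fit(features, y)
    print(clf.score(features, y))                      # high accuracy from 20 features, not 10,000 pixels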

Neurotypical people lack (or only have remnants of) this second filter.


To be clear, I'm not proposing any specific mechanism for or details of the difference in input filtering, just that this seems to be central to the impact of autism. (And, of course, it seems very likely that there is no "one true mechanism" here, just as autism has no "one true presentation".)

> Neurotypical people lack (or only have remnants of) this second filter.

This statement didn't scan for me, did you mean "autistic" instead of "neurotypical"? (And "neurodivergent" doesn't fit, there are many more neurodivergences than just autism.) Everything I know about neurotypicality does indicate something like that "second filter" is present (if possibly distributed).


Both have the selective attention filter. Autistic people also have another, dimensionality reduction filter.

So in autistic people it goes (raw sensory data) --> [dimensionality reduction] --> (latent space of abstract features) --> [selective attention] --> (higher thinking)

While for neurotypical people it goes (raw sensory data) --> [selective attention] --> (higher thinking).


Positing the emergence of an entirely distinct layer of processing is a pretty extreme divergence. It goes counter to a lot of established theory and, in general, seems like too big of a difference. Abstract feature determination hardly seems to be unique to neurodivergence.

I think any common neurodivergences that still result in more-or-less functioning adult brains are going to appear as different weightings or emphases, not entirely different effective structures.

One possible contribution to the behavior you are seeing is that sometimes brain behavior that is missing or ineffective gets lifted to the cognitive (conscious or semiconscious) layer. The cognitive layer of the brain can do, essentially, anything. But it has to work a lot harder than the lower layers, so you notice it a lot more. This can result in "squeaky wheel" syndrome, where one counterintuitively notices only the things that aren't working as well as they could be.


The filter is obviously supposed to be there; it's "neurotypical" people who have a problem.

It means you need to use your higher thinking to do sensory processing. It's like if your GPU had burned out. So you had to find a way to do everything on the CPU. It sucks, and you don't have much capacity left for the actual thinking.


>lack (or only have remnants of)

I'd say they have been successfully precluded from developing it.

Instead, they've learned to substitute its functionality with the quasi-religious faith that they are actually any good at inferring what others think. (Take that precept away and see em flail, it's disturbing.)

At population scale, this resolves to either mass violent panic or a society living under the constant self-fulfilling prophecy that fewer things are thinkable than those which are possible, while screaming that it's the other way around. (Instead of, you know, aiming for the parity between interpretation and reality which is necessary to accomplish anything at all.)

The main neurotypical trait is lack of inherent revulsion to delusion.


> delusion

I hope you see the irony some day.


Why, how would that help you?


Because I have empathy for other people (on top of an innate sense of smugness)


Empathy is so where it's at. Especially after a few decades of rats in boxes taught the corporate parrots to bring themselves to say the word.

So, then: when was the last time you expressed empathy, and whom did it help?

And also: when was the last time you expressed empathy for someone you were told you should have no empathy for?

Cmon, give us that sweet, sweet emotional vulnerability. Because, surprise for whoever's not looking: insisting that a legitimate society can be built on the magical belief that performing an emotional emulation somehow equates to giving a damn about someone's actual well-being - that kind of thing is a big part of what's wrong with you guys.


Those are some pretty extreme views. I hope you find a place in this world where you fit in. And I also hope that I am very, very far from that place. It doesn't sound like it would make for a society that matches my values.


This is an extremely typical reaction.

Hey, I understand if what I'm saying is making you upset. Not necessarily why, but let's say I can imagine.

Your countersignal references "a place in the world to fit in", which presumes a strong belief that the world fits together, right?

You're pretty invested in the view that social consensus is essentially fair, and does not, for example, hinge on tragicomical amounts of epistemic sleight-of-hand, or anything like that?

You'd find living with that sort of awareness kinda depressing, over the long run prone to lead you to what they call them bad places?

Yeah, well.

In any case. What it would be helpful of you to be aware of. And I don't usually mean things in that sort of sense anymore - but this time I do mean it in the sense of "do this to make the world a better place" helpful: please remain mindful that the world that gave my "views", if they can even be called that, is the same one that gave you yours. There's no essential difference, beyond the paths we've walked through it (which I'm happy to confirm are different enough that you're completely safe from any harmfulness that you've been so kind as to proactively ascribe to me, thank you very much, jfc)


I don't think our views of the world are similar enough to have any sort of meaningful discussion. Good luck (sincerely).


Our views of acceptable utterances differ, which is basically the same thing, right? All the best to you as well :-)


slowpoke edit: s/expressed/experienced/


Unthinking the unthinkable is typical delusion elision.


No, the core difference is the level of abstraction, and the inability to communicate between people with different levels of abstraction.

The opposite of autism is schizophrenia, where abstract thinking fails completely, and the person is unable to find correct answers to everyday problems.

Neurotypicality is merely a socially acceptable level of schizophrenia.

Autism results when the level of abstraction is significantly higher than the surrounding society:

You can't automatically understand the concrete speech, and you especially can't understand the "implications" that rely on the concrete magical thinking.

People can't understand you, because their level of abstraction isn't sufficient to understand the actual meaning, so they assume you talk about something random.

People overread the gaze of more abstract thinkers, and underread the gaze of less abstract thinkers, due to the difference in the field of view.

Compare Taylor Swift (ultra concrete, easily understood by neurotypical people) vs Rihanna (very abstract)


> The opposite of autism is schizophrenia, where abstract thinking fails completely, and the person is unable to find correct answers to everyday problems.

I've heard this one before, but expressed in causal fallacy terms.

Correlation doesn't imply causation, but it's often right to assume a causal relationship even though you are unaware of the underlying mechanism.

If your criteria for assuming causality are too lax, you are schizophrenic (seeing relationships that aren't there); if they are too strict, you are autistic (missing the obvious).


This is correct, except the last paragraph.

Indeed, correlation equals causation for the schizophrenic.

Which is why you often don't get neurotypical people - when they say A, they often mean to imply B, because they correlate.

You are not too strict when you're autistic, you simply hallucinate less.

