I think so, yes. We rely a lot on eloquence and general knowledge as signals of competence, and LLMs beat most people at these. That's the "usually" -- I don't think good human bullshitters are more obvious than LLMs.
This may not apply to you if you regard LLMs, including their established rhetorical patterns, with greater suspicion or scrutiny (and you should!). It also doesn't apply when you're talking about subjects you know well. But if you're chatting about things you're not knowledgeable about, and you treat the LLM like any human, I think it holds. There's a reason LLM psychosis is a thing; rhetorically, these things can rival the persuasive ability of a cult leader.
I think I'm going to have to disagree. When people tell you something incorrect, they usually believe it's correct and are genuinely trying to help. So it comes across with full confidence, helpfulness, and a trustworthy attitude. Plus people often come with credentials -- PhDs, medical degrees, etc. -- so we're even more caught off guard when they turn out to be completely wrong about something.
On the other hand, LLMs are just text on a screen. They carry none of the human signals that tell us someone is confident, trustworthy, or trying to help. It "feels" like any random blog post from someone I don't know. So it makes you want to verify it.