The thing I always tell people who heavily trust its output is to ask it about a topic they already know the answer to or are something of an expert in; the flaws become much more evident.
The old joke is that you can get away with anything with a hi-vis vest and enough confidence, and LLMs pretty much work on that principle.