
Why wouldn’t the fake output mimic real human output after some light editing?

And it’s not like a live video call, where a few seconds delay would be noticeable.

There would still be a real human being, just as smart as you, behind the LLM, but pretending to be, say, 20 different lower-skilled people.



Yes, it would mimic real human output. That's why we have to verify when the subject matter is important. Knowing whom to trust isn't easy; if we need to be sure, we have to do some work.


I still don’t get how this helps you differentiate between them… or do you mean we should treat both as genuine human output regardless, as long as the information proves to be true?


If the information proves to be true, what does it matter?


Because your interlocutors would be 20 people who don’t exist… and would take up 20x more of your time than a single low-skilled individual.

Yes, even scoundrels may supply you with true information for some period of time, but eventually they will try to pursue their actual goals…



