Hacker News

A few more clues which are demonstrated, but not explained, by the response above:

* Begins with a slight rewording of your question, rephrased as the introduction to the answer

* Ends with at least a sentence or two of CYA-type caveats, like "ultimately it will depend on", "there are many factors", etc.

* Writes in full sentences even when not strictly necessary; e.g. this bullet point would be phrased as "Text generated by a language model would be more likely than average to be written in full sentences..."

* Avoids pronouns when any ambiguity is present, even if the referent would be obvious to a human. E.g., if the first sentence is "Robert and Joe were playing football when Joe injured his hand", a human might continue: "He tried to keep playing but it hurt too badly." ChatGPT would be more like: "Joe tried to keep playing but his hand hurt too badly."

(this was NOT written by ChatGPT :)
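The clues above could be sketched as a toy heuristic scorer. This is purely illustrative: the phrase list, the pronoun set, and the thresholds are invented for the example, and none of this is a reliable detector.

```python
import re

# Hypothetical phrase list for the "CYA-type caveats" clue; invented for illustration.
CAVEAT_PHRASES = [
    "ultimately it will depend on",
    "there are many factors",
    "it is important to note",
]

def caveat_score(text: str) -> int:
    """Count hedging phrases appearing in the last two sentences of the text."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    tail = " ".join(sentences[-2:]).lower()
    return sum(phrase in tail for phrase in CAVEAT_PHRASES)

def pronoun_ratio(text: str) -> float:
    """Rough ratio of third-person pronouns to all words. Per the clue above,
    an unusually low ratio in narrative text may hint at pronoun avoidance."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    if not words:
        return 0.0
    pronouns = {"he", "she", "it", "they", "him", "her", "them"}
    return sum(w in pronouns for w in words) / len(words)

sample = ("Joe tried to keep playing but his hand hurt too badly. "
          "Ultimately it will depend on the severity of the injury.")
print(caveat_score(sample), round(pronoun_ratio(sample), 3))
```

In practice these signals are weak on their own; a real classifier would combine many such features, and even then the parent thread's point stands that none of them are conclusive.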



The regularity with which it builds answers in this form makes me think they must have examples of it embedded in the prompt.


