red2awn | 5 months ago | on: Defeating Nondeterminism in LLM Inference
Conceptually, setting temperature > 0 doesn't actually introduce any nondeterminism. If your sampler is seeded, it will always choose the same next token; higher temperature only flattens the logit distribution.
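A minimal sketch of the point, using a made-up logit vector and NumPy's seeded generator (names and values here are illustrative, not from any particular inference stack):

```python
import numpy as np

def sample_next_token(logits, temperature, seed):
    # Temperature divides the logits before softmax: higher T flattens
    # the distribution, lower T sharpens it. Either way, the resulting
    # distribution is a deterministic function of (logits, temperature).
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()  # subtract max for numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    # A seeded RNG makes the draw from that distribution reproducible too.
    rng = np.random.default_rng(seed)
    return rng.choice(len(probs), p=probs)

logits = [2.0, 1.0, 0.5, -1.0]
draws = [sample_next_token(logits, temperature=1.5, seed=42) for _ in range(5)]
# Same logits, same temperature, same seed: every draw is identical.
assert len(set(draws)) == 1
```

Even at high temperature the sampler is a pure function of its inputs; the nondeterminism people observe in practice comes from elsewhere (e.g. non-reproducible floating-point reductions), not from the temperature parameter itself.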