Hacker News
slashcom on Aug 20, 2019 | on: GPT-2: 6-Month Follow-Up
fp16 saves a lot of memory and is worth doing. I've not had trouble fine-tuning all these models with fp16.
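A rough back-of-envelope sketch of the memory claim (an illustration, not from the thread): fp32 stores each parameter in 4 bytes and fp16 in 2, so casting the weights of the 774M-parameter GPT-2 model to half precision roughly halves the memory they occupy. The parameter count below is the model's advertised size, used here only for arithmetic.

```python
# Back-of-envelope weight memory for GPT-2 774M (illustrative only).
N = 774_000_000          # advertised parameter count of the 774M model
FP32, FP16 = 4, 2        # bytes per value in each precision

weights_fp32_gb = N * FP32 / 1e9   # ~3.10 GB
weights_fp16_gb = N * FP16 / 1e9   # ~1.55 GB
print(f"fp32 weights: {weights_fp32_gb:.2f} GB")
print(f"fp16 weights: {weights_fp16_gb:.2f} GB")
```

Note this counts only the weights; in practice gradients, optimizer state (Adam keeps two extra moment tensors per parameter), and activations add substantially more, which is why half-precision weights alone do not tell the whole single-GPU story.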
sdan on Aug 21, 2019
Have you fine-tuned 774 successfully using a single GPU?