
Maybe this is an ignorant question, but why do people always cite GC latencies in seconds? Is there an implied reference machine where the latency is 300ms? I would expect it to vary a lot in the wild based on CPU frequency and cache/memory latency. Is there some reason why this doesn't matter as much as I think it does?


How else is it going to be cited?

People understand seconds, and any other measurement would require specifying a lot of computer-specific detail. And if you're going to do that, you might as well fully specify the workload too, to head off those questions before they're asked.

It isn't meant to be a precise answer; it's meant to put the GC's performance broadly in context.
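If you want to know what the numbers look like on your own hardware, the pause times are easy to measure directly. Here's a minimal sketch, assuming the GC under discussion is Go's (the same idea applies to any runtime that exposes pause statistics): it churns through allocations, then reads the recent stop-the-world pause times from runtime.MemStats.

  package main

  import (
      "fmt"
      "runtime"
      "time"
  )

  var sink []byte // package-level sink so the allocations aren't optimized away

  func main() {
      // Churn through allocations so the collector has work to do.
      for i := 0; i < 500; i++ {
          sink = make([]byte, 1<<20) // 1 MiB, becomes garbage on the next iteration
      }
      runtime.GC() // force at least one collection before reading stats

      var m runtime.MemStats
      runtime.ReadMemStats(&m)

      // PauseNs is a circular buffer of the most recent stop-the-world pause times.
      n := int(m.NumGC)
      if n > len(m.PauseNs) {
          n = len(m.PauseNs)
      }
      for i := 0; i < n; i++ {
          idx := (int(m.NumGC) - 1 - i) % len(m.PauseNs)
          fmt.Printf("GC %d pause: %v\n", int(m.NumGC)-i, time.Duration(m.PauseNs[idx]))
      }
  }

The exact figures you get depend on the CPU, heap size, and allocation rate, which is exactly the variance the parent is asking about; the citations in seconds are just a ballpark, not a spec.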




