
Not for a typical website. Even if you're getting a massive amount of traffic, like a million requests per second, you'd want each request to finish as quickly as possible so they never pile up into 1M concurrent tasks. If the tasks have real work to do and can't finish quickly, having 1M in-flight tasks compete for orders of magnitude fewer CPU cores may not be a good idea: it wastes a lot of memory on their state, and it's hard to keep latencies reasonably even. You'd probably use a simpler queue and/or load-balance across more machines.

A scenario with this many tasks spawned is more realistic if you have mostly idle connections. For example, a massive chat server with tons of websockets open.
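A small sketch of why idle connections make this workload cheap, using Python's asyncio (my example, not from the comment): each parked task is just a coroutine object waiting on an event, not a thread, so spawning a huge number of them costs little CPU and modest memory.

```python
import asyncio

async def idle_connection(stop: asyncio.Event) -> None:
    # A mostly idle "connection": parked on an event, consuming no CPU
    # while it waits, the way an open-but-quiet websocket would.
    await stop.wait()

async def main() -> int:
    stop = asyncio.Event()
    # 100k idle tasks is cheap; each is a small coroutine object
    # suspended in the event loop, not an OS thread.
    tasks = [asyncio.create_task(idle_connection(stop)) for _ in range(100_000)]
    stop.set()          # "wake" every connection at once
    await asyncio.gather(*tasks)
    return len(tasks)

print(asyncio.run(main()))
```

The same experiment with OS threads would need gigabytes of stack space; the coroutine version is why chat-server-style workloads are the natural fit for huge task counts.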



You might not have a choice. How else would a load balancer be implemented? It has to keep those connections alive while the underlying services fulfill the request.
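To illustrate the point, here is a minimal TCP-proxy sketch in Python asyncio (my own illustration, not anyone's production design): every accepted client connection forces the proxy to hold two lightweight tasks open for the full lifetime of that connection, which is exactly the "many long-lived tasks" shape being discussed.

```python
import asyncio

async def pipe(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    # Copy bytes in one direction until EOF. This task stays alive,
    # mostly idle, for as long as the proxied connection does.
    try:
        while data := await reader.read(65536):
            writer.write(data)
            await writer.drain()
    finally:
        writer.close()

async def handle_client(client_r: asyncio.StreamReader,
                        client_w: asyncio.StreamWriter,
                        backend_host: str, backend_port: int) -> None:
    # One accepted connection = one backend connection plus a pair of
    # pump tasks that must live until either side hangs up.
    backend_r, backend_w = await asyncio.open_connection(backend_host, backend_port)
    await asyncio.gather(
        pipe(client_r, backend_w),   # client -> backend
        pipe(backend_r, client_w),   # backend -> client
    )
```

The balancer can't "finish quickly" here by design: its job is precisely to keep the connection open while the backend does the real work.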



