Why wouldn't a max depth (which I always implement whenever I write a crawler) prevent the issues described here? Am I overlooking something? Or does this run on the assumption that the crawlers being targeted are so greedy that they have no max depth or cap on pages per domain? A sketch of what I mean is below.

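For context, here is a minimal sketch of the kind of limits I mean, assuming a simple BFS frontier; `fetch_links` is a hypothetical stand-in for the actual HTTP fetch and link extraction:

```python
from collections import deque
from urllib.parse import urlparse


def crawl(seed_url, fetch_links, max_depth=5, max_pages_per_domain=1000):
    """Breadth-first crawl bounded by link depth and a per-domain page budget.

    fetch_links(url) is assumed to return an iterable of absolute URLs
    found on the page at `url` (hypothetical stand-in for real fetching).
    """
    seen = {seed_url}
    pages_per_domain = {}
    frontier = deque([(seed_url, 0)])  # (url, depth from seed)
    visited = []

    while frontier:
        url, depth = frontier.popleft()
        domain = urlparse(url).netloc

        # Per-domain page budget: once a domain has used its quota, skip it,
        # so an endlessly generated site can't consume the whole crawl.
        if pages_per_domain.get(domain, 0) >= max_pages_per_domain:
            continue
        pages_per_domain[domain] = pages_per_domain.get(domain, 0) + 1
        visited.append(url)

        # Depth cap: don't follow links deeper than max_depth clicks from
        # the seed, which bounds traversal of an infinite link maze.
        if depth >= max_depth:
            continue
        for link in fetch_links(url):
            if link not in seen:
                seen.add(link)
                frontier.append((link, depth + 1))

    return visited
```

Either limit alone already bounds how much of a generated link maze the crawler can visit, which is why I'd expect these traps only to catch crawlers that don't bother with such caps.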
