I don't get why the AI scrapers need to scrape the same sites and pages over and over again.

Because the scrapers are poorly written. Efficiency is not a concern for them.
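
HTTP already has the machinery to make a re-crawl nearly free: conditional requests. Here is a minimal sketch of what a well-behaved crawler could do, assuming the Python "requests" library (the URL and cache file are made up for illustration). A scraper that skips this re-downloads every page in full on every pass; a server that supports validators instead answers 304 Not Modified with no body, so a repeat visit costs almost nothing.

    # Sketch of a polite re-crawl using HTTP conditional requests.
    # Assumes the third-party "requests" library; URL and cache file are illustrative.
    import json
    import requests

    CACHE_FILE = "validator_cache.json"  # hypothetical cache of ETag/Last-Modified per URL

    def load_cache():
        try:
            with open(CACHE_FILE) as f:
                return json.load(f)
        except FileNotFoundError:
            return {}

    def fetch(url, cache):
        headers = {}
        prev = cache.get(url, {})
        if prev.get("etag"):
            headers["If-None-Match"] = prev["etag"]          # revalidate by ETag
        if prev.get("last_modified"):
            headers["If-Modified-Since"] = prev["last_modified"]

        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code == 304:
            # Unchanged since the last crawl: no body transferred, origin barely touched.
            return None

        # Changed (or first visit): remember the validators for the next crawl.
        cache[url] = {
            "etag": resp.headers.get("ETag"),
            "last_modified": resp.headers.get("Last-Modified"),
        }
        return resp.text

    if __name__ == "__main__":
        cache = load_cache()
        page = fetch("https://example.com/some/page", cache)  # illustrative URL
        with open(CACHE_FILE, "w") as f:
            json.dump(cache, f)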

Feels like parasites killing their hosts by DDoSing them.