As another example, I used to work on a site that was, roughly, hotel stays. A regular person might search for places to stay in a small set of areas, over a handful of date ranges, and usually for the same number of people.
Bots would routinely try to scrape pricing for every combination of {property, arrival_date, departure_date, num_guests} over the next several years. The load to serve that would have been vastly higher than the load from real customers, but our frontend was mostly pretty good at filtering them out.
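To see the scale of that, here's a back-of-the-envelope sketch. All the numbers are my own illustrative assumptions, not the site's actual figures:

```python
# Rough count of price lookups for one full scrape of
# {property, arrival_date, departure_date, num_guests}.
properties = 10_000      # assumed catalog size
arrival_days = 2 * 365   # arrival dates over the next ~2 years
max_stay = 30            # departures within 30 nights of arrival
guest_counts = 6         # 1 through 6 guests

scrape_requests = properties * arrival_days * max_stay * guest_counts
print(f"{scrape_requests:,}")  # 1,314,000,000 lookups for a single pass
```

Over a billion pricing lookups for one pass, versus a real customer who might do a few dozen searches before booking.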
We also served some legitimate partners who wanted basically the same thing via an API... and that load was in fact enormous. But at least then it was a real partner with some kind of business case that would ultimately benefit us, and we could make some attempt to be smart about what they asked for.
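One example of what "being smart" can look like (a hypothetical sketch, not how our system actually worked): since partners re-request the same combinations constantly, a short-TTL cache in front of the expensive pricing call absorbs most of the repeat traffic.

```python
import time
from functools import lru_cache

PRICE_TTL_SECONDS = 15 * 60  # assumed freshness window for a quote

def fetch_quote_from_backend(property_id, arrival, departure, guests):
    # Stand-in for the real (expensive) pricing computation.
    return hash((property_id, arrival, departure, guests)) % 500 + 50

@lru_cache(maxsize=1_000_000)
def _cached_quote(property_id, arrival, departure, guests, ttl_bucket):
    # ttl_bucket changes every PRICE_TTL_SECONDS, so identical queries
    # within one window share a cache entry and old entries expire.
    return fetch_quote_from_backend(property_id, arrival, departure, guests)

def quote(property_id, arrival, departure, guests):
    bucket = int(time.time() // PRICE_TTL_SECONDS)
    return _cached_quote(property_id, arrival, departure, guests, bucket)
```

A partner crawling the same combinations repeatedly then hits the cache instead of the backend, at the cost of quotes being up to 15 minutes stale.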