If your robots.txt file allows crawling (e.g. Allow: /) then you did. If you have no robots.txt file then it's an open question. If you put Disallow: / into your robots.txt file, Google will stop scraping your site.
The implicit contract is that you let them scrape because you want to show up in their search results, which will send you traffic. If you don't care about Google traffic, then set Disallow: / in your robots.txt and get back the bandwidth you were giving them.
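For anyone unsure of the exact syntax, a minimal robots.txt that tells Google's crawler to stay off the whole site looks like this (Googlebot is the user-agent string Google's main crawler identifies itself with):

    # Tell Google's crawler not to fetch anything on this site
    User-agent: Googlebot
    Disallow: /

Swap Googlebot for * and every well-behaved crawler will treat the whole site as off-limits, not just Google's.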
> If you have no robots.txt file then it's an open question.
Only under some definition of "explicit" that I must be unfamiliar with.
If the presence of a robots.txt makes one's intent for a given resource explicit one way or the other, then the lack of one (and of any communication through some other channel) must mean there is no explicit permission.
That is correct. For what it's worth, IBM's legal team came down on the side of "assume deny," and Google (at the time I was there) was "assume allow."
Google scrapes several of my sites and I've never given Google explicit permission to do so.