Author: A.Degives Mas (20 Apr 11 4:09pm)
Sidenote: I'd be careful about giving Google (or any other large SE) a pass just because an IP happens to fall within their address space. I see a lot of poisoned search submissions used as probes for certain features on my sites. Lately I'm seeing e.g. Google App Engine added to the mix, which can be sorted out (i.e. blocked) on user agent. So I'm actively turning Googlebots away from my site with a stern 403 whenever they request a non-canonical URI. Sadly that's no remedy for the fundamental problem, but at least Google is learning, so over time they'll get it right.
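Roughly what that screening looks like, sketched in Python (not my actual server rules; the `canonical()` check and the UA tokens are illustrative stand-ins for whatever your site's canonical-URL logic and bot list are):

```python
# Screen crawler requests instead of whitelisting by IP range:
# a known bot UA asking for a non-canonical URI gets a 403.

BOT_TOKENS = ("Googlebot", "AppEngine-Google")  # illustrative UA substrings


def canonical(path: str) -> bool:
    # Stand-in canonicality test; in practice compare the request
    # against your site's canonical URL map or rewrite rules.
    return path == path.lower() and not path.endswith("/index.html")


def screen_request(user_agent: str, path: str) -> int:
    """Return the HTTP status to serve: 403 for a crawler hitting a
    non-canonical URI, 200 otherwise."""
    if any(tok in user_agent for tok in BOT_TOKENS) and not canonical(path):
        return 403
    return 200
```

So `screen_request("Googlebot/2.1", "/Some/Mixed-Case-URI")` draws the 403, while the same bot fetching the canonical lowercase path, or a regular browser fetching anything, gets through.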
Bottom line, Google & Co shouldn't be whitelisted just because it's Google (or Yahoo or Bing or...) when they schlep in cruddy traffic.