While doing research on robots.txt files, the files that sit at the root of a web site and tell visiting search engine spiders which pages and directories to skip when they index your site, I found this article at codeulate.com regarding the new whitehouse.gov robots.txt file.
Two days ago, under the Bush administration, the file listed more than 2,400 directories on whitehouse.gov that search engines were told to skip. Paranoid much?
Under Obama, one.
Before:
[The old robots.txt file was embedded here; it ran to more than 2,400 Disallow lines.]
After:
User-agent: *
Disallow: /includes/
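To see what that two-line file actually does, here is a quick sketch using Python's standard urllib.robotparser to check URLs against it. The specific URLs below are hypothetical examples, not paths taken from the site:

```python
from urllib.robotparser import RobotFileParser

# Parse the new whitehouse.gov robots.txt rules directly from a list of lines.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /includes/",
])

# Anything under /includes/ is off-limits to all crawlers...
print(rp.can_fetch("*", "https://www.whitehouse.gov/includes/header.html"))  # False

# ...and everything else is fair game.
print(rp.can_fetch("*", "https://www.whitehouse.gov/blog/"))  # True
```

In other words, crawlers are asked to skip a single directory of shared page fragments, and the rest of the site is open to indexing.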