Yesterday, the robots.txt file for whitehouse.gov contained roughly 2,400 lines of files and directories that search engines were not allowed to index. Today, the entire file is two lines long:

User-agent: *
Disallow: /includes/

For comparison, here's how yesterday's 2,400-line version began:
User-agent: *
Disallow: /cgi-bin
Disallow: /search
Disallow: /query.html
Disallow: /omb/search
Disallow: /omb/query.html
Disallow: /expectmore/search
Disallow: /expectmore/query.html
Disallow: /results/search
Disallow: /results/query.html
Disallow: /earmarks/search
Disallow: /earmarks/query.html
Disallow: /help
Disallow: /360pics/text
Disallow: /911/911day/text
Disallow: /911/heroes/text
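And so on, for thousands more lines. If you're curious what a well-behaved crawler actually does with rules like these, Python's standard library includes a robots.txt parser. Here's a minimal sketch that checks a couple of hypothetical whitehouse.gov URLs against the new two-line file (the URLs are made up for illustration):

# A minimal sketch using Python's built-in robots.txt parser.
# The two rule lines are the new whitehouse.gov file quoted above;
# the URLs tested below are hypothetical examples.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /includes/",
])

# Anything under /includes/ is off-limits to all crawlers...
print(rp.can_fetch("*", "http://www.whitehouse.gov/includes/header.html"))  # False
# ...while everything else may be crawled and indexed.
print(rp.can_fetch("*", "http://www.whitehouse.gov/agenda/"))  # True

Keep in mind that robots.txt is purely advisory: it only works because well-behaved crawlers choose to honor it.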