what whitehouse.gov doesn’t want you to see

A robots.txt file is a small text file placed in the root directory of a web server that tells search engine crawlers which files and directories should be excluded from being indexed for internet searches.
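
the format itself is dead simple — just user-agent lines followed by disallow rules. here's a made-up sketch of the syntax (not the actual whitehouse.gov file):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

a crawler that honors the protocol reads this before indexing anything and skips whatever's listed under Disallow. it's purely advisory, though — nothing actually stops a person (or a rude bot) from fetching those paths, which is exactly why the file itself is so interesting to read.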

the white house has one of these too.

http://www.whitehouse.gov/robots.txt

copy the link into a new browser window and see for yourself. if you have any idea why those particular files/directories are on the searchbot blocklist, drop me a line, cuz i’m rather curious…
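
if you'd rather poke at it from a script than a browser, something like this (python, using only the standard library) pulls the file down and checks a path against it — the path here is just a placeholder, substitute anything from the list you see:

    import urllib.request
    import urllib.robotparser

    ROBOTS_URL = "http://www.whitehouse.gov/robots.txt"

    # dump the raw file so you can read the blocklist yourself
    with urllib.request.urlopen(ROBOTS_URL) as resp:
        print(resp.read().decode("utf-8", errors="replace"))

    # ask the stdlib parser whether a well-behaved crawler may fetch a given path
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(ROBOTS_URL)
    rp.read()
    print(rp.can_fetch("*", "http://www.whitehouse.gov/example/path"))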
