dave miller from portland writes in something about evil robots in the whitehouse

I meant to post this last week, but I never got around to it. It’s from my Monkey Power Trio band-mate Dave.

I was perusing the news this afternoon and thought I’d pass on some links:

Here’s some interesting stuff on Bush manipulating the internet in his favor.

And this link is more interesting since it’s still on the whitehouse.gov server.

First, some background:
Search engines gather information about websites by sending out computer programs (robots, or crawlers) that automatically surf the net, follow the links, and log the contents of the pages they find. There is a protocol that lets a website politely ask these programs to skip certain pages: you place a simple text file named “robots.txt” at the root of your site and list the paths you don’t want archived. Presumably, the robots will skip the pages you want them to skip, and those pages won’t turn up in any search engine results.
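
In case the mechanics are unfamiliar, here is a minimal sketch of that handshake in Python, using the standard library’s urllib.robotparser. The sample Disallow rules are made up for illustration and are not taken from any real site’s file.

```python
# A minimal sketch of how a well-behaved crawler honors robots.txt,
# using Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

# Illustrative rules only (not any real site's file): ask all robots
# to skip anything under /drafts/ and /private/.
SAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /drafts/
Disallow: /private/
"""

parser = RobotFileParser()
# parse() takes the file's lines; a real crawler would normally call
# set_url("https://example.com/robots.txt") followed by read() instead.
parser.parse(SAMPLE_ROBOTS_TXT.splitlines())

for path in ("/index.html", "/drafts/press-release.html"):
    allowed = parser.can_fetch("*", path)
    print(f"{path}: {'crawl' if allowed else 'skip (site asked robots not to index this)'}")
```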

So, what information does the current administration not want archived (in case they need to revise past statements, for example)? Take a guess. And check out your government’s robots.txt file.
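
If you want to look for yourself, a few lines of Python (or just visiting the URL in a browser) will list the paths a site asks crawlers to skip. The whitehouse.gov address below is an assumption based on the post’s pointer to “your government’s” file, and whatever is in it today will of course differ from what Dave saw at the time.

```python
# Fetch a site's robots.txt and print just the Disallow lines.
from urllib.request import urlopen

with urlopen("https://www.whitehouse.gov/robots.txt") as resp:
    text = resp.read().decode("utf-8", errors="replace")

for line in text.splitlines():
    if line.lower().startswith("disallow"):
        print(line)
```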
