I’m such a doofus!

Well, sometimes I am.

Several weeks ago—perhaps even longer—I was tinkering with the site’s robots.txt file.  My intention was to add a few new exclusions for those robots that actually read and adhere to its directives.

One thing I did was disable the Snap preview for my site.  I happen to hate that thing, and I especially don’t like that it caches my site in order to generate the preview.  I’ve disabled all caching of my site for all robots, yet Snap refused to follow that directive.  So I disabled it entirely.
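For reference, shutting out a single crawler while leaving the rest alone takes only a two-line rule. The user-agent token below is a placeholder, since the post doesn’t show the exact name Snap’s crawler sends:

```text
# Hypothetical user-agent token — substitute whatever the
# Snap preview crawler actually identifies itself as:
User-agent: SnapPreviewBot
Disallow: /
```

A `Disallow: /` under a specific `User-agent` applies only to that one robot; every other well-behaved crawler ignores it.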

But in the course of editing the file, I apparently started a new restriction for all robots that I forgot to finish.  I ultimately placed the updated file on the server with a “Disallow: /” under “User-agent: *”.  In effect, that barred every robot from indexing the site at all, including Yahoo!, Google, and everything else.
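The difference between locking everyone out and letting everyone in comes down to a single character. Based on the post, the file that went live looked like the first rule below, when the harmless version would have been the second:

```text
# What accidentally went live — the unfinished wildcard rule
# tells every crawler to stay away from the entire site:
User-agent: *
Disallow: /

# The harmless equivalent — an empty Disallow value
# (or omitting the rule entirely) allows everything:
User-agent: *
Disallow:
```

Because `*` matches all user agents, that one stray `/` overrode everything else in the file as far as compliant crawlers were concerned.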

Um…  Major oops there!

I truthfully don’t recall when that happened, other than that it was at least several weeks ago.  That means for probably a month or more this site hasn’t been indexed in any of the legitimate search engines.

After discovering that mistake today, I removed the restriction and uploaded the new file, so I would guess my index entries will be updated in the next day or two (most check the robots.txt once per day and act accordingly).

Meanwhile, I’ve locked the crack pipe in a cupboard and thrown away the key in hopes the move will prohibit such egregious errors in the future.  I’m not holding my breath…
