Archive for the ‘Site Specific’ Category

Moving Day!

I’m moving the blog! The new blog can be found at:

The last month of news posts has been moved to the new site along with your comments. Almost everything should be intact there. I will be adding new features like polls to the new site for some more fun and goofiness. Keep an eye out over there!

Who are the faceless masses?

Ok, so I’ve been doing this blog thing for a while now, and I used to do it long before it had a name. Back then it was just writing for a personal website. Shoot, I should have named it WFPW or something catchy like blogging. Ah well, can’t win ’em all. One thing I have not had much luck with is really having contact with my audience. For some reason I rarely get comments, and if I do, it’s from someone I already have day-to-day contact with. So if you read this blog, please, please, leave me comments or shoot me an e-mail. Let me know what you want to see me write more about. Let me know what you think about some of my thoughts and ideas. And to all of you who keep coming back: thank you for reading, and I hope some of the writing has been either helpful or entertaining. Keep an eye out for more.

I’m in the process of moving into a new apartment and making the transition in our work rotation, as well as preparing for a command inspection at drill this weekend. So if I don’t write sometime in the near future, please keep coming back and checking — I will write again soon.

Robots pwn3d my site

I’m not sure how many of my very small number of readers out there have to administer a real website, but this is something that I’m sure is kind of ‘duh’ — yet even I managed to overlook it. Robots.txt is a very important file, especially on a bandwidth-limited server. I knew it was there and I kind of knew what it was used for, but I hadn’t imagined how important it is until the Campus Bible Fellowship website got pwn3d by Googlebot. Googlebot consumed just over 2GB of bandwidth in less than a month. Luckily we have a good amount of bandwidth available (2.5GB, I think) a month for a relatively small website. There was a little panic when I started getting the e-mail notifications that the site was approaching its bandwidth limit. Luckily we didn’t hit this point until late in the month. So this brought back to my attention the need for robots.txt.

Robots.txt is a file that resides in your web root. The file is read by well-behaved robots or crawlers — the automated things that go out and scan the internet to add information to the search engine databases. This file dictates what a robot can and cannot do. You can define and constrain the folders and files that a robot is allowed to look at within the file, using a schema that is very easy to understand. It supports things like wildcards, etc.
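To give you an idea of what the schema looks like, here’s a minimal sketch of a robots.txt (the folder names here are made-up examples, not the actual layout of the Campus Bible Fellowship site):

```
# Rules for all robots
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/

# Rules just for Googlebot; some major crawlers also
# support wildcard patterns like this one, which blocks
# any URL ending in .pdf
User-agent: Googlebot
Disallow: /*.pdf$
```

Each `User-agent` line starts a group of rules, and the `Disallow` lines under it list paths that robot should stay out of. An empty `Disallow:` means “everything is allowed.”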

So the moral of the story here is: do NOT ignore robots.txt as though it’s some formality that can wait until later. The same should be said for any security measure designed to protect what you plan to hang out there for all the world to see — and, it would seem, also abuse.


So I’m trying this blog thing again. We’ll see how much time I’m actually able to find for it. I hope I can do enough thinking and find enough time to write to turn this into a topical sort of thing like Think Christian is. We’ll see how successful this is. I see that I can have other admins/posters, so maybe if I talk Evonne or someone else into writing we can do a joint thing. We’ll just have to see how this goes. Please check back every once in a while, or register for e-mails on updates, as I hope to be adding things fairly frequently.