find-a-bot.sh – a nice little script to ID bots bugging your website

Earlier this week I demonstrated how to block spambots and rogue spiders. Today I’m completing the lesson with a nice little bash script that can help you identify some of these non-browser ‘candidates’ by parsing your access logs and placing the results in an easy-to-read text file.

In other words, this script will selectively find most non-browser user agents that appear in your access logs like this:

24.190.239.220 - - [29/May/2008:05:16:19 -0700] "GET /about HTTP/1.1" 200 628 "-" "Java/1.6.0_06"
79.71.205.134 - - [29/May/2008:00:56:34 -0700] "GET / HTTP/1.1" 200 12888 "-" "Site Sniper Pro"

And turns it into a slightly saner and sorted output like this:

24.190.239.220 [29/May/2008:05:16:19 "Java/1.6.0_06"
79.71.205.134 [29/May/2008:00:56:34 "Site Sniper Pro"

Here is what your bash script might look like on a site running WordPress on a shared host like DreamHost … I’ll explain some of the mechanics afterwards:

#!/bin/bash
#
# step 1 - modify these so you get paths like this:
#   /home/YOURROOT/YOURDOMAIN.COM/...
#
myroot="YOURROOT"
mydomain="YOURDOMAIN.COM"

#
# step 2 - leave alone if these days & formats work for you:
#
TERM=linux
export TERM
tdy=`date +%d%b%y`
ydy=`date -d '1 day ago' +%Y-%m-%d`
dby=`date -d '7 day ago' +%Y-%m-%d`
logfile="access.log.$ydy"

#
# step 3 - modify if you're using something other
#           than  WordPress on DreamHost
#
outfile="/home/$myroot/$mydomain/findabot"
logpath="/home/$myroot/logs/$mydomain/http/"
csspath="/home/$myroot/$mydomain/wp-content"

#
# step 4 - mother of all parsing statements, parse to taste
#	(note this version DOES sort)
#
# 	remember \ at the very end of line equals
#	bash line continuation of a command set
#
grep "$csspath" -v $logpath$logfile | \
  egrep " \"(Mozilla|Opera)\/[0-9]| \"BlackBerry[0-9]{4}" -v | \
  perl -l -a -n -e 'print $F[0]," ",$F[3]," ",$F[11]," ",$F[12]," ",$F[13]' | \
  sort -n > $outfile/$ydy.txt

#
# step 5 - maintain a manageable archive
#
if [ -e $outfile/$dby.txt ]; then
	mv -f $outfile/$dby.txt $outfile/bak.txt
fi

Okay, step 1 basically means you log in to your site via SSH or even FTP and, before navigating anywhere, issue the “pwd” command so you can determine YOURROOT and YOURDOMAIN (though the latter will likely be your website’s URL).
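
For example, on a DreamHost-style shared account the exchange might look something like this – YOURROOT here is obviously a placeholder:

$ pwd
/home/YOURROOT

with your site living one level down in /home/YOURROOT/YOURDOMAIN.COM.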

Step 2 is how we get date stamps for our input and output files. I found a nice simple example of date variable formatting over on an ExpressionEngine manual – but the formats will work in your bash script just fine.

Also, that line containing “7 day ago” can be modified to indicate how many days’ worth of logs you want to keep active. Similarly, the prior line containing “1 day ago” means you want to parse yesterday’s logs.
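
If you’d like to see what those date stamps look like before wiring them in, each format string can be tested at the prompt (sample output assumes a run on 29 May 2008):

date +%d%b%y                      # 29May08    - today, used for $tdy
date -d '1 day ago' +%Y-%m-%d     # 2008-05-28 - yesterday's log suffix
date -d '7 day ago' +%Y-%m-%d     # 2008-05-22 - the archive cutoff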

Step 3 is basically how I use variables to define file and directory paths based on what I coded for steps 1 and 2.

Step 4 combines all the elements from the above steps and, taking a page out of my April 2nd article entitled ‘How to quickly check your error logs for oddities,‘ issues a consecutive stream of grep and/or egrep commands.

Sometimes I leverage the ‘-v’ option to invert the match and exclude lines – most notably when I’m excluding known user agent strings for browsers.

This done, a bit of PERL command-line magic is used to parse out the fields we want, after which the selected data is sorted and piped into the output file defined in step 3.
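
If that one-liner looks opaque: the -a switch autosplits each log line on whitespace into Perl’s @F array, so $F[0] is the client IP, $F[3] the timestamp, and $F[11] through $F[13] the words of the quoted user agent – three fields, because multi-word agents like “Site Sniper Pro” span three. A quick test you can run at the prompt:

echo '1.2.3.4 - - [29/May/2008:05:16:19 -0700] "GET / HTTP/1.1" 200 628 "-" "Site Sniper Pro"' | \
  perl -l -a -n -e 'print $F[0]," ",$F[3]," ",$F[11]," ",$F[12]," ",$F[13]'
# prints: 1.2.3.4 [29/May/2008:05:16:19 "Site Sniper Pro"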

Step 5 takes into account that logs can get big, so this is where we manage an archive … based on step 2 … keeping 7 days’ worth of entries.

If you’re not familiar with creating bash scripts, you may encounter situations where you need to “chmod” or even “chown” the file to get it to work.
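
In the common case that just means marking the script executable, and (if the file landed under the wrong account) handing it back to yourself – for example, YOURUSER being a placeholder:

chmod 755 find-a-bot.sh       # readable & executable by all, writable by you
chown YOURUSER find-a-bot.sh  # only if needed, and only if you have the privs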

The next step – though not documented above – is to test the script and, when you’re sure it’s working, modify your crontab file so your batch job runs every night – say 2:15 AM, while you and everyone else are sleeping. Here’s what my crontab entry looks like:

15 2 * * * /home/YOURROOT/find-a-bot.sh > /dev/null
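
If cron is new to you: the five leading fields are minute, hour, day-of-month, month and day-of-week, so the entry above fires at 2:15 AM every day. Installing it is a two-step affair:

crontab -e    # paste in the line above and save
crontab -l    # verify the entry took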

I’ve provided a .txt version of the file you can simply download from here.

Moreover, I’ve created a slightly more complex version of the above to download, for use on a system running something like vBulletin on a root or virtual private server operating with Fedora or RedHat.

The point is, while the above appears a bit complex, I can assure you it’s worth running, as it can help you quickly discern over the course of a few days:

  • how often and how hard spambots are sniffing your system
  • how much of your bandwidth is consumed by feed readers versus browsers
  • which feed readers are hammering away at your site, ignoring your <skiphours /> and/or <skipdays /> data
  • how much bandwidth you might save by exporting your sermons’ RSS feeds to a service like FeedBurner
  • what spiders are ignoring your robots.txt file
  • tips on unusual visitors from interesting places, gleaned from unique user agents
  • whether or not some of the comment spam comes via “Mozilla-like” agents who botch their user agent string
  • how many of your visitors are infected with spyware
  • how many of your visitors are trying to hide their tracks by visiting you with an anonymous proxy firing blank user agent strings
  • how many spamblogs are leeching your compelling content

Like I said, it will require just a little bash scripting know-how, so with that, I leave you with these tutorials:

Oh and if you’re nice and leave a comment, I might even email you a link to my own archive of greatest bot hits over the past few days.

Especially if you share your own scripting recipes for spotting bots.


How to block spambots by user agent using .htaccess

Spambots and spiders that ignore your robots exclusion file can kill your site, both in bandwidth and by potentially exposing information you don’t want ‘harvested.’ With that in mind, here is a quick-n-dirty guide to blocking spambots and rogue search engine spiders using .htaccess. First the essential example codeblock, followed by a working example:

essential example codeblock

# redirect spambots & rogue spiders to the end of the internet
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^spambot
RewriteRule ^(.*)$ http://www.shibumi.org/eoti.htm#$1 [R=301,L]

Next is to read my article on how to quickly check your error logs for oddities … which should provide you with a list of all sorts of unusual user agents worth blocking.

With said list, all that is left to do is create a working version that instead of sending people to the end of the internet, blocks them outright – which is probably a better move then sending the traffic elsewhere:

real-world/working example

# block spambots & rogue spiders outright
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSearch [OR]
RewriteCond %{HTTP_USER_AGENT} ^Microsoft\ URL [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector
RewriteRule .* - [F,L]
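
Before trusting the rules, it’s worth poking your own site from a shell. Here’s a quick sanity check using curl (example.org is a placeholder for your domain; the -H trick strips the User-Agent header entirely, simulating a blank agent):

curl -I -A "EmailSearch/1.0" http://example.org/   # expect HTTP/1.1 403 Forbidden
curl -I -H "User-Agent:" http://example.org/       # a blank agent should also draw a 403
curl -I http://example.org/                        # curl's own agent should still sail through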

Note I provide 4 examples:

  1. ^$,
  2. ^EmailSearch
  3. ^Microsoft\ URL
  4. ^Web\ Image\ Collector

All to demonstrate how to use perl-like regular expressions to parse out the user agent. For example:

  1. ^ – identifies the beginning of the user agent string
  2. $ – identifies the end of the user agent string
  3. \ – a backslash followed by a space tells the parser to treat the space between words as a literal character
  4. [OR] – is placed after each of the multiple entries, except the last
  5. [NC,…] – is sometimes placed after an entry to match it without concern for upper or lower case

In the process, I’m intentionally blocking empty user agents using .htaccess – “^$” – a search string that uses a regular expression to test for nothing between the beginning “^” and end “$” of a user agent token. Sorry, but if you’re not willing to tell me who/what you are, I’m not willing to show you my content.

Also, be aware the above requires that you have mod_rewrite installed on your Apache server, and that you have privileges to create your own rewrite rules in your own .htaccess file. If you’re not sure, check with your hosting service and/or system administrator.

In most cases, such privs & access exist – but your mileage may vary – as it might in how your particular .htaccess file actually works in the wild.

That said, more tomorrow or Thursday on how to create a cron job to list those “unusual user agents” ‘automagically‘ for easy identification and – if needed – anti-spam remediation.


5 things we can learn from the office candy machine

I just overheard a useful conversation between two vending machine operators while they were loading up our office junk food dispenser with a bunch of products that didn’t sell last week. It is, if nothing else, an object lesson in contrast to my oft-quoted aphorism “solve their problems, don’t tell them yours.” Here are 5 things we can do in contrast to improve the user experience on our church and/or charity websites …

… but first a bit of context.

Last week I was working late, so I went to the vending machine to purchase some sugar-free Dentyne chewing gum. The machine was out – as were most of the low-cal, low-carb consumables. The next day I noted the machine had been restocked, only instead of offering snack solutions for my health-conscious office mates, the replenished racks now included:

  • M&M’s – plain & peanut
  • Spicy Sweet Chili flavored Doritos
  • Mrs. Freshley ‘Original’ Jumbo Honey Buns
  • Mr. Piggy Fried Pork Rinds (I’m not making this up)

Today, while capping off my coffee cup by the adjacent caffeine delivery device, I noted two gentlemen again restocking the junk food dispenser, when one of the machine operators – possibly in training – asked the other “how do you decide what to refill it with?” The other immediately replied “… with whatever I have too much of, that’s how I decide.”

No, no, no, no, NO!

Now granted, both these individuals are likely little more than minimum-wage employees of a large, regional vending concern, so I don’t expect them to understand what’s wrong with said answer. However, the proprietor for whom they work should – and should train and equip these hard workers with the information to become smart workers – that is, provide them with data sheets on what’s hot and what’s not for a given office vending machine.

Same rules apply for our church and charity websites. Here are 5 quick ones off the top of my head:

  1. Collect useful usage data – does your site have a mechanism for collecting useful usage data? Note, I said useful usage data. There are some counters and stats services that provide little more than a “hit count,” which in today’s age of search engines, aggregators, and spammers can lead you down a path to decision perdition by including counts of visits by automated systems not interested in filling your pews. If you don’t know where to start, may I suggest incorporating Google Analytics on your website. Though not real-time, at the end of the day it does provide you with a great idea of both what humans and what bots are banging away at your site.
  2. Understand where the traffic is coming from – that is understand:
    • What search engines are sending the most traffic?
    • What keywords are being used with said search engines?
    • What keywords are being used on your own website’s search tool?
    • What other websites are sending you traffic?
    • How much direct traffic is there?
    • How much traffic is being driven from email applications?
    • How much traffic is being driven from aggregators?
  3. Understand what pages are hot – and also understand why pages are not. This should be pretty straight forward, but along with asking which page gets the most visits, also ask:
    • Entry pages – which page is the first one viewed by a visitor and/or which pages most attract visitors?
    • Exit pages – from which pages do visitors leave the most?
    • Average visit duration – what is the total length of a user’s visit?
    • Average page duration – how long is each page viewed?
    • Top path – what is the leading sequence of pages viewed by visitors from entry to exit?
    • Bounce rate – are there pages which users leave without visiting any other pages before a specified session-timeout occurs?
    • Error messages – where is the user experiencing some level of frustration due to errors? Information which could explain your high exit/bounce rates.
  4. Understand why pages are hot – In other words, regardless of which analytics tool you use, most can only point out WHAT pages are popular. That’s only half the picture. What you need to do is figure out WHY said pages are hot. Two examples from blogs I run, each of which has pages that appear to be popular due to their relevance to recent events:
    • HealYourChurchWebsite’s most popular article is “How I fixed my Windows XP Stop c000021a {Fatal System Error} with Knoppix Linux,” with the largest amount of traffic being driven in from Google searches on the keyword/phrase “Stop c000021a” … probably in response to the several crashes caused by a recent Windows XP SP3 release that some speculate is responsible for said error.
    • BlogJordan.com’s most popular article is “The Petra Treasury Indiana Jones didn’t show you” – usually by means of Google and YouTube searches on the keyword/phrase “Petra Indiana Jones” and/or “Indiana Jones Petra” … probably in response to recent showings of “Indiana Jones and the Last Crusade” on cable in anticipation of the release of “Indiana Jones and the Kingdom of the Crystal Skull.”
  5. Establish and track conversion goals – I’m not talking about instances of baptisms or individuals asking Christ into their lives – though that is the ultimate goal here – rather I’m talking in this case of establishing goals and objectives for the website such as getting visitors to:
    • subscribe to the RSS feed for your sermon titles/series;
    • import your events calendar into their calendar program of choice;
    • link your site on theirs;
    • fill out a “send me more information” form;
    • print your page that displays both directions and times of services; and/or
    • visit again, and again … and again.

In other words, it doesn’t matter how cool, how Flashy, how seeker-centric, how Y-generation, how usable, or how XHTML-compliant your pages are … if you don’t collect useful metrics on your church and/or charity online presence, you’re as good as flying blind …

… which is okay until you crash into the mountainside of “no website visitors because you’re wasting gifts and talents on the wrong things” … or perhaps using none at all, as described in Matthew 25:

“… he who had received the five talents came forward, bringing five talents more, saying, ‘Master, you delivered to me five talents; here I have made five talents more.’ His master said to him, ‘Well done, good and faithful servant. You have been faithful over a little; I will set you over much. Enter into the joy of your master.’ …

… ‘Master, I knew you to be a hard man, reaping where you did not sow, and gathering where you scattered no seed, so I was afraid, and I went and hid your talent in the ground. Here you have what is yours.’ But his master answered him, ‘You wicked and slothful servant!”

Don’t be like sloth-boy, add and analyze your web usage now – unless of course you want your web vending machine full of stuff nobody is buying.


ip2Country.pl – A fast little script to bulk ID IPs by country

Yes, I know, all the cool programmers use Python these days – but to this old-school programmer, PERL is to my antiquated PC what GWBasic was to my first computer at work back in 1983. That is, a nice little tool to get things done – like identifying a list of IP addresses by country.

Here’s the situation: I’ve been getting a lot of incoming spambots attempting to create accounts and post comments, both here on HYCW and on a few other sites I help manage. The Akismet spam filtering service catches all of it – but there’s still, at times, a huge draw on bandwidth, CPU and other resources when these bots hit.

So from time to time, I harvest the IP addresses from the thwarted ne’er-do-wells’ failed attempts via my user registration table and/or Apache logs, and then add them to the firewalls, .htaccess files and/or application IP ban lists of these various sites – except for those IPs coming in from countries where both the languages and laws give me the ability to email the abuse administrator.

Moreover, by excluding IPs from countries like the US, Canada, etc. from my ‘hit list,’ I don’t accidentally banish entire ISPs such as RoadRunner, ComCast or AOL when one of their users’ machines goes z0mbie due to some malware.

So the trick is then to take all the IPs from all the computers with which I’m associated, and drive the list through a simple application that will generate a list of IPs to ban – while excluding IPs whom I can (and do) contact via email at a later time.

Which is what inspired me to write ip2Country.pl – a fast little PERL script to bulk identify IPs from countries that don’t have IP abuse administrators who care, and generate a bash script to insert the entries into my apf firewall’s deny_hosts.rules file:

#!/usr/bin/perl
#
# by Dean Peters
# http://healyourchurchwebsite.com/
#
use IP::Country::Fast;
use Geography::Countries;
my $reg = IP::Country::Fast->new();

print "#/bin/sh\n";
print "# -- append firewall --\n";
while() {
        chomp;
        my $ip = $_;
        my $ip_cntry_abr = $reg->inet_atocc($ip);
        my $ip_cntry_nam = country $ip_cntry_abr;
        next if ($ip_cntry_abr =~ m/US|CA|GB|AU|NZ/i);
        print "/etc/apf/apf -d $ip {mad spammer from $ip_cntry_nam}\n";
}
print "# -- restart firewall --\n";
print "/etc/apf/apf -r\n";

__DATA__
121.1.29.246
121.15.200.148
193.238.213.70
196.20.7.74
210.22.83.146
217.30.244.226
222.124.200.212
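
To put it to work, paste your harvested IPs under __DATA__, then run the script and execute the shell script it emits – something like this, assuming a CPAN-equipped box (installing IP::Country provides IP::Country::Fast):

perl -MCPAN -e 'install IP::Country'
perl -MCPAN -e 'install Geography::Countries'
perl ip2Country.pl > append_bans.sh    # generate the apf deny commands
sh append_bans.sh                      # append the bans & restart apf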

Oh sure, I could be real fancy and write a version that takes command line arguments for individual IP addresses and/or a file of IP addresses … but the point here was to demonstrate how a crufty old tool like PERL can help bulk identify IPs by country so you too can add them to your firewalls, .htaccess file and/or application IP ban list.

That said, if you’ve got a Python or even PHP version of the same, leave a comment and share the goods.

Or you can just preemptively use the online services of Block a Country and be done with it.


Presenting the WordPress Plugin – ObamMath

As a card-carrying member of the vast math conspiracy, and in light of Obama’s statement made in Beaverton, Oregon last week that has been all but ignored by old media – I have decided to have some geeky fun of my own by providing the blogosphere the tools required to accurately quote, report and interpret quotes such as the one below:

“It is wonderful to be back in Oregon. Over the last 15 months, we’ve traveled to every corner of the United States. I’ve now been in 57 states? I think one left to go. Alaska and Hawaii, I was not allowed to go to even though I really wanted to visit, but my staff would not justify it.”

I know, I know, I shouldn’t pick sides on this blog, but this gaffe is just too fun to pass up (don’t worry, I’m sure McCain will offer equal opportunities for parody).

So it is with great programmatic pleasure that I present the WordPress Plugin: ObamMath, which once installed will:

  • convert all instances of the number 50 to ‘o50’ – that is, prefixed with the small letter ‘o’;
  • convert all instances of the word ‘fifty’ to ‘ofifty’, prefixed the same way;
  • in the true spirit of bi-partisanship, grammatically correct any pluralized reference to the ‘states of Alaska and Hawaii’.

Not only will such a plugin help the new media continue to report stories squelched by the old media, but it can and should help any tuckered-out politician execute on the following objectives:

  • correctly identify the number of stars on the new line of swag such as on his new patriotic lapel pin;
  • introduce super-stealthy hidden taxes where all tax rates are increased by an additional 16%;
  • immediately increase the Federal minimum wage by $1.14 to $8.29;
  • empower the native-born Honoluluan with yet another special interest group by combining the Hawaiian and Alaskan cultures into a new protected demographic known as ‘Hawaskans’;
  • clear the way for the statehood of the District of Columbia, American Samoa, and Puerto Rico with plenty of headroom left for Mexico, Jamaica, Cuba, French Ontario, the 13th District of Chicago; and
  • head off any hinky Hillary Al-Gore-ithms introduced by Michigan and Florida delegates.

Here is a link to download the plug-in, which, with some luck, will require augmentation and expansion as the Jr. Senator from Illinois continues to make ‘macaca moments’ available to the general public as demonstrated in the YouTube video below:

[youtube:http://www.youtube.com/watch?v=EpGH02DtIws&autoplay=0 350 350]

In the meantime, I am going to seek a Federal grant in case this application requires four years of ongoing maintenance.

Gad I love political satire.


The price of .org domain names to increase by 10%

Those putting off the purchase and/or long-term renewal of a domain name for their church and/or charity because the price was too high may want to re-think that strategy, as it appears that the Public Interest Registry (PIR), the registry for ‘.org’ domain names, will be raising its annual wholesale price for ‘.org’ domains by 10%.

In a May 1, 2008 communication with the Internet Corporation for Assigned Names and Numbers (ICANN), the PIR disclosed the increase, which would bring the annual fee to $6.75. Last year, PIR imposed a 2.5 percent fee increase to $6.15. PIR did not cite a reason in its letter – nor does it have to, as the PIR does not need ICANN’s approval for such increases.

What this means for the rest of us buying 1 or 2 domains at the retail level is likely a similar price increase – though it’ll be interesting to see how many domain name registrars keep the increase to 10% of their current price, as opposed to marking up a bit more, as I suspect some will.

Since ‘.org’ is the Internet’s 6th most popular domain name suffix, this also likely means an increase in renewal fees for the nearly 7 million domain names already registered as ‘.org’ – provided of course one’s domain name service hasn’t beaten you to the punch.

As for me, considering there is now very little price disparity between .com and .org, and taking into consideration the popularity and the ease with which others remember the .com suffix, I’m thinking I might put my bucks into a .com domain if I had to choose between the two …

… though I always STRONGLY recommend that any church and/or charity purchase both the .org and .com domain names, for as long a period as they can afford.

More info about the recent PIR rate increase can be found below:


Is Church marketing dead? Nope, just stuck on stupid!

There’s no getting around it: despite the efforts of many to teach, rebuke, correct & train in righteous web design, there still exists a great cloud of witlessness when it comes to the Church’s presence online. A fact painfully corroborated by the persistent body of ‘kitsch‘ out there that distracts, annoys and otherwise drives away people seeking and/or serving the Lord.

WHOOOOSH – flame on!

I should know, as I’ve been engaged in mental combat with these forces of evil web design idioms since the turn of the new Millennium – first in collaborating with Vincent Flanders on his second book, an act which led to the eventual May 2002 establishment of Heal Your Church Website.

But enough of my credentials, lest this post start sounding like a Pauline epistle – though I should probably mention that I do the software-as-a-service thing for a living … but that’s plenty about me, as others equally qualified have since dashed headlong into the breach.

This would include Cory Miller and James Dalman at Church Communications Pro (CCP), the latter of whom begged the all-important question: “Is Church marketing dead?” Specifically, pondering aloud:

There is something going on but I quite can’t put my finger on it. It’s a gut feeling that’s right more often than not. I think the church landscape is drastically changing and that church as we know it now is going to evolve (no, I am not supporting Darwin) into something much different. It’s just a hypothesis or idea I’m working on, whatever that’s worth.

To which my response is: “James, let me save you a few steps. Church marketing isn’t dead, it’s just stuck on stupid!

If that weren’t the case, why would sites and services such as CCP, Church Marketing Sucks, For God’s Sake Shut Up, and a handful of others continue to, and with apologies to Vincent Flanders, offer weekly lessons in good church marketing by looking at examples of bad church marketing?

I mean, how many examples of church websites adorned with the cliché gold lamé animated gif of a spinning cross that screams “everything I know about website design I learned from Strong Bad!” do we need to ‘Fisk’ to make our point?

Or on a more serious note, how many of us have seen all too many unique local churches dive through the porpoise-driven hoops of Warren-ology just to become different like everyone else?

And that’s really my point: it’s not that church marketing is dead, it’s that we’re stuck on driving down the wide and easy path to church marketing, rather than seeking out a difficult path that includes:

  • Studying Scripture to see where marketing and evangelism intersect;
  • Teaching lay staff and church on what real marketing is and how it works;
  • Understanding that the Church didn’t start in 2000, but rather 2000 years ago;
  • Putting aside the need to agree 100% with everyone 100% of the time;
  • Initiating marketing teams, as opposed to having the message controlled at a single point;
  • Daring to be different without amputating one’s self from the Body.

Look, we have centuries of beautiful sacred songs, art and literature as the result of the artistry that was once the Church’s … why can’t we have the same for church marketing in the 21st century?”

Or put another way, Franky Schaeffer was right when he asserted that Christians are no longer influencing society through various forms of media, but are instead influenced BY it. A neat trick when you think Francis Schaeffer’s son warned us about this as far back as 1981!

WHOOOOSH – flame off!

Oh, and before I forget, to my Eastern Orthodox friends … a belated Χριστός Ανέστη!


5 Things Eight Belles and Church Webmasters have in common

Last night, while listening to various speculations as to why the horse that ‘placed’ at the Kentucky Derby was put down, my mind drifted to 5 things Eight Belles has in common with many church webmasters I know, including:

  1. Both endure through arduous training regimes;
  2. Both have blinders, bridles, and bits forced upon them;
  3. Both are ridden hard, often with tightly held reins;
  4. Both get whipped as they approach the finish line; and
  5. Both are often “put down” instead of retired or rehabilitated … though a good race horse at least has the hope of being put out to stud if they can survive without injury.

Brutal huh?

Yeah, sorry with the negative waves, Moriarty, but I think what we’re dealing with has its basis in a somewhat larger quartet of problems that afflict the Christian church on the whole, they being:

  • turf wars
  • shooting the wounded
  • treating those who leave like traitors
  • non-Biblical power structures

Don’t get me wrong, any organization of people is going to suffer the effects of the sin nature, including the Church, until Christ returns.

Still, my heart goes out to a good number of church webmasters who have privately emailed me with their stories of how shabbily they were treated somewhere along the process of designing, developing, deploying and/or maintaining a church website.

Heck, I’ve gone through it a bit too – having been verbally tongue-lashed by a church staff member, publicly, in the kitchen of the church (next to the sanctuary) before my last communion service there, followed up by a webmaster who couldn’t wait to “upgrade” the site from MovableType to FrontPage.

That aside, I tend to want to keep such communications private as I minister to a hurting church webmaster. I have though, from time to time – either with the permission and/or at the request of the pummeled programmer – published their tale, not in the spirit of bitterness nor gossip, but as experiences from which we can all observe, learn and hopefully avoid repeating in the future.

Here are a couple of links to such posts:

As I alluded to earlier, much of this has to do with our struggle with our sin nature, much of it manifesting itself in what is commonly referred to these days as ‘spiritual abuse.’ There’s plenty on this topic all over the internet, but I am partial to the excellent writing on this malady that is found at Watchman.org:

This isn’t to say we shouldn’t engage ourselves in the process of providing our valuable skills to assist a church and/or charity in creating and maintaining an excellent web presence – only that we do so keeping in mind the following words of our Savior from Matthew 10:16:

“Behold, I am sending you out as sheep in the midst of wolves, so be wise as serpents and innocent as doves.”

What about you, what are your thoughts, experiences, and/or opinion on this topic? Leave a comment, in love.


Making a Ready Defense by Planning for Failure

Those who fail to plan, plan to fail. While this aphorism is well worn, it is also very true. Here are some simple things you can do with mysqldump, crontab, tar/gzip and a little contingency planning to ensure you don’t lose your sanity when your server crashes upon the shoals of virtual disaster.

Check out these recent tales of real-life virtual horror as told by a variety of news sources from around the globe:

  • The outgoing Italian government posted the entire population’s tax returns on the internet causing a mad scramble which crashed the system.
  • Obama supporters were in for a surprise Monday when an attacker executed code on Barack Obama’s Presidential campaign Website that redirected users to Democratic rival Hillary Clinton’s campaign site.
  • According to police reports, a computer was stolen from the ADT Home Security branch on Sunbeam Center Drive sometime between April 12th and April 13th.
  • Tens of thousands of people were feeling short changed last night after a massive system failure wiped out all the Northern Bank’s ATMs.
  • A statewide computer problem again hobbled the state’s digital driver license system on Friday.

The point is, hardware failures, power outages, software bugs, stolen computers, cross-site scripting, SQL injections, and/or zombie-induced denial of service attacks can all turn your church and/or charity website into a tub of techno-mush quicker than you can recurse a binary tree.

The only real defense against such failures is to plan for them – anticipating them in three ways:

  • backing up your data
  • moving your backed-up data off site
  • having and practicing how to restore backed-up data

Here’s a very simple snippet from an oldie but goldie article entitled “How to backup your MySQL tables and data every night using a bash script and cron:”

#!/bin/sh
# backup data
mysqldump -uroot -ppwd --opt db1 > /sqldata/db1.sql
mysqldump -uroot -ppwd --opt db2 > /sqldata/db2.sql
# zip up data
cd /sqldata/
tar -zcvf sqldata.tgz *.sql
# email data off-site
cd /scripts/
perl emailsql.cgi

The article also displays a script on how to email the data off-site – not a bad deal if your data is small – such backups being just as simple to restore with this dynamic command-line duo of directives:

tar -zxvf sqldata.tgz
mysql -uroot -ppwd db1 < db1.sql

Things get trickier when you have tons of data, in which case it may suit one’s restoration plan better to back up and restore a database by individual tables. Here is a set of articles that describes how to do this, including some script examples you can modify to suit your needs (see also the sketch below):
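
Here’s a minimal sketch of the per-table approach, reusing the placeholder credentials and paths from the snippet above:

#!/bin/sh
# dump each table of db1 into its own file
for tbl in `mysql -uroot -ppwd -N -e 'show tables' db1`; do
    mysqldump -uroot -ppwd --opt db1 $tbl > /sqldata/db1.$tbl.sql
done

Restoring a single damaged table is then just mysql -uroot -ppwd db1 < /sqldata/db1.sometable.sql – no need to reload the whole database.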

Either way, it is then just a matter of putting the shell script on a timer – or, in the vernacular of crontab (the five fields being minute, hour, day-of-month, month and day-of-week):

1 3 * * * /usr/home/mysite.com/prvt/tbak.sh > /usr/home/logs/tbak.log

If either of these shell-script, bash-based approaches seems too complex, then perhaps one of the control-panel, web-based methods offered by UpStartBlogger’s post “8 MySQL Backup Strategies for WordPress Bloggers (And Others)” will do the trick.

Here are some other related articles that might help; the last two include automagic date stamping of the backup files:
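
In the meantime, date stamping is a one-line tweak to the backup snippet above – a sketch using the same placeholder credentials:

mysqldump -uroot -ppwd --opt db1 > /sqldata/db1.`date +%Y-%m-%d`.sql

which leaves you with files like /sqldata/db1.2008-05-29.sql that are easy to prune by age.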

The bottom line is this: just as Peter implores us to make a ready defense in 1 Peter 3:15, so I’m asking you to always be ready to make a defense against anything that endangers the data on your system – so you’re not found tearfully disheveled, cowering in a corner meek and fearful, mumbling something about how you should have planned for such failures.

You’ll be glad you did – probably at the most inopportune time possible.


10 Principles Of Good Church Website Design

Want to make sure your church website follows the principles of good church website design? Then stop coding that rotating Flash banner you think is cool and start learning how user-centric design has become a standard approach for successful websites with high conversion rates.

And in order to use the user-centric designs that make for a good church website experience, we first need to understand how users interact with websites, how they think and what the basic patterns of user behavior are. A good place to learn this is a recent Smashing Magazine article entitled “10 Principles Of Effective Web Design.”

In this article, the author asks and answers the question “How do users think?” by correctly asserting:

Basically, users’ habits on the Web aren’t that different from customers’ habits in a [bricks-and-mortar] store …

… Most users search for something interesting (or useful) and clickable; as soon as some promising candidates are found, users click. If the new page doesn’t meet users’ expectations, the Back button is clicked and the search process is continued.

The reasoning behind this point comes down to simple common sense:

  • Users appreciate quality and credibility;
  • Users don’t read, they scan;
  • Web users are impatient and insist on instant gratification;
  • Users don’t make optimal choices;
  • Users follow their intuition; and
  • Users want to have control

The article expounds on each of the above points not only in word but in deed, citing eye-tracking scan paths of sample pages. The article doesn’t stop there, enumerating in detail 10 principles of good web design that we can all apply to our church and/or charity websites. The principles being:

  1. Don’t make users think
  2. Don’t squander users’ patience
  3. Manage to focus users’ attention
  4. Strive for feature exposure
  5. Make use of effective writing
  6. Strive for simplicity
  7. Don’t be afraid of the white space
  8. Communicate effectively with a “visible language”
  9. Conventions are our friends
  10. Test early, test often

I’ve made similar points, only not as concisely and well outlined as the aforementioned article.

So why not read it, then come back here for some discussion and comments on how the above points apply to your particular circumstance.
