Thanks everyone. I just ended up writing my own thing. All it does right
now is take a list of URLs and an optional string to match, and output a
nice HTML file.
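
For anyone who wants to roll their own, a minimal sketch of that kind of
checker with LWP follows. It is illustrative only, not the actual script;
the interface here (URLs one per line on stdin, an optional match string as
the first argument) and the HTML layout are just assumptions.

  #!/usr/bin/perl
  # site-check.pl - illustrative sketch, not the actual script
  use strict;
  use warnings;
  use LWP::UserAgent;

  my $match = shift @ARGV;                      # optional string that must appear in the body
  my $ua    = LWP::UserAgent->new(timeout => 10);

  print "<html><body><table border=\"1\">\n";
  print "<tr><th>URL</th><th>Result</th></tr>\n";

  while (my $url = <STDIN>) {                   # one URL per line
      chomp $url;
      next unless $url;
      my $resp   = $ua->get($url);
      my $result = $resp->status_line;          # e.g. "200 OK", or LWP's internal "500 Can't connect ..."
      if ($resp->is_success && defined $match && index($resp->decoded_content, $match) < 0) {
          $result .= " (match string not found)";
      }
      print "<tr><td>$url</td><td>$result</td></tr>\n";
  }

  print "</table></body></html>\n";

Something like  ./site-check.pl "Welcome" < urls.txt > status.html  would
then give a page you can open in a browser or regenerate from cron.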


On Tue, 23 Sep 2014, canito at dalan.us wrote:

> I was just reading about this monitoring tool today, which seems
> interesting.
>
> Not sure if it has been mentioned or not, but their site is below.
>
> https://collectd.org/wiki/index.php/First_steps
>
> Thanks.
> SDA
>
> Quoting Justin Krejci <jus at krytosvirus.com>:
>
>> I wrote something like this years ago with perl LWP. It's actually quite
>> simple. I ran it as a daemon and had a number of actions it could perform
>> based on its previous results (i.e. up before/up now, up before/down now,
>> down before/down now, and down before/up now). Then I ended up branching
>> that off into some simple CLI tools (later CGI'ed as well),
>> http-header.pl and http-getter.pl, which make HEAD and GET requests
>> respectively and are occasionally very useful for troubleshooting. I even
>> added the Heartbleed check for a bit before the Chromebleed plugin was
>> released.
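
The HEAD-vs-GET split described above is a one-method difference in LWP. A
rough sketch of the general shape, purely illustrative rather than the
actual http-header.pl / http-getter.pl:

  #!/usr/bin/perl
  # http-check.pl - hypothetical combined HEAD/GET helper, illustrative only
  use strict;
  use warnings;
  use LWP::UserAgent;

  my ($mode, $url) = @ARGV;                     # "head" or "get", then the URL
  die "usage: $0 head|get URL\n" unless $mode && $url;

  my $ua   = LWP::UserAgent->new(timeout => 10);
  my $resp = $mode eq 'head' ? $ua->head($url) : $ua->get($url);

  print $resp->status_line, "\n";               # e.g. "200 OK"
  print $resp->headers_as_string, "\n";         # the response headers
  print $resp->decoded_content if $mode eq 'get';

The daemon version would just loop over something like this, remember the
previous result per URL, and branch on the four up/down transitions.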
>> 
>> Something relatively straightforward like this I like to do myself, so I
>> can make fine tweaks to match my needs exactly instead of trying to string
>> multiple different scripts/tools together and settling for "good enough".
>> 
>> Just my $.02
>> 
>> 
>> 
>> 
>> -------- Original message --------
>> From: tclug at freakzilla.com
>> Date: 09/22/2014 10:07 PM (GMT-06:00)
>> To: TCLUG <tclug-list at mn-linux.org>
>> Subject: Re: [tclug-list] Simple Website Monitoring Tool
>>
>> Yeah, I was going to use wget, but then I figured I may as well do it
>> "right" and use perl LWP or something. There are lots of options (:
>> 
>> On Mon, 22 Sep 2014, Brian Wall wrote:
>> 
>>> On Mon, Sep 22, 2014 at 8:01 PM, <tclug at freakzilla.com> wrote:
>>>> 
>>>> Ok, before I go write one myself, does anyone know of a simple website
>>>> uptime monitoring tool? Yeah, I can use Nagios but that's waaayyy 
>>>> overdone and waaaaayyy overcomplicated.
>>> 
>>> You could probably use curl.  Feed it a URL and then parse the results
>>> to determine the status code (200, 404, 500, etc.).
>>> 
>>> Something to get you started:
>>> http://osric.com/chris/accidental-developer/2011/09/monitoring-web-server-status-with-a-shell-script/
>>> 
>>> Brian
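
The same status-only check is also just a few lines if you stay with the
LWP approach mentioned earlier in the thread instead of curl; a
hypothetical sketch, not taken from the article above:

  #!/usr/bin/perl
  # status-check.pl - hypothetical name, illustrative only
  use strict;
  use warnings;
  use LWP::UserAgent;

  my $url  = shift @ARGV or die "usage: $0 URL\n";
  my $resp = LWP::UserAgent->new(timeout => 10)->get($url);
  print $resp->code, " ", $url, "\n";           # 200, 404, 500, etc.
  exit($resp->is_success ? 0 : 1);              # exit status usable from cron or a shell wrapper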
>
>
>
> _______________________________________________
> TCLUG Mailing List - Minneapolis/St. Paul, Minnesota
> tclug-list at mn-linux.org
> http://mailman.mn-linux.org/mailman/listinfo/tclug-list