Unix & Linux: Health check of web page using curl (8 Solutions!!)

The Question: I'd like to do a health check of a service by calling a specific URL on it.
Feels like the simplest solution would be to use cron to do the check every
minute or so. In case of errors, cron sends me an email.
I tried using curl for this but I can't get it to output messages only on
errors. If I redirect its output to /dev/null, it still prints a progress report:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  5559  100  5559    0     0   100k      0 --:--:-- --:--:-- --:--:--  106k
I tried looking through the curl options but I just can't find anything to suit
the situation where you want it to be silent on success but make noise on
errors.
Is there a way to make curl do what I want or is there some other tool I should
be looking at?
Solutions: Please watch the whole video to see all solutions, presented in order of how many people found them helpful.
== This solution helped 43 people ==
What about -sSf? From the man pages:
-s/--silent
       Silent or quiet mode. Do not show progress meter or error messages.
       Makes Curl mute.

-S/--show-error
       When used with -s it makes curl show an error message if it fails.

-f/--fail
       (HTTP) Fail silently (no output at all) on server errors. This is
       mostly done to better enable scripts etc to better deal with failed
       attempts. In normal cases when a HTTP server fails to deliver a
       document, it returns an HTML document stating so (which often also
       describes why and more). This flag will prevent curl from outputting
       that and return error 22.

       This method is not fail-safe and there are occasions where
       non-successful response codes will slip through, especially when
       authentication is involved (response codes 401 and 407).
For example:
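A minimal sketch of the three flags combined (example.org stands in for the monitored URL, which the scrape did not preserve):

```shell
# Quiet on success; prints an error message and exits non-zero on
# failure, so a cron job only produces mail when something is wrong.
# example.org is a placeholder for the service's health-check URL.
curl -sSf http://example.org/ > /dev/null
```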
== This solution helped 14 people ==
I think the simplest way to check whether the site is alive is
the following method:
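A minimal sketch of such a status-line check, assuming a plain header request (example.org is a placeholder):

```shell
# Fetch only the response headers (-I) with the progress meter
# suppressed (-s), then keep just the status line.
curl -Is http://example.org/ | head -n 1
```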
This will return HTTP/1.1 200 OK. If the response doesn't match that,
raise an alert.
== This solution helped 9 people ==
You need the -s flag (silent) and the -f flag (fail with an exit code on
error), and you can use the -o flag to redirect the output:
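A hedged sketch of such a cron line (the URL and the message wording are placeholders, since the original command was not captured):

```shell
# -s hides the progress meter, -f exits non-zero on HTTP errors,
# -o /dev/null discards the body. cron mails any output the command
# produces, so this prints only when the check fails.
curl -sf -o /dev/null http://example.org/ || echo "example.org is DOWN"
```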
This is just a basic example of a simple cron script. Normally, you would
want to receive only one mail when a website goes down.
== This solution helped 1 person ==
I was recently asked to come up with something that would act more like a
sophisticated heartbeat.
for i in `curl -s "$url" | grep -o 'href="http[^"]*"' | cut -d'"' -f2 | sort | uniq` ; do curl -s -I "$i" 2>/dev/null | head -n 1 | cut -d' ' -f2; done
Or, expanded for a little more readability,
url='http://example.org'   # placeholder for the site being monitored
for i in $(curl -s "$url" | grep -o 'href="http[^"]*"' | cut -d'"' -f2 | sort | uniq)
do
    curl -s -I "$i" 2>/dev/null | head -n 1 | cut -d' ' -f2
done
What I did was curl a website, parse out all the links from the HTML, and then
curl each of those links, outputting only the status code. Then I would search
for HTTP status codes >= 400 to find errors.
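That final filtering step can be sketched by piping the loop's status codes through grep (the printf input here is illustrative):

```shell
# Keep only 4xx/5xx status codes; any output means a failing link.
printf '200\n301\n404\n500\n' | grep -E '^[45][0-9]{2}$'
# prints:
# 404
# 500
```

Appending `| grep -E '^[45][0-9]{2}$'` to the loop itself would print only the codes of failing links.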