I operate a bunch of different sites and have done for many years now.
Some of them get quite a lot of traffic and require a high level of uptime.
To monitor the uptime of these sites, I use various tools to alert me when a site is unreachable, when a specific resource is returning an unexpected status code, or when one of the database-heavy pages starts taking a while to respond to requests.
I have found that the last issue, slow responses from database-heavy pages, has become the most problematic.
I have also found that, when tweaking database queries to speed things up, timing an end-to-end request gives a much truer picture than simply testing the SQL directly.
Creating a simple script
To do this, I wrote a super simple Python script that shows me how quickly a page is responding.
Make sure you have `requests` installed. You can do this by running `pip install requests` or `pip3 install requests`.
Let’s try our script out!
This really couldn’t be easier, and I get a view of peaks and troughs:
It’s quite primitive, but that’s just the point. You don’t always need to create some elaborate and costly solution when something simple will suffice.
Let’s make it better visually
We all love staring at the command-line, but I also love staring at graphs!
So why not throw a bit of Matplotlib into the mix?
To get the `y` coordinates of our graph, we also need to adjust the `test` function a bit to return the current time as well as the time the request took to complete.
Then we loop through each request and append its results to our temporary lists to build our graph.
To run this `python3` script, you will need to have installed `requests` as well as `matplotlib`.
Once we run this new and updated script, it will produce a graph of response times over time, even when running the script from our command-line.
This gives us a clearer picture of the overall performance, and it was dead simple too.