Use curl to get connect and download times for HTTP requests

To debug a weird HTTP connection issue involving DNS lookups, I had to collect some metrics before doing any further configuration of my network equipment (or escalating to our admins).

Let’s determine where exactly the HTTP request loses time. Without any additional monitoring tools, we can issue a curl command via cron to fetch that information every few minutes and store the measurements in a CSV file for further inspection.

I’m interested in the time it takes to resolve the IP address of the given URL (DNS lookup), followed by some intermediate times (connect to host, first byte, total).
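For a quick one-off check, curl’s -w option can print these timing variables directly. A minimal sketch — https://example.com/ is just a placeholder for the site under test:

```shell
# print the relevant timing variables for a single request
# (https://example.com/ is a placeholder -- use your own URL)
curl -s -o /dev/null \
  -w "dns:     %{time_namelookup}s\nconnect: %{time_connect}s\nttfb:    %{time_starttransfer}s\ntotal:   %{time_total}s\n" \
  https://example.com/
```

Here time_namelookup covers the DNS resolution, time_connect the TCP handshake, and time_starttransfer the time until the first byte of the response arrives.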

Easy enough with a short bash script.

First the script itself. It will be executed via cron. It issues an HTTP GET to a given site and measures the timestamps for some metrics. Those metrics then get appended to a CSV file, which will be created if it does not exist.

Check your paths before running this on your machine.



#!/usr/bin/env bash

# adjust these for your machine -- both values are placeholders
LOGFILE="/path/to/timings.csv"
URL="https://example.com/"

# check if logfile exists, if not create it with the csv header
if [ ! -f "${LOGFILE}" ]; then
    echo "date, time, dns, connect, pretransfer, starttransfer, total" > "${LOGFILE}"
fi

DATE=$(date +"%Y-%m-%d")
TIME=$(date +"%H:%M:%S")

# -m 60: abort after 60 seconds, -s: silent, -o /dev/null: discard the response body
CURL_OUTPUT=$(/usr/bin/curl -m 60 -s -o /dev/null -w "%{time_namelookup}, %{time_connect}, %{time_pretransfer}, %{time_starttransfer}, %{time_total}" "${URL}")

# log metrics into logfile, append that stuff
echo "${DATE}, ${TIME}, ${CURL_OUTPUT}" >> "${LOGFILE}"

Make that script executable:

chmod +x

Register a cronjob that runs our script every 5 minutes:

crontab -e

# m h  dom mon dow   command
*/5  *  *  *  *   /home/croessler/tmp/ > /dev/null 2>&1
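Once a few measurements have accumulated, the CSV can be inspected with standard tools. A small sketch using awk to compute the average DNS lookup and total request times — timings.csv stands in for whatever LOGFILE points to:

```shell
# average the dns (column 3) and total (column 7) times, skipping the csv header
awk -F', ' 'NR > 1 { dns += $3; total += $7; n++ }
            END { if (n) printf "avg dns: %.3fs  avg total: %.3fs\n", dns/n, total/n }' timings.csv
```

If the DNS average dominates the total, the problem is most likely in name resolution rather than in the connection or transfer itself.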