After I moved to WordPress Multisite recently, I had to watch the Apache error log file closely for a couple of days to make sure that there weren’t any configuration mismatches.
Initially I used to run the tail command to view the errors, but I quickly realized that when one error occurs very frequently, it becomes extremely difficult to spot the less frequent ones.
Command
After some fiddling with the awk and sort commands, I came up with the following one-liner, which prints the unique errors along with their counts.
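Pieced together from the steps explained below, the pipeline looks like this. The sample log contents are made up so it can be tried as-is; point the awk command at your real log instead (for example /var/log/apache2/error.log). The separator class is spelled [][] here, the POSIX-portable equivalent of the [\[\]] used in the explanation.

```shell
# Create a small sample in the Apache 2.2-style error-log format
# (illustrative only; use your real error log in practice).
cat > sample-error.log <<'EOF'
[Mon Oct 10 10:00:01 2011] [error] [client 1.2.3.4] File does not exist: /var/www/favicon.ico
[Mon Oct 10 10:00:02 2011] [notice] Apache/2.2.20 configured
[Mon Oct 10 10:00:03 2011] [error] [client 1.2.3.4] File does not exist: /var/www/favicon.ico
[Mon Oct 10 10:00:04 2011] [error] [client 5.6.7.8] File does not exist: /var/www/robots.txt
EOF

# Print each distinct error message with its count, most frequent first.
awk -F'[][]' '$4 == "error" {print $7}' sample-error.log | sort | uniq -c | sort -rn
```

With the sample above, the favicon.ico error is reported with a count of 2 and the robots.txt error with a count of 1.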
Explanation
The awk script is executed for each line of the error log file.
Field Separator
The -F option specifies the field separator. If you look at a single line in the Apache error log file, you will notice that each field is enclosed in [ and ].
awk allows us to specify multiple field separators by enclosing them in []. That’s what we are doing with -F[\[\]]: both [ and ] are treated as separators.
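To see how the fields end up numbered, you can print them one by one for a single made-up log line (using the POSIX-portable [][] spelling of the same separator class):

```shell
# Split one Apache 2.2-style error line on [ and ] and show each field.
echo '[Mon Oct 10 10:00:01 2011] [error] [client 1.2.3.4] File does not exist: /var/www/favicon.ico' \
  | awk -F'[][]' '{for (i = 1; i <= NF; i++) printf "$%d = %s\n", i, $i}'
```

Note that $1 is empty (it is the text before the first [), the fields between ] and [ are just spaces, $4 is the word error, and $7 is everything after the last ].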
Filtering out lines that contain error
After splitting the fields, we need to pick out the lines that have the term error in the fourth field. That’s what the next part of the command does: $4 == "error" {print $7} prints the seventh field, which holds the actual error message, whenever the fourth field equals the string “error”.
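A quick way to see the filter in action is to feed it two made-up lines, one error and one notice; only the error line’s message comes out:

```shell
# Only lines whose fourth field is exactly "error" survive the filter.
printf '%s\n' \
  '[Mon Oct 10 10:00:01 2011] [error] [client 1.2.3.4] File does not exist: /var/www/favicon.ico' \
  '[Mon Oct 10 10:00:02 2011] [notice] Apache/2.2.20 configured' \
  | awk -F'[][]' '$4 == "error" {print $7}'
```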
Sorting the error messages
The next step is to sort all the error messages. This is done by piping the output to the sort command.
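Sorting groups identical messages together so that duplicates become adjacent, which the next step depends on. A tiny illustration with made-up messages:

```shell
# After sort, the two identical "err B" messages sit next to each other.
printf '%s\n' 'err B' 'err A' 'err B' | sort
```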
Finding unique error messages and counting them
The next step is to find the unique error messages and count them. This is done by piping the output to the uniq command; the -c flag prepends each message with its count. Note that uniq collapses only adjacent duplicates, which is why the messages were sorted first.
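The counting step can be seen on its own with a few made-up tokens:

```shell
# sort makes duplicates adjacent; uniq -c then collapses and counts them.
printf '%s\n' b a b a a | sort | uniq -c
```

This prints two lines: a with a count of 3 and b with a count of 2.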
Sorting the messages by frequency
The last step is to sort the messages by frequency and print them. This is done by the final sort command. The -n flag sorts numerically, and the -r flag reverses the order so that the most frequent errors appear first.
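Since uniq -c puts the count at the start of each line, a numeric reverse sort on those lines yields the final frequency ranking. A small demonstration with made-up counted lines:

```shell
# sort -rn orders counted lines numerically, largest count first
# (a plain lexical sort would wrongly place 10 before 2).
printf '%s\n' '  2 foo' ' 10 bar' '  1 baz' | sort -rn
```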