Linux: Search logs using bash tools

July 26, 2016

Sometimes we need to find specific entries in an application's vast logs: often 10 or 20 rotated logs of hundreds of MB each.
For a quick search, without using a specialized log viewer, we can use the very powerful bash text-processing commands.

Problem:
We have 10 log files of 100 MB each and we need to find the names of all the files that are processed in a certain stage.
We know that a log entry reporting the processing looks like this (note this is a WebSphere Application Server log):
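A hypothetical entry of that shape; the timestamp, thread id, and file name below are invented for illustration, and only the quoted message text comes from the logs discussed in this article:

```
[7/26/16 10:15:02:123 EEST] 0000004c SystemOut     O MessageInputHandlerImpl: File processor started processing of message: BATCH_20160726_001.dat
```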

We want to find a list of all the files from all the logs.

STEP 1: Find all the entries in all the files that match the above line
The easy way is to use find together with grep.
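A runnable sketch of the search. The demo_logs directory and file names here are invented for illustration; the grep pattern is the one quoted in this article, and a plain `grep -r '…' .` would do the same job:

```shell
# Demo setup: a tiny "rotated logs" directory (names are hypothetical)
mkdir -p demo_logs
printf '%s\n' \
  'MessageInputHandlerImpl: File processor started processing of message: FILE_A.xml' \
  'an unrelated log line' \
  > demo_logs/SystemOut_1.log
printf '%s\n' \
  'MessageInputHandlerImpl: File processor started processing of message: FILE_B.xml' \
  > demo_logs/SystemOut_2.log

# STEP 1: list every matching entry from every file under the directory
find demo_logs -type f -exec \
  grep 'MessageInputHandlerImpl: File processor started processing of message:' {} +
```

With more than one file on its command line, grep prefixes each match with the file it came from, so the output pairs each entry with its log file.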

The result is a list of all the entries from all files.

The above will look in all the files under the current directory (.) and will print all the lines containing ‘MessageInputHandlerImpl: File processor started processing of message:’

STEP 2: Extract only the information we need, splitting on a separator.

We use cut -d, which works as a string tokenizer. We change the expression to:
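A toy illustration of the extraction: the article's logs carry the file name as the 16th space-separated token, so the demo line below is invented accordingly (t01..t15 are filler tokens):

```shell
# A stand-in for one matched log entry: its 16th space-separated
# token is the file name
line='t01 t02 t03 t04 t05 t06 t07 t08 t09 t10 t11 t12 t13 t14 t15 BATCH_20160726_001.dat'

# STEP 2: keep only field 16, splitting on the space delimiter
printf '%s\n' "$line" | cut -d ' ' -f 16   # prints BATCH_20160726_001.dat
```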

The result will be a list of sub-strings of the entries found at Step 1, each containing only the 16th token, where the token separator is ‘ ‘ (a space).

We can go further and, for example, determine all the business dates on which the files were generated. We know that the second token of the file name is the business date. We change the command by adding another tokenizer command to the pipe, followed by sort (a command that sorts a text stream or file forwards or backwards, or according to various keys or character positions) and then uniq (a filter that removes duplicate lines from a sorted file).
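A sketch of the business-date extraction on invented file names, assuming the tokens of the name are separated by ‘_’ (so the business date is field 2 of the name):

```shell
# Invented file names: the second '_'-separated token is the business date
printf '%s\n' \
  'BATCH_20160726_001.dat' \
  'BATCH_20160726_002.dat' \
  'BATCH_20160725_001.dat' \
  | cut -d '_' -f 2 | sort | uniq
# prints:
# 20160725
# 20160726
```

In a real session this second cut, plus sort and uniq, would simply be appended to the Step 2 pipeline.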

The result will display a list of unique entries, one for each business date:

For reference and more details see the Advanced Bash-Scripting Guide, Chapter 16: External Filters, Programs and Commands.
