I’ve been revisiting the uses of this fantastic tool and I have renewed excitement about it. If you find yourself routinely churning through log files looking for specific errors or events, this tool can save you an enormous amount of time.
Here is an example of how you can easily parse a text file:
Scenario 1: Parsing large text files for a specific text
A little background on the problem
Your customer experiences an “Access Denied” error while performing some operation. You ask the customer to run another brilliant tool, Filemon (http://technet.microsoft.com/en-us/sysinternals/bb896642.aspx), and reproduce the issue; if the problem is with resource ACLs, Filemon will catch the error. You then ask the customer to send you the saved Filemon log file. Here comes the unfortunate part. You get the file (say, Filemon.log) but find that it is huge (Filemon logs a lot of data!). Notepad appears to hang and is painfully slow at finding the “Access Denied” lines in the log, and Microsoft Office Excel refuses to open the file at all. Now what?
Answer: Open the Log Parser command window, and use the following command:
LOGPARSER "Select Text from C:\Filemon.log where Text like '%Access Denied%'" -i:TEXTLINE -q:Off
Here we are telling Log Parser to parse each line (Text) of the given file (C:\Filemon.log) and return the lines that contain ‘Access Denied’. The -i:TEXTLINE command-line switch specifies the input format, and the -q:Off switch turns quiet mode off, making the output verbose (-q[:ON|OFF] controls quiet mode). If you turn quiet mode on, the statistics and the field name (Text) shown in the output below will be absent.
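Conceptually, the TEXTLINE input format just streams the file one line at a time and applies the WHERE clause to each line. A minimal Python sketch of that idea (an illustrative stand-in, not Log Parser’s actual engine; the function name is mine):

```python
def grep_lines(path, needle):
    # Stream a large log file line by line and print only the lines
    # containing the given substring -- the same effect as
    # WHERE Text LIKE '%needle%' over the TEXTLINE input format.
    matches = 0
    with open(path, "r", errors="replace") as f:
        for line in f:
            if needle in line:
                print(line.rstrip("\n"))
                matches += 1
    return matches
```

Because the file is read line by line rather than loaded whole, even a huge Filemon log stays manageable, which is exactly why this approach beats Notepad or Excel here.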
7447 1:49:24 PM explorer.exe:1200 DIRECTORY C:\ Access Denied
Elements processed: 640444
Elements output: 1
Execution time: 12.75 seconds
How do you avoid pressing ENTER multiple times when a query returns more than 10 records?
Answer: Use the -rtp:-1 parameter in your queries!
This parameter is also necessary when you want to redirect the output to a file. When writing to STDOUT, Log Parser displays output records in batches whose size equals the value of this parameter; after each batch, it prompts you to press a key before displaying the next one. Specifying “-1” for this parameter disables batching altogether!
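The batching behavior that -rtp controls can be pictured with a small sketch (a toy model of the behavior described above, not Log Parser’s own output code):

```python
def batch_rows(rows, rows_to_print=10):
    # Group output rows into batches of rows_to_print, mirroring how
    # Log Parser pauses for a keypress after each batch when writing
    # to STDOUT. A value of -1 (i.e. -rtp:-1) disables batching:
    # every row goes out in one uninterrupted batch.
    if rows_to_print == -1:
        return [list(rows)]
    rows = list(rows)
    return [rows[i:i + rows_to_print]
            for i in range(0, len(rows), rows_to_print)]
```

With the default of 10, a 25-row result becomes three batches (and two keypress prompts); with -1 it is a single batch, which is what you want when redirecting to a file.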
Using query files
Another, cleaner way to achieve the same result is to create a query file. You can then easily tweak the query file and run it from the Log Parser command line. Beyond that, you can build a GUI to your taste that loads a saved SQL query and runs it by using the Log Parser tool.
If you want to achieve the same effect as in Scenario 1 with a query file, you can run the following command:
LOGPARSER -i:TEXTLINE file:C:\LPQ\SearchAnyTextFile.sql -q:off
C:\LPQ\SearchAnyTextFile.sql contains the following information:
Note: Create a folder named LPQ in your C:\ folder to use the samples shown in this column.
SELECT Text AS LineFromFile FROM C:\Filemon.log WHERE Text LIKE '%Access Denied%'
As you can see, the query now looks much cleaner and makes more sense. This way you can also create larger and more complex queries, and everything will fit on your command line because you pass the .SQL file instead of the whole query. You couldn’t fit more than 260 characters on the command line anyway!
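To see why LIKE '%Access Denied%' matches those log lines, here is a small sketch of SQL LIKE semantics (% matches any run of characters, _ matches any single character) translated to a regular expression. This illustrates the pattern language only; it is not Log Parser’s matcher, and the helper names are mine:

```python
import re

def like_to_regex(pattern):
    # Translate a SQL LIKE pattern into an anchored regular expression:
    # '%' -> '.*' (any run of characters), '_' -> '.' (one character),
    # everything else is matched literally.
    parts = []
    for ch in pattern:
        if ch == "%":
            parts.append(".*")
        elif ch == "_":
            parts.append(".")
        else:
            parts.append(re.escape(ch))
    return re.compile("^" + "".join(parts) + "$", re.DOTALL)

def like_match(text, pattern):
    return like_to_regex(pattern).match(text) is not None
```

The leading and trailing % are what let the pattern match ‘Access Denied’ anywhere in a line, regardless of the process name and path that precede it.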
Given the benefits of query files, I will use this method in the following scenarios. I keep all my queries saved in C:\LPQ with a .sql extension (you can use your own folder).