Log file searching using PowerShell

I recently had to do some IIS log file parsing to solve a performance problem and it turned out that PowerShell was a very useful tool for doing this. I thought it might be a good idea to share a few one-liners for the benefit of others (as well as myself the next time I need to do it).

Here we go:

Get-Item *.log | Where-Object { $_.LastWriteTime -gt "2011-09-01" } | Select-String 'foobar'

The above line retrieves a collection of files filtered so that only files with the “.log” extension are matched. It then checks that the last modification time is later than a given date and finally selects and displays all content lines containing the string “foobar”.

To make it search for files recursively in sub-folders, switch the Get-Item cmdlet for something like Get-ChildItem . -recurse. For example:

Get-ChildItem . -recurse -filter *.log | Select-String 'foobar' | Select-Object -property Path, Line

(Note that in this example we also limit the display to the (file) Path and (matched) Line properties to make the result easier to read.)

If you want to analyze the results in detail it’s probably a good idea to output the results to a file:

Get-Item *.log | Where-Object { $_.LastWriteTime -gt "2011-09-01" } | Select-String 'foobar' | Add-Content foobarsearchresult.txt

Note that we use the Add-Content cmdlet instead of redirecting with the > operator, since redirection would wrap long lines with newlines at the console width position in the file.
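If you prefer redirection-style output, Out-File with an explicit -Width also avoids the console-width wrapping (a sketch; the width value 4096 is an arbitrary choice, large enough for typical log lines):

```powershell
# Out-File wraps lines at -Width characters; a large value avoids
# the console-width wrapping you get with the > operator.
Get-Item *.log |
  Where-Object { $_.LastWriteTime -gt "2011-09-01" } |
  Select-String 'foobar' |
  Out-File foobarsearchresult.txt -Width 4096
```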

To display the number of matched lines in the files (assuming we created several search result files above), we can do things like this:

Get-Item *.txt | % { $_.Name; Get-Content $_ | Measure-Object }

The result will be similar to this:

Count    : 10
Average  :
Sum      :
Maximum  :
Minimum  :
Property :

Count    : 122
Average  :
Sum      :
Maximum  :
Minimum  :
Property :

Apparently we had 10 lines in the ‘searchresult_foo.txt’ file and 122 in ‘searchresult_bar.txt’. This was enough to find my problem (which happened to be an overambitious SharePoint search crawl, but that’s another story…).
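As a side note, the intermediate result files are not strictly necessary: Select-String emits one match object per hit, each carrying a Path property, so grouping on Path gives a per-file match count directly (a sketch of the idea):

```powershell
# Count the 'foobar' matches per log file directly,
# without writing intermediate search result files.
Get-ChildItem . -Recurse -Filter *.log |
  Select-String 'foobar' |
  Group-Object Path |
  Select-Object Count, Name
```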

Before I stopped, I couldn’t resist creating a slightly prettier line count report:

Get-Item *.txt | % { Write-Host ("{0,-50} " -f $_.Name) -NoNewLine; Get-Content $_ | Measure-Object | % { "{0,10} lines" -f $_.Count } }

The result is now:

searchresult_foo.txt                                       10 lines
searchresult_bar.txt                                      122 lines

Besides giving prettier results, this script also prints the name of each file before actually counting the lines in it. This gives nice feedback to the user when the files are large and the counting takes time.

PowerShell is really becoming an increasingly important tool in my toolbox…


Detecting unhandled errors in ASP.NET applications

For my own reference, here’s how to detect unhandled exceptions in Global.asax:

protected void Application_Error(object sender, EventArgs e)
{
    // Log the error
    Exception ex = Server.GetLastError().GetBaseException();
    string msg = String.Format("Unhandled error!\r\nAddress: {0}\r\nError message: {1}", Request.Url, ex.Message);
    LogHelper.Error(msg, ex);

    // Clear the error so that ASP.NET does not show its default error page
    Server.ClearError();

    // Display a simple error page instead
    Response.Write("An unexpected error has occurred...\r\n");
}

We mostly do it like this; if you have a better suggestion, please post a comment 🙂