Category: Tools

Computer Forensics How-To: Microsoft Log Parser

As any incident responder will agree, you can never have too many logs. That is, of course, until you have to analyze them! I was recently on an engagement where our team had to review hundreds of gigabytes of logs looking for evidence of hacking activity. I was quickly reminded of how much I love Microsoft Log Parser.

Log Parser is often misunderstood and underestimated. It could possibly be the best forensic analysis tool ever devised. Imagine having the ability to take almost any chunk of data and quickly search it using SQL-based grammar. That’s Log Parser in a nutshell. It is a lightweight SQL-based search engine that operates on a staggering number of different input types (see Figure 1). Yes, I know that tools like Splunk and Sawmill are built around this same idea, but keep in mind that Log Parser was written in the year 2000. I am constantly amazed at the power it affords the forensic analyst, and you can’t beat the price (free). Save perhaps memory analysis, there isn’t much it can’t accomplish for an incident responder.

Figure 1: Architecture Diagram from Log Parser Documentation

In my mind, two things have limited the use of Log Parser in the forensics community: the command-line requirement and the fear of SQL queries. Neither is much of an obstacle, and since this is a how-to, let’s debunk both.

Log Parser GUI

Log Parser’s command-line isn’t particularly onerous, but when staring at logs all day, I’m not afraid to admit that I prefer a GUI. There are several free options available, but I find Log Parser Lizard to be head and shoulders above the competition [1]. A few notable features of Log Parser Lizard:

  • Abstracts away from command line parameters allowing the user to focus solely on SQL queries
  • Allows column sorting, showing different views of the data without re-running the query (a big time saver when working with gigabytes of logs)
  • Includes an advanced grid option that gives Excel-like filtering capabilities and the ability to compute Average, Count, Max, Min, and Sum on the fly
  • Simple interface for building charts
  • Tabbed results allow multiple queries to be run and compared
  • Contains a repository for saved queries, allowing you to organize your collection

I find the last feature to be especially helpful because every incident is different, and I frequently tweak queries. It is nice to be able to look through my archive or save a new one for future use. I use an “Examples” folder to save interesting solutions so I can refer back to them when building complicated searches.

Figure 2: Saved Queries Organized by Log Parser Lizard

SQL Query Basics

The Internet is rife with excellent examples of Log Parser queries. I’ll cover a few here and provide some links to more comprehensive lists [2] [3] [4]. To really learn Log Parser I recommend grabbing some sample data, doing a Google search, and just playing with whatever queries strike your fancy. Like any computer language, there are multiple ways to achieve the same results, and taking the time to understand different queries is a quick way to learn the various functions and syntax. Do not be overwhelmed — you can create very powerful queries with a very limited Log Parser vocabulary. As an example, consider the following query:

SELECT EXTRACT_EXTENSION(cs-uri-stem) as Extension,
Count(*) as Total
FROM [IIS logs]
GROUP BY Extension
ORDER BY Total DESC
I often run this query because it gives me a quick view of the different file types that were requested from the web server. Breaking this down into its components, the SELECT clause tells Log Parser what elements of the log file we wish to display. Cs-uri-stem is an IIS log field that records the page requested from the web server [5]. The FROM clause tells Log Parser what the inputs will be. SELECT and FROM are the only required elements of a query. The GROUP BY clause is necessary when using an aggregate function, like “Count”, to give the total requests for each extension. Finally, the ORDER clause is optional but tells Log Parser to order the displayed results according to the value of Total in descending order (DESC).

Figure 3: Log Parser Output Showing File Extension Counts from IIS

The output in Figure 3 gives me a good starting point for my review. Knowing the multitude of CGI vulnerabilities that exist, I would certainly want to look deeper there. Similarly, I would also plan to investigate what .pl and .exe files are being accessed on the web server. The next step is to run a follow-up query:

SELECT EXTRACT_EXTENSION(cs-uri-stem) as Extension,
sc-status as StatusCode,
Count(*) as Attempts
FROM [IIS logs]
WHERE Extension = 'cgi'
GROUP BY Extension, StatusCode
ORDER BY Attempts DESC

Figure 4: Log Parser Output Showing CGI Extensions by HTTP Status Code

I added two items to this query. The first, sc-status, provides the HTTP status code for the request, indicating whether the web requests were successful (200s) or unsuccessful (typically 400s and 500s) [6]. The second addition is the WHERE clause, giving the ability to filter my results. In this case, I indicated I only wanted to see the count of status codes for files with a CGI extension. The WHERE clause is incredibly helpful for culling output and is the backbone of many Log Parser queries. Looking at the results in Figure 4, I can see there were no successful requests for CGI files on this server. They were either not found (404) or the server refused to respond to the request (403).

A final action might be to take a look at some of the CGI queries to determine whether the errors were due to misconfigurations or nefarious activity. Since I want to see all fields from the logs related to CGI files, my query will be quite simple (* indicates all fields):

SELECT *
FROM [IIS logs]
WHERE EXTRACT_EXTENSION(cs-uri-stem) = 'cgi'

Figure 5: Log Parser Output Listing Requests for CGI files

A quick review of the results in Figure 5 shows requests for several suspicious CGI files as well as a browser user agent of “Nikto”. Based on this information, I can surmise that this web server was scanned using the Nikto vulnerability scanner on 10/13/10 at 1:03:28 UTC.

The key takeaway is that during a log review, you will be running multiple queries to cut across a massive amount of data. By slicing the data in different ways, you have a much better chance of finding anomalous or malicious activity than if you were to attempt to review the logs manually.

Figure 6: Parsing the Registry

Using Log Parser to Query the Windows Registry

Log Parser has a myriad of uses other than just parsing text files. The Windows Registry is a great example of a very large binary file that Log Parser can natively search. Figure 6 shows an example of sorting the Registry by LastWriteTime. In this case, I asked Log Parser to return the Path, KeyName, ValueName, Value, and LastWriteTime of any Registry entry updated between 11/1/10 and 11/6/10 from the HKLM, HKCU, HKCC, HKCR, and HKU hives. This system was suspected of being compromised at the beginning of November, and we were looking for any changes precipitated by the intruders. Among other things, the results make it clear that WinSCP was installed on the system during that timeframe.
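For reference, a sketch of what that query might look like. The field names come from Log Parser's REG input format; the exact date boundaries and machine name are assumptions based on the scenario described above:

```sql
-- Sketch only: registry entries modified in the window of interest, newest first.
-- Path, KeyName, ValueName, Value, and LastWriteTime are REG input-format fields;
-- the machine name and date range are taken from the investigation described above.
SELECT Path, KeyName, ValueName, Value, LastWriteTime
FROM \HYDL56\HKLM;\HYDL56\HKCU;\HYDL56\HKCC;\HYDL56\HKCR;\HYDL56\HKU
WHERE LastWriteTime >= TIMESTAMP('2010-11-01', 'yyyy-MM-dd')
  AND LastWriteTime <= TIMESTAMP('2010-11-06', 'yyyy-MM-dd')
ORDER BY LastWriteTime DESC
```

Run with the registry input format (e.g. `logparser -i:REG "..."` on the command line).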

You might have noticed in my query that I specified a machine name, \HYDL56, for each hive. This notation allows querying of remote machines over the network. It is particularly useful if you are searching multiple systems for a specific indicator of compromise. Alternatively, I could have run the same query on the local machine by just specifying the hives of interest (HKLM, HKCU, …). This is a good example of when the command line version can be helpful, particularly when built into live response scripts.

Unfortunately I am not aware of any easy way to use Log Parser to query offline Registry files that we might pull from a forensic image. The current version of Log Parser does not accept offline Registry files as input. If you were truly motivated, you could extract data from the Registry hives in text form and pipe to Log Parser, but it would need to be a special case to be worth the effort.

Usage Tips

1. Start with high-level queries, and view your logs from many different perspectives

Reviewing HTTP status codes, looking for excessively long URI stems and queries, and searching for known bad keywords like “xp_cmdshell” are all excellent ways to identify SQL injection. By looking for the same attacks in different ways, you increase your chances of finding that needle in the haystack.
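A sketch of one such hunt; the 250-character length threshold and the single keyword are illustrative choices, not fixed rules:

```sql
-- Flag long or keyword-laden query strings that may indicate SQL injection.
-- The 250-character threshold and keyword are illustrative assumptions;
-- tune both to your environment.
SELECT c-ip, cs-uri-stem, cs-uri-query, COUNT(*) AS Hits
FROM [IIS logs]
WHERE STRLEN(cs-uri-query) > 250
   OR TO_LOWERCASE(cs-uri-query) LIKE '%xp_cmdshell%'
GROUP BY c-ip, cs-uri-stem, cs-uri-query
ORDER BY Hits DESC
```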

2. Use the principle of Least Frequency of Occurrence

Malicious activity on your system is by definition anomalous and will usually be some of the least frequent events on a system. Use Log Parser to trend activity such as hourly hits to critical .aspx pages and look for data that stands out. If you see thousands of 404 errors in your logs and only a few 403 errors, or a grouping of abnormal entries at 1AM on Saturday, those items might be worth investigating.
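For example, a trending query along these lines; the .aspx filter is just one choice of "critical page":

```sql
-- Bucket hits to .aspx pages by hour; sparse buckets surface anomalies.
-- QUANTIZE(time, 3600) rounds each timestamp down to the hour.
SELECT date, QUANTIZE(time, 3600) AS Hour, cs-uri-stem, COUNT(*) AS Hits
FROM [IIS logs]
WHERE TO_LOWERCASE(cs-uri-stem) LIKE '%.aspx'
GROUP BY date, Hour, cs-uri-stem
ORDER BY Hits ASC
```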

3. Request more data elements than you think you need

Oftentimes a more in-depth investigation can be avoided with just a little more information. As an example, adding the web request query string (cs-uri-query) is sometimes much more helpful than reviewing the page requested (cs-uri-stem) alone (Figure 7).

Figure 7: Extra Fields Can Make a Big Difference

4. Get familiar with the built-in functions

Log Parser includes 80+ supporting functions that can greatly simplify queries. I used EXTRACT_EXTENSION in the examples above, and there are many others like EXTRACT_PATH, EXTRACT_FILENAME, STRLEN, TO_LOWERCASE, etc. [7]
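A quick illustration of a few of these functions applied to the same field (the output aliases are mine):

```sql
-- Built-in functions can slice a single field several ways in one pass.
SELECT EXTRACT_PATH(cs-uri-stem) AS Dir,
       EXTRACT_FILENAME(cs-uri-stem) AS File,
       STRLEN(cs-uri-stem) AS UrlLength,
       TO_LOWERCASE(cs-uri-stem) AS NormalizedUrl
FROM [IIS logs]
```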

5. Take advantage of the copious documentation available

I have only touched on a few of Log Parser’s capabilities. It can slice and dice Event Logs (both .EVT and .EVTX) with aplomb. You can perform complicated searches of a live file system, including using functions like HASHMD5_FILE to compare MD5 hashes. Remote systems can be queried and large scale searches of Active Directory objects can be performed. Once you learn the basics, its power is really only limited by your creativity. Log Parser installs with excellent documentation, and there is even an entire book on the subject [8].
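As a quick sketch of the file-system capability (run with the FS input format via `-i:FS`; the directory and wildcard are only an example):

```sql
-- Hash executables on a live file system with the FS input format.
-- Path and Size are FS input-format fields; the directory is an example.
SELECT Path, Size, HASHMD5_FILE(Path) AS MD5
FROM C:\Windows\System32\*.exe
```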


[1] Log Parser Lizard. If you like the tool I recommend paying $10 for the “Pro” version to encourage future development!

[2] Forensic Log Parsing with Microsoft’s LogParser by Mark Burnett. This is an extremely good article covering incident response on IIS servers.

[3] How To Analyze IIS logs with example SQL code. Numerous examples of SQL queries.

[4] Dave Kleiman did an excellent post to the SANS blog showing how to use Log Parser for USB device information retrieval

[5] W3C IIS Fields

[6] HTTP Status Codes

[7] Log Parser Functions

[8] Microsoft Log Parser Toolkit book (Gabriele Giuseppini). Trying to cover even a fraction of Log Parser’s functionality in a blog post is daunting because the topic is much better suited to a technical reference. Giuseppini is the tool author and he and his co-authors do a superb job of teaching it using easy to follow examples.

Chad Tilbury, GCFA, has spent over ten years conducting computer crime investigations ranging from hacking to espionage to multi-million dollar fraud cases. He teaches FOR408 Windows Forensics and FOR508 Advanced Computer Forensic Analysis and Incident Response for the SANS Institute. Find him on Twitter @chadtilbury.



Log Parser: A Powerful Log Analyser Made Easy

I’ve recently been playing about with Request Filtering which is an interesting add-on for IIS7 onwards. I’ll blog about that soon, but I have had so much fun as a result of switching on Request Filtering I thought I’d write about some of the consequences first.

It allows you to set up rules, so certain requests are filtered out before they even get to your site, resulting in less load and a supposedly safer system. Let’s say we want to filter out a URL or query string that contains the word “script”.

I won’t go into the mechanics of Request Filtering yet, but let’s just say that the filter returns a 404 to the blocked request.

To the attacker, a 404 just means not found, but in the IIS logs we can discover the subcode. Anyone who has taken a look into an IIS log knows that they can be hell. They’re long, text based, and not really meant for human consumption.

Step Forward, Log Parser Lizard

Microsoft released a command-line tool way back in the mid-2000s called Log Parser that used SQL-style syntax to query text-based log files. Log Parser Lizard takes this a step further by adding a really simple GUI.

It’s very cheap too.

What Logs Can It Read?
It can read text-based log files such as IIS, HTTP, Event, and SQL Error logs, and plenty more besides. Some logs need a bit more effort on your part to configure the file (e.g. SQL logs), but once it works you are able to save your query for future use.

You can see from this image that there are actually loads of logs that it can use straight out of the box:

Types of log


404 Statuses

This IIS log has several 404 statuses with three different sub-statuses. To find these manually in a large log would be near impossible. This is the beauty of Log Parser Lizard. It will take huge files and find what you’re looking for really quickly.

Query to discover the frequency of the substatus

SELECT sc-status AS Status,
       sc-substatus AS SubStatus,
       COUNT(sc-substatus) AS SubStatusCount,
       CASE sc-substatus
      WHEN 1 THEN 'Site Not Found'
      WHEN 2 THEN 'ISAPI or CGI restriction'
      WHEN 3 THEN 'MIME type restriction'
      WHEN 4 THEN 'No handler configured'
      WHEN 9 THEN 'File attribute hidden'
      WHEN 16 THEN 'DAV request sent to the static file handler'
      WHEN 17 THEN 'Dynamic content mapped to the static file handler via a wildcard MIME mapping'
      WHEN 20 THEN 'Too Many URL Segments'
      --substatuses 5-19 below relate to Request Filtering
      WHEN 5 THEN 'URL Sequence Denied'
      WHEN 6 THEN 'Verb Denied'
      WHEN 7 THEN 'File extension denied'
      WHEN 8 THEN 'Hidden Namespace'
      WHEN 10 THEN 'Request Header Too Long'
      WHEN 11 THEN 'URL Double Escaped'
      WHEN 12 THEN 'URL Has High Bit Chars'
      WHEN 13 THEN 'Content Length Too Large'
      WHEN 14 THEN 'URL Too Long'
      WHEN 15 THEN 'Query String Too Long'
      WHEN 18 THEN 'Query String Sequence Denied'
      WHEN 19 THEN 'Denied by Filtering Rule'
     END AS SubStatusDescription
FROM #IISW3C#
WHERE sc-status = 404
GROUP BY Status, SubStatus
ORDER BY Status


Results grid

Here we can see the sub-statuses, the count and the description. A full list of sub-statuses and their description can be found here

Other Queries

The following is a list of queries that have been gathered together from various sources. Some I have written, most have been written by others, but all are useful. Give them a go and start to love parsing logs.

To get full use of the logs, ensure that all relevant fields are being saved into the log (by default, many fields are not saved). See here to configure logging.

Bytes Per File Extension

SELECT EXTRACT_EXTENSION(cs-uri-stem) AS Extension,
MUL(PROPSUM(sc-bytes),100.0) AS PercentageOfBytes,
DIV(SUM(sc-bytes),1048576) AS AmountOfMB
FROM #IISW3C#
GROUP BY Extension
ORDER BY PercentageOfBytes DESC

Win32 Error Codes and Description

SELECT sc-win32-status AS Win32-Status,
WIN32_ERROR_DESCRIPTION(sc-win32-status) AS Description,
COUNT(*) AS Hits
FROM #IISW3C#
WHERE Win32-Status<>0
GROUP BY Win32-Status
ORDER BY Win32-Status ASC

Blocked File Extensions (404.7)

SELECT EXTRACT_EXTENSION(cs-uri-stem) AS Extension, COUNT(*) AS Hits
FROM #IISW3C#
WHERE sc-status = 404 AND sc-substatus = 7   --file extension denied
GROUP BY Extension
ORDER BY Extension DESC

Files Accessed Over the Web (HTTP 200)

SELECT TO_LOWERCASE(cs-uri-stem) AS URL, COUNT(*) AS Hits
FROM #IISW3C#
WHERE sc-status=200
GROUP BY URL
ORDER BY Hits DESC

Overused Scripts by Day
A script (e.g. js or aspx) that is highly overused may be suspicious.

SELECT TO_STRING(TO_TIMESTAMP(date, time), 'yyyy-MM-dd') AS Day,
cs-uri-stem, COUNT(*) AS Total
FROM #IISW3C#
WHERE (sc-status<400 OR sc-status>=500) AND (TO_LOWERCASE(cs-uri-stem) LIKE '%.asp%'
OR TO_LOWERCASE(cs-uri-stem) LIKE '%.exe%'
OR TO_LOWERCASE(cs-uri-stem) LIKE '%.js%')
GROUP BY Day, cs-uri-stem
ORDER BY cs-uri-stem

High Number of Hits to Single Page by One IP

--possible evidence of attacks 
SELECT date, cs-uri-stem, c-ip, COUNT(*) AS Hits
FROM #IISW3C#
GROUP BY date, c-ip, cs-uri-stem
HAVING Hits>50
ORDER BY Hits DESC

Excessive Errors

--This may be a sign of attack
SELECT date, QUANTIZE(time, 3600) AS Hour, sc-status, sc-substatus, COUNT(*) AS Errors
FROM #IISW3C#
WHERE sc-status>=400
GROUP BY date, Hour, sc-status, sc-substatus
HAVING Errors>25

Excessive HTTP 500

--This may be a sign that SQL injection is being attempted or exploited.
SELECT cs-uri-query, COUNT(*) AS Total
FROM #IISW3C#
WHERE sc-status>=500
GROUP BY cs-uri-query
ORDER BY Total DESC

Non Standard UserAgents

--Possible attack by automated tools or other hacker tools
SELECT DISTINCT cs(User-Agent)
FROM #IISW3C#
WHERE TO_LOWERCASE(cs(User-Agent)) NOT LIKE '%mozilla%' AND TO_LOWERCASE(cs(User-Agent)) NOT LIKE '%opera%'
ORDER BY cs(User-Agent)

Bytes Sent To Client

SELECT cs-uri-stem, COUNT(*) AS Hits, AVG(sc-bytes) AS Avg, MAX(sc-bytes) AS Max, MIN(sc-bytes) AS Min, SUM(sc-bytes) AS Total
FROM #IISW3C#
WHERE TO_LOWERCASE(cs-uri-stem) LIKE '%.asp%'
OR TO_LOWERCASE(cs-uri-stem) LIKE '%.exe%'
GROUP BY cs-uri-stem
ORDER BY cs-uri-stem

HTTP Errors per Hour

SELECT date, QUANTIZE(time, 3600) AS Hour, sc-status AS Status, COUNT(*) AS Errors
FROM #IISW3C#
WHERE (sc-status >= 400)
GROUP BY date, Hour, sc-status
HAVING (Errors > 25)

HTTP errors ordered by URL and Status

SELECT cs-uri-stem AS Url, sc-status AS Status, COUNT(*) AS Errors
FROM #IISW3C#
WHERE (sc-status >= 400)
GROUP BY Url, Status

Page Hits By IP Address

SELECT c-ip, COUNT(c-ip) AS RequestCount
FROM #IISW3C#
WHERE cs-uri-stem LIKE '%.aspx%'
GROUP BY c-ip
ORDER BY COUNT(c-ip) DESC

Requests By IP Address

SELECT c-ip, COUNT(c-ip) AS RequestCount
FROM #IISW3C#
GROUP BY c-ip
ORDER BY COUNT(c-ip) DESC

Top Browsers

SELECT TO_INT(MUL(100.0,PROPCOUNT(*))) AS Percent, COUNT(*) AS TotalHits, cs(User-Agent) AS Browser
FROM #IISW3C#
GROUP BY Browser
ORDER BY TotalHits DESC

Top Images – Hits, Time & bandwidth

SELECT TOP 100 STRCAT(EXTRACT_PATH(TO_LOWERCASE(cs-uri-stem)),'/') AS RequestedPath,
EXTRACT_FILENAME(TO_LOWERCASE(cs-uri-stem)) AS RequestedFile, COUNT(*) AS Hits, MAX(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime, MAX(sc-bytes) AS BytesSent
FROM #IISW3C#
WHERE (EXTRACT_EXTENSION(TO_LOWERCASE(cs-uri-stem)) IN ('gif';'jpg';'png';'svg')) AND (sc-status = 200)
GROUP BY RequestedPath, RequestedFile
ORDER BY BytesSent, Hits, MaxTime DESC

Broken Links on Sites That Refer To Your Site

SELECT DISTINCT cs(Referer) AS Referer, cs-uri-stem AS Url
FROM #IISW3C#
WHERE cs(Referer) IS NOT NULL AND
sc-status = 404 AND
(sc-substatus IS NULL OR sc-substatus=0)

Part 2 of this article is here


