{"id":606,"date":"2004-06-21T20:37:44","date_gmt":"2004-06-22T00:37:44","guid":{"rendered":"http:\/\/wordpress.cephas.net\/?p=606"},"modified":"2004-06-21T20:37:44","modified_gmt":"2004-06-22T00:37:44","slug":"microsoft-log-parser-in-action","status":"publish","type":"post","link":"https:\/\/cephas.net\/blog\/2004\/06\/21\/microsoft-log-parser-in-action\/","title":{"rendered":"Microsoft Log Parser in action"},"content":{"rendered":"<p>I mentioned <a href=\"http:\/\/www.microsoft.com\/downloads\/details.aspx?displaylang=en&amp;familyid=8cde4028-e247-45be-bab9-ac851fc166a4\">Microsoft Log Parser<\/a> a <a href=\"http:\/\/cephas.net\/blog\/2004\/03\/02\/web_server_log_parser.html\">couple of months back<\/a> but never had a chance to actually use it until last Friday, when my boss needed to know exactly how many times a certain type of file had been accessed on our cluster of web servers since the beginning of the year.  We have <a href=\"http:\/\/www.netiq.com\/webtrends\/default.asp\">Webtrends<\/a>, but from what I&#8217;ve seen of it, it&#8217;s made for presenting a 30,000-foot view of a website, not for getting granular information about a specific URL or subset of a URL. In addition, WebTrends typically breaks reports down into weekly or monthly views, which again was not what I needed in this case.<\/p>\n<p>To make a long story short, after downloading and installing Log Parser, the command line to get what I needed into a CSV file (called result.txt, written to {installation dir}\result.txt) was this:<br \/>\n<code><br \/>\n&gt; logparser \"select distinct cs-uri-stem, count(*) FROM D:\logfiles\*.log TO result.txt WHERE cs-uri-stem LIKE '\/images\/mydir\/%.swf' GROUP BY cs-uri-stem\" -i:IISW3C -o:CSV<br \/>\n<\/code><br \/>\nI&#8217;ll unpack that a bit.  &#8216;logparser&#8217; is the executable you&#8217;re running from the command line; make sure that you CD to the directory where LogParser.exe lives (for me it was C:\program files\log parser\LogParser.exe). 
 The second part is the SQL query: <\/p>\n<ul>\n<li>cs-uri-stem is one of the approximately 33 fields available in the IISW3C log file format<\/li>\n<li>distinct and count(*) are just two of the many SQL keywords and functions that Log Parser supports<\/li>\n<li>D:\logfiles\*.log is the path to the log files I want to query (and acts much like a database table as far as the SQL goes)<\/li>\n<li>TO result.txt is the endpoint to which I want to pipe the results; you can omit this and have the results printed directly to the command line, but I needed the data piped to a file<\/li>\n<li>WHERE &#8230; notice that Log Parser supports the LIKE keyword as well as GROUP BY<\/li>\n<li>and finally, the <code>-i<\/code> switch indicates the format of the log files I&#8217;m analyzing and <code>-o<\/code> is the format I&#8217;d like the results printed in<\/li>\n<\/ul>\n<p>There were a couple of things that initially stumped me. First, it doesn&#8217;t appear (from trial and error) that Log Parser can handle zipped log files, so I had to unzip all the log files. That could have been a problem: a zipped log file in our environment is usually about 3MB, but unzipped it can reach 200MB (and that&#8217;s per day&#8230; I needed the last six months).  Luckily I had enough disk space this time, but next time I might not.  Second, Log Parser seemed to guess the log file format the first time I ran it, but on the second run it required that I specify the format using the <code>-i<\/code> command line switch. <\/p>\n<p>All said and done, I&#8217;d highly recommend adding Log Parser to your tool belt.  I didn&#8217;t even mention that it can export to a variety of formats (W3C-formatted text files, IIS log format text files, directly to a database, XML, CSV, or even your own custom log file format) or that it can be scripted using the included LogParser.dll. 
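To give one more flavor of the command line, here&#8217;s a variant of the query above (same paths and switches as before; the AS alias and console output are purely illustrative, not something I ran against our logs) that omits the TO clause so the counts print straight to the console:<\/p>\n<p><code><br \/>\n&gt; logparser \"SELECT cs-uri-stem, COUNT(*) AS hits FROM D:\logfiles\*.log WHERE cs-uri-stem LIKE '\/images\/mydir\/%.swf' GROUP BY cs-uri-stem\" -i:IISW3C<br \/>\n<\/code><br \/>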
If you&#8217;re curious, <a href=\"http:\/\/www.microsoft.com\/downloads\/details.aspx?displaylang=en&amp;familyid=8cde4028-e247-45be-bab9-ac851fc166a4\">download it now<\/a> and then dive into the included documentation (LogParser.doc).<\/p>\n","protected":false},"excerpt":{"rendered":"<p>I mentioned Microsoft Log Parser a couple of months back but never had a chance to actually use it until last Friday when my boss needed to know exactly how many times a certain type of file had been accessed on our cluster of web servers since the beginning of the year. We have Webtrends, but &hellip; <a href=\"https:\/\/cephas.net\/blog\/2004\/06\/21\/microsoft-log-parser-in-action\/\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">Microsoft Log Parser in action<\/span> <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[12],"tags":[],"_links":{"self":[{"href":"https:\/\/cephas.net\/blog\/wp-json\/wp\/v2\/posts\/606"}],"collection":[{"href":"https:\/\/cephas.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cephas.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/cephas.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/cephas.net\/blog\/wp-json\/wp\/v2\/comments?post=606"}],"version-history":[{"count":0,"href":"https:\/\/cephas.net\/blog\/wp-json\/wp\/v2\/posts\/606\/revisions"}],"wp:attachment":[{"href":"https:\/\/cephas.net\/blog\/wp-json\/wp\/v2\/media?parent=606"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cephas.net\/blog\/wp-json\/wp\/v2\/categories?post=606"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cephas.net\/blog\/wp-json\/wp\/v2\/tags?post=606"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}