Now that SP2 is out in the wild, I’m starting to look more at what’s happening with SenderID (by the way, if you haven't seen it already, there is a great post on SenderID on the Exchange Team Blog). I wanted a way to aggregate the information stored in the SMTP protocol logs so that, for each domain sending me email, I could see the IP addresses of the mail servers and how many emails I have received from each IP, over a given time period (say the last 1-2 hours). I’ve had the beta of Monad, the next version of the Windows shell that will be in Vista (maybe) and E12 (downloadable from here), installed on my machine for a while, and this seemed like a good task to take it out on for a test drive. The main advantage of Monad from my point of view is being able to get at all the objects in the .NET Framework, which means you can finally use hashtables in your scripts (Perl users have had this for years). Hashtables are very versatile objects to use in scripts and are perfect for the sort of aggregation I wanted to do. Adam Barr has posted a very good example of using nested hashtables that helped a lot in working out how to get this to work.
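The nested-hashtable pattern itself is language-neutral; a minimal sketch of it in Python (the sample domains and IPs are made up for illustration) looks like this:

```python
# Nested hashtables sketched in Python (the pattern is the same in Monad):
# the outer table is keyed by sending domain, and each value is an inner
# table keyed by source IP that holds a running message count.
def tally(records):
    """records: iterable of (domain, ip) pairs, one per MAIL FROM event."""
    domains = {}
    for domain, ip in records:
        inner = domains.setdefault(domain, {})
        inner[ip] = inner.get(ip, 0) + 1
    return domains

counts = tally([("example.com", "10.0.0.1"),
                ("example.com", "10.0.0.1"),
                ("example.com", "10.0.0.2"),
                ("contoso.com", "192.0.2.5")])

# Display hierarchically: domain first, then each IP and its count indented.
for domain, ips in counts.items():
    print(domain)
    for ip, n in ips.items():
        print("  " + ip + " " + str(n))
```

The inner table gets created lazily the first time a domain is seen, which is exactly what the script below does with its ContainsKey checks.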
The script is relatively simple. It takes two command-line parameters: the first is the directory where the log files are (this can be a network drive, although be careful if you have really large log files) and the second is the number of hours you want to look back. The first couple of lines deal with taking in the parameters, and the next couple look in the directory for any files that were modified within the requested time period. The next part of the script parses each log file line by line; the split method is used to break each line into an array so each field can be processed as needed. Because I’m only interested in inbound traffic, there is an if statement to drop any outbound connections, and because I’m only interested in the “MAIL FROM” lines in the log file there are some further if statements, plus some code that does a time comparison so only the events within the requested time period are processed. Because the times in the log files are in UTC, there’s some code that does the UTC conversion (this is really cool compared to how you do it in VBS). The next part of the script aggregates the domains into one hashtable and then creates a nested hashtable to store each of the IP addresses that are sending for that domain, using a key derived from the IP address and domain name; it also counts the number of emails sent from each IP address. The last part of the script walks back through the hashtable and displays the data in a hierarchical format.
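The time-window check the script does is worth calling out: because the log timestamps are UTC, the comparison has to be against UTC "now" rather than local time. A small sketch of the same check in Python (the reference timestamp is a made-up example):

```python
from datetime import datetime, timedelta

def within_window(date_field, time_field, hours, now_utc):
    # The log's date and time fields are already UTC, so compare
    # against UTC "now" rather than local time.
    stamp = datetime.strptime(date_field + " " + time_field, "%Y-%m-%d %H:%M:%S")
    return stamp > now_utc - timedelta(hours=hours)

ref = datetime(2005, 10, 1, 14, 0, 0)  # pretend "now" (UTC) for the example
print(within_window("2005-10-01", "13:05:22", 2, ref))  # inside the 2-hour window
print(within_window("2005-10-01", "11:05:22", 2, ref))  # too old
```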
I’ve put a downloadable copy of the script here; the script itself looks like the following:
param([String] $LogDirectory = $(throw "Please specify a path for the Log Directory"),
      [int32] $timerange = $(throw "Please specify a Time Range in Hours"))

# Outer hashtable, keyed by sending domain; each value is a nested
# hashtable keyed by "IP/domain" that holds a message count
$reqhash1 = @{ }
$Di = New-Object System.IO.DirectoryInfo $LogDirectory
foreach($fs in $Di.GetFileSystemInfos()){
  # Only bother with files modified inside the requested window
  if ($fs.LastWriteTime -gt [DateTime]::Now.AddHours(-$timerange)){
    foreach ($line in $(Get-Content $fs.FullName)){
      # Skip the W3C header lines, which start with #
      if ($line.Substring(0,1) -ne "#"){
        $larry = $line.split(" ")
        # Inbound traffic only
        if ($larry[3] -ne "OutboundConnectionCommand"){
          # Only the MAIL (FROM) lines
          if ($larry[8] -eq "MAIL"){
            # Log timestamps are UTC, so compare against UtcNow
            $ltime = [System.Convert]::ToDateTime($larry[0] + " " + $larry[1])
            if($ltime -gt [DateTime]::UtcNow.AddHours(-$timerange)){
              # Pull the address out of the angle brackets, then the domain
              $femail = $larry[10].Substring($larry[10].IndexOf("<")+1,$larry[10].IndexOf(">")-$larry[10].IndexOf("<")-1)
              $fdomain = $femail.Remove(0, $femail.IndexOf("@")+1)
              if($reqhash1.ContainsKey($fdomain)){
                $hashtabedit = $reqhash1[$fdomain]
                if($hashtabedit.ContainsKey($larry[2] + "/" + $fdomain)){
                  # Seen this IP for this domain before - bump the count
                  $hashtabedit[$larry[2] + "/" + $fdomain] = $hashtabedit[$larry[2] + "/" + $fdomain] + 1
                }
                else{
                  $hashtabedit.Add($larry[2] + "/" + $fdomain,1)
                }
              }
              else{
                # First time we've seen this domain - create the nested hashtable
                $reqhash2 = @{ }
                $reqhash2.Add($larry[2] + "/" + $fdomain,1)
                $reqhash1.Add($fdomain,$reqhash2)
              }
            }
          }
        }
      }
    }
  }
}
# Walk the nested hashtables: domain first, then each IP and count indented
foreach ($htent in $reqhash1.keys){
  $htent
  $reqhash2 = $reqhash1[$htent]
  foreach ($htent1 in $reqhash2.keys){
    "    " + $htent1.Substring(0,$htent1.IndexOf("/")) + " " + $reqhash2[$htent1]
  }
}
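The fiddliest part of the parsing is pulling the client IP and sender domain out of each line. Sketched in Python below; the sample line and field positions are illustrative only, since the real layout depends on which W3C logging fields are enabled on the SMTP virtual server (the script assumes the method in field 8 and the address in field 10):

```python
# Hypothetical log line with the field layout the script assumes
line = ("2005-10-01 13:05:22 10.0.0.1 - - - - - "
        "MAIL - FROM:<user@example.com> 250")
fields = line.split(" ")

client_ip = fields[2]               # c-ip
is_mail_from = fields[8] == "MAIL"  # cs-method

# The address sits between the angle brackets; the domain follows the @
address = fields[10]
email = address[address.index("<") + 1 : address.index(">")]
domain = email[email.index("@") + 1 :]

print(client_ip, domain)  # 10.0.0.1 example.com
```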