
Displaying Details about Transaction Log files for Each Storage Group

Someone asked me about a script that would display details about the transaction log files for each storage group on all the Exchange servers in their domain. While I was initially puzzled as to why this might be useful, after a little thought the idea does bear some fruit. Note the idea here is to look at the file details (e.g. filename, date and size), not the content of the files, and not to open or lock the files in any way (which would be a bad thing). The general gist of the script is to enumerate the log file directory for each storage group and count how many log files there currently are for that SG. While you're counting them you can also add their sizes together to give you a figure for how much space your log files are taking up, and you can look at the age of the oldest log file, which can tell you whether your backups are working correctly and log files are being purged at the end of the backup cycle.

If you start to think a little more laterally, snapshotting this information at intervals during the day can also become another performance and usage indicator. For example, if you were to snap the size and number of your log files every hour you could start to build a picture of the volume of transactions your mail database was processing and what times it was busiest. Because transactions are more than just receiving and sending new mail, this can act as a point of difference from other indicators you might look at (remember, performance monitoring is more than just measuring one-dimensional metrics). You could even go as far as putting thresholds on these figures that might alert you to possible problems.
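As an aside, here is a minimal, untested sketch of what such an hourly snapshot might look like if you scheduled it with the Task Scheduler; the UNC path, log prefix and CSV filename are made-up examples and this isn't part of the main script below, just an illustration of the idea.

' Minimal sketch only (untested): append an hourly snapshot of one log
' directory's file count and total size to a CSV. The UNC path, log prefix
' and CSV filename below are made-up examples.
logPath = "\\servername\d$\exchsrvr\mdbdata"
logPrefix = "E00"
csvFile = "c:\temp\logsnapshots.csv"
snapCount = 0
snapSize = 0
Set fso = CreateObject("Scripting.FileSystemObject")
For Each lfile In fso.GetFolder(logPath).Files
    If Left(lfile.name,3) = logPrefix And Right(lfile.name,3) = "log" Then
        snapCount = snapCount + 1
        snapSize = snapSize + lfile.size
    End If
Next
Set ts = fso.OpenTextFile(csvFile, 8, True) ' 8 = ForAppending, create if missing
ts.WriteLine Now & "," & snapCount & "," & FormatNumber(snapSize/1048576,2,0,0,0)
ts.Close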

This script is generally based around the database file size script I posted here. I've done two versions of the script; they both use an ADSI query to retrieve all the storage group objects within a domain and then connect to the servers based on information parsed from a few AD properties on each storage group. The first script uses VBScript's FileSystemObject to connect to the administrative shares on a server and then uses the Folder and Files collections to read the file attributes. To make sure it only counts files that belong to the particular storage group, the msExchESEParamBaseName property is used; this property contains the name prefix used for each storage group's log files. The rest of the script just counts the number of files and adds their sizes together (and you're always going to get a total that's divisible by 5120 KB, because each transaction log file is 5 MB; if not, start panicking). The second version uses WMI to do much the same thing, which is useful if you have removed the admin shares on your servers. One note on the WMI version: if you have a daylight saving offset, the file times might be out by that offset.
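The WMI version isn't reproduced in full here, but a rough sketch of the approach (assuming a made-up server name and log path, using CIM_DataFile to enumerate the files and SWbemDateTime to convert the WMI timestamps) might look something like this; note the backslashes in the WQL Path filter have to be doubled:

' Rough sketch only: the server name, drive and path are examples, not the
' posted script. WQL needs the backslashes in the Path value doubled.
serverName = "servername"
logPrefix = "E00"
Set wmiSvc = GetObject("winmgmts:\\" & serverName & "\root\cimv2")
wql = "Select FileName, FileSize, LastModified From CIM_DataFile " & _
      "Where Drive = 'D:' And Path = '\\exchsrvr\\mdbdata\\' And Extension = 'log'"
lfcount = 0
lfsize = 0
Set swbemDate = CreateObject("WbemScripting.SWbemDateTime")
For Each wfile In wmiSvc.ExecQuery(wql)
    If Left(wfile.FileName,3) = logPrefix Then
        lfcount = lfcount + 1
        lfsize = lfsize + wfile.FileSize
        swbemDate.Value = wfile.LastModified      ' WMI datetime string
        modDate = swbemDate.GetVarDate(True)      ' convert to a local date
        If lfcount = 1 Then lfolddatenum = modDate
        If modDate < lfolddatenum Then lfolddatenum = modDate
    End If
Next
Wscript.echo lfcount & " log files, " & FormatNumber(lfsize/1048576,2,0,0,0) & _
    " MB, oldest " & lfolddatenum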

I've put a downloadable copy of the code here; the FSO version looks like this:

' Running totals for the current storage group
lfcount = 0
lfsize = 0
lfolddatenum = ""

' ADSI query that returns every storage group object in the configuration container
set conn = createobject("ADODB.Connection")
set com = createobject("ADODB.Command")
Set iAdRootDSE = GetObject("LDAP://RootDSE")
strNameingContext = iAdRootDSE.Get("configurationNamingContext")
Conn.Provider = "ADsDSOObject"
Conn.Open "ADs Provider"
sgQuery = "<LDAP://" & strNameingContext & ">;(objectCategory=msExchStorageGroup);" & _
          "name,distinguishedName,msExchESEParamBaseName," & _
          "adminDisplayName,msExchESEParamLogFilePath,msExchEDBFile;subtree"
Com.ActiveConnection = Conn
Com.CommandText = sgQuery
Set Rs = Com.Execute
Wscript.echo "Storage Groups"
Wscript.echo
While Not Rs.EOF
    ' Pull the server name out of the storage group's distinguishedName
    slen = instr(rs.fields("distinguishedName"),"CN=InformationStore,") + 23
    elen = instr(rs.fields("distinguishedName"),"CN=Servers,") - 1
    Set fso = CreateObject("Scripting.FileSystemObject")
    ' Build a UNC path to the log directory via the drive's administrative share
    logfileunc = "\\" & mid(rs.fields("distinguishedName"),slen,elen-slen) & "\" & _
        left(rs.fields("msExchESEParamLogFilePath").value,1) & "$" & _
        mid(rs.fields("msExchESEParamLogFilePath").value,3,len(rs.fields("msExchESEParamLogFilePath").value)-2)
    set lfolder = fso.getfolder(logfileunc)
    set lfiles = lfolder.files
    ' Count only the .log files whose prefix matches this storage group's base name
    for each lfile in lfiles
        if left(lfile.name,3) = rs.fields("msExchESEParamBaseName").value and _
           right(lfile.name,3) = "log" then
            lfcount = lfcount + 1
            lfsize = lfsize + lfile.size
            if lfcount = 1 then lfolddatenum = lfile.DateLastModified
            if lfolddatenum > lfile.DateLastModified then
                lfolddatenum = lfile.DateLastModified
            end if
        end if
    next
    wscript.echo "ServerName : " & mid(rs.fields("distinguishedName"),slen,elen-slen)
    wscript.echo "Storage Group Name : " & rs.fields("adminDisplayName")
    wscript.echo "Log file Path : " & rs.fields("msExchESEParamLogFilePath").value
    wscript.echo "Log file Prefix : " & rs.fields("msExchESEParamBaseName").value
    wscript.echo "Number of Log files in Directory : " & lfcount
    wscript.echo "Disk Space being used : " & formatnumber(lfsize/1048576,2,0,0,0) & " MB"
    wscript.echo "Oldest Log file in this Directory : " & lfolddatenum
    wscript.echo
    lfcount = 0
    lfsize = 0
    lfolddatenum = ""
    Rs.MoveNext
Wend
Rs.Close
Conn.Close
Set fso = Nothing
Set Rs = Nothing
Set Com = Nothing
Set Conn = Nothing
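To run either version, save the code to a .vbs file (the filename below is just an example) and run it with cscript so the output goes to the command prompt rather than a series of message boxes; redirect it to a text file if you want to keep a report:

cscript //nologo sgloginfo.vbs > sgloginfo.txt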
 
