Thursday, October 17, 2019

Doing Mailbox Change discovery with an EWS PowerShell Script

Mailbox Change discovery is the process of looking at any folders or items that are new or have been modified recently in a Mailbox. It's useful in a number of different ways, including (but not limited to)

  • Looking at what objects a third-party add-in is creating or modifying in your mailbox
  • Helping to work out which FAI (Folder Associated Item) is being modified when changes are made to the configuration in Outlook or Outlook on the Web (this can be useful if you then want to automate those changes in your own scripts)
  • Fixing client issues caused by corrupt or bad items (eg if you've ever used MFCMapi to delete an item that's causing a particular client function not to work correctly)
  • Getting an understanding of how the backend scaffolding of new features works in Outlook on the Web (eg looking at what the substrate is doing in Office365) 
If you have ever looked recently at the Non_IPM Root folder of an Office365 Mailbox, you can see from the large number of folders used by various apps, substrate processes and new client features that there is a lot going on. So this script can also give a bit of insight into what's happening in the background when you activate or use particular features (or potentially point you to the location in the Mailbox when you're looking at problems that might be occurring with certain features).
I'll go through a specific use case later looking at the contact favourite feature (which I struggled to even find the UI documentation for), which is what prompted me to write this script.

What this script does

The script has three main functions

  1. Enumerates every folder in the Mailbox (both the IPM and NON_IPM subtrees) as well as Search Folders and looks at the created and modified date of each folder. If they were created or modified within the lookback time, it adds them to the report
  2. It then does a scan of the Items in each folder (excluding Search Folders) and, if it finds any items that were modified or created after the lookback time, it adds these to the report
  3. It then does a scan of the FAI Items (Folder Associated Items) in each folder and, again, if the items were modified or created after the lookback time, it adds these to the report
The Output of the Report then contains information about what folders, Items and FAI Items have either been created or modified in the Mailbox in the last x number of seconds.
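
Under the covers, this kind of scan boils down to EWS FindItems calls with a search filter on the last modified time. The following is a minimal sketch of that filter only, not the actual script logic; it assumes the EWS Managed API is loaded and $service is an already authenticated ExchangeService.

$lookBackTime = (Get-Date).AddSeconds(-60)
$ivItemView = New-Object Microsoft.Exchange.WebServices.Data.ItemView(100)
# For the FAI scan the same call is run with $ivItemView.Traversal set to [Microsoft.Exchange.WebServices.Data.ItemTraversal]::Associated
$sfModified = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsGreaterThan([Microsoft.Exchange.WebServices.Data.ItemSchema]::LastModifiedTime, $lookBackTime)
$findResults = $service.FindItems([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox, $sfModified, $ivItemView)
$findResults.Items | Select-Object Subject, ItemClass, DateTimeCreated, LastModifiedTime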

An Example

The best way to demonstrate this is with an example, which was the reason I wrote the script. The Contact Favourite feature in Outlook on the Web gives you the ability to click the star next to a contact's name in OWA, which then creates a favourite shortcut eg


So I wanted to know, when you did this, where the favourite item gets created, what information it is storing and what other changes were happening. This is where the following script comes in handy: all I needed to do was favourite a contact and then run the script immediately afterwards to look at the items which changed in the Mailbox in the last 60 seconds. Eg a run of the script after I made the above change yielded a report that looked like



So from the above report you can see that firstly a new Search Folder was created 
\FavoritePersonas\Glen Scales_7d09f835-0028-4bd7-bed9-59535127bbe1 New SearchFolder



under the \FavoritePersonas\ directory and also an object of type SDS.32d4b5e5-7d33-4e7f-b073-f8cffbbb47a1.OutlookFavoriteItem was created under the folder

\ApplicationDataRoot\32d4b5e5-7d33-4e7f-b073-f8cffbbb47a1\outlookfavorites

The other thing that I included in the report was the EntryId of the item found, so in the above case I can take the EntryId for the outlookfavorite and open the item in a MAPI editor like OutlookSpy or MFCMAPI eg


And you can then see all the MAPI properties on the Item (or delete/export etc)



That's it; it's relatively simple to use eg

Invoke-MailboxChangeDiscovery -MailboxName mailbox@domain -secondstolookback 60

I've put this script up on GitHub here https://github.com/gscales/Powershell-Scripts/blob/master/ChangeDiscovery.ps1

Friday, October 04, 2019

Using the MSAL (Microsoft Authentication Library) in EWS with Office365

Last July Microsoft announced here that they would be disabling Basic authentication for EWS on October 13, 2020, which is now a little over a year away. Given the amount of time that has passed since the announcement, any line-of-business or third-party applications you use that had been using Basic authentication should have been modified or upgraded to support OAuth. If this isn't the case, the time to take action is now.

When you need to migrate a .NET app or script that uses EWS and Basic authentication, you have two authentication libraries to choose from

  1. ADAL - Azure AD Authentication Library (uses the v1 Azure AD Endpoint)
  2. MSAL - Microsoft Authentication Library (uses the v2 Microsoft Identity Platform Endpoint)
The most common library you will come across in use is ADAL because it's been around the longest, has good support across a number of languages and allows complex authentication scenarios with support for SAML etc. MSAL is the latest and greatest in terms of its support for oAuth2 standards and is where Microsoft are investing their future development efforts. A good primer for understanding the difference in terms of the tokens that both of these endpoints generate is https://docs.microsoft.com/en-us/azure/active-directory/develop/access-tokens

So which should you choose? If you're using PowerShell then ADAL is the easiest to use and there are a lot of good examples for it. However, from a long-term point of view the MSAL library can be a better choice as it's going to offer more supportability (new features etc) going forward, as long as you don't fall into one of the restrictions described in https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/wiki/Adal-to-Msal

In this post I'm going to look at using the MSAL library with EWS to access Mailboxes in Exchange Online. 

Scopes

One of the biggest differences when it comes to coding between the libraries is that with ADAL you specify the resource you're going to use, eg "https://outlook.office365.com", while with MSAL you specify the scopes you are going to use. With EWS it's relatively simple in that there are only two scopes (EWS doesn't allow you to constrain your access to different mailbox item types), which you would first need to allow in your application registration. They can be found in the Supported Legacy APIs section of the application registration (make sure you scroll right to the bottom).

Delegated Permissions


Application Permissions (where you're going to use AppOnly Tokens)

.default Scope

For v1 apps you can get all the static scopes configured in an application by using the .default scope, so for EWS that would look something like https://outlook.office365.com/.default. When you're using App Only tokens this becomes important.


App registration and Consent

One of the advantages of the MSAL library is dynamic consent, which for EWS doesn't have much use because in practice you're only going to be using one scope. However, if you're also going to be using other workloads you may be able to take advantage of that feature. For the app registration you need to use the v2 endpoint registration process (which is the default now in the Azure portal), see https://docs.microsoft.com/en-us/graph/auth-register-app-v2. This also makes it easy to handle the consent within a tenant.

Getting down to coding

In ADAL there was only one class, AuthenticationContext, which you used to request tokens. In MSAL you have PublicClientApplication (which you use for standard user authentication) and ConfidentialClientApplication, which gets used for AppOnly tokens and the On-Behalf-Of flow.

Endpoints 

With the v2 Endpoint you have the option of allowing
  1. common
  2. organizations
  3. consumers
  4. Tenant specific (Guid or Name)
For EWS you generally always want to use the tenant-specific endpoint, which means it's best to either dynamically get the TenantId for the tenant you're targeting or hard code it. Eg you can get the TenantId needed with 3 lines of C#


string domainName = "datarumble.com";
HttpClient Client = new HttpClient();
var TenantId = ((dynamic)JsonConvert.DeserializeObject(Client.GetAsync("https://login.microsoftonline.com/" + domainName + "/v2.0/.well-known/openid-configuration")
     .Result.Content.ReadAsStringAsync().Result))
    .authorization_endpoint.ToString().Split('/')[3];
In PowerShell you can do it with

$TenantId = (Invoke-WebRequest https://login.windows.net/datarumble.com/v2.0/.well-known/openid-configuration | ConvertFrom-Json).token_endpoint.Split('/')[3]

Delegate Authentication in EWS with MSAL and the EWS Managed API

This is generally the most common way of using EWS, where you're authenticating as a standard user and then accessing a Mailbox. If it's a shared Mailbox then access will need to have been granted via Add-MailboxFolderPermission, or you can use EWS Impersonation.
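
For reference, a hedged example of granting that folder-level access with Exchange Online PowerShell (the mailbox names and rights here are just placeholders):

# Grant a user Reviewer rights on the Inbox of a shared mailbox so delegate EWS access to that folder will work
Add-MailboxFolderPermission -Identity "shared@datarumble.com:\Inbox" -User "gscales@datarumble.com" -AccessRights Reviewer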

This is the simplest C# example of an auth using the MSAL library in a console app to log on as the currently logged-on user.

 string MailboxName = "gscales@datarumble.com";
 string scope = "https://outlook.office.com/EWS.AccessAsUser.All";
 string redirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth";
 string domainName = "datarumble.com";

 HttpClient Client = new HttpClient();
 var TenantId = ((dynamic)JsonConvert.DeserializeObject(Client.GetAsync("https://login.microsoftonline.com/" + domainName + "/v2.0/.well-known/openid-configuration")
  .Result.Content.ReadAsStringAsync().Result))
  .authorization_endpoint.ToString().Split('/')[3];

 PublicClientApplicationBuilder pcaConfig = PublicClientApplicationBuilder.Create("9d5d77a6-fe09-473e-8931-958f15f1a96b")      
  .WithTenantId(TenantId);
   
 pcaConfig.WithRedirectUri(redirectUri);
 var TokenResult = pcaConfig.Build().AcquireTokenInteractive(new[] { scope })
  .WithPrompt(Prompt.Never)
  .WithLoginHint(MailboxName).ExecuteAsync().Result;

 ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2016);
 service.Url = new Uri("https://outlook.office365.com/ews/exchange.asmx");
 service.Credentials = new OAuthCredentials(TokenResult.AccessToken);
 service.HttpHeaders.Add("X-AnchorMailbox", MailboxName);

 Folder Inbox = Folder.Bind(service, WellKnownFolderName.Inbox);


AppOnly Tokens

This is where your application is authenticating using an App Secret or SSL certificate; after this your app will get full access to all Mailboxes in a tenant (it's important to note that the scoping feature https://docs.microsoft.com/en-us/graph/auth-limit-mailbox-access doesn't work with EWS, so for that you need to be using the Graph or Outlook API).

 string clientId = "9d5d77a6-fe09-473e-8931-958f15f1a96b";
 string clientSecret = "xxxx";
 string mailboxName = "gscales@datarumble.com";
 string redirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth";
 string domainName = "datarumble.com";
 string scope = "https://outlook.office365.com/.default";

 HttpClient Client = new HttpClient();
 var TenantId = ((dynamic)JsonConvert.DeserializeObject(Client.GetAsync("https://login.microsoftonline.com/" + domainName + "/v2.0/.well-known/openid-configuration")
  .Result.Content.ReadAsStringAsync().Result))
  .authorization_endpoint.ToString().Split('/')[3];

 IConfidentialClientApplication app = ConfidentialClientApplicationBuilder.Create(clientId)
  .WithClientSecret(clientSecret)
  .WithTenantId(TenantId)
  .WithRedirectUri(redirectUri)
  .Build();

  
 var TokenResult = app.AcquireTokenForClient(new[] { scope }).ExecuteAsync().Result;
 ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2016);
 service.Url = new Uri("https://outlook.office365.com/ews/exchange.asmx");
 service.Credentials = new OAuthCredentials(TokenResult.AccessToken);
 service.HttpHeaders.Add("X-AnchorMailbox", mailboxName);
 service.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, mailboxName);
 Folder Inbox = Folder.Bind(service, new FolderId(WellKnownFolderName.Inbox, mailboxName));


Token Refresh

One of the big things missing in the EWS Managed API is a callback before each request that checks for an expired access token. Because tokens are only valid for one hour, if you have a long-running process like a migration/export or data analysis then you need to make sure that you have some provision in your code to track the expiry of the access token and refresh the token when needed.
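
One rough approach, shown here in PowerShell to match the examples in the next section, is to check the token expiry before each batch of requests and silently renew it when it is close. This is a sketch only: it assumes $pca is the built PublicClientApplication (eg $pcaConfig.Build()), $service is your ExchangeService and the MSAL in-memory token cache is still available in the session.

function Update-TokenIfNeeded {
    param($pca, $service, [string[]]$Scopes, $TokenResult)
    # Renew when less than 10 minutes of the token's lifetime remains
    if ($TokenResult.ExpiresOn.UtcDateTime -lt (Get-Date).ToUniversalTime().AddMinutes(10)) {
        $account = $pca.GetAccountsAsync().Result | Select-Object -First 1
        $TokenResult = $pca.AcquireTokenSilent($Scopes, $account).ExecuteAsync().Result
        $service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.OAuthCredentials($TokenResult.AccessToken)
    }
    return $TokenResult
}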

Doing this in PowerShell

If you're using PowerShell you can use the same code as above as long as you import the MSAL library dll into your session, for example
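
(the path below is just an assumption about where you've put the Microsoft.Identity.Client NuGet package contents)

# Load the MSAL assembly into the current PowerShell session
Import-Module "C:\MSAL\Microsoft.Identity.Client.dll"
# Add-Type -Path "C:\MSAL\Microsoft.Identity.Client.dll" works just as well if you prefer Add-Type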

Some simple auth examples for this would be

 Delegate Authentication 


$MailboxName = "gscales@datarumble.com";
$ClientId = "9d5d77a6-fe09-473e-8931-958f15f1a96b"
$scope = "https://outlook.office.com/EWS.AccessAsUser.All";
$redirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth";
$domainName = "datarumble.com";
$Scopes = New-Object System.Collections.Generic.List[string]
$Scopes.Add($Scope)
$TenantId = (Invoke-WebRequest https://login.windows.net/datarumble.com/v2.0/.well-known/openid-configuration | ConvertFrom-Json).token_endpoint.Split('/')[3]
$pcaConfig = [Microsoft.Identity.Client.PublicClientApplicationBuilder]::Create($ClientId).WithTenantId($TenantId).WithRedirectUri($redirectUri)
$TokenResult = $pcaConfig.Build().AcquireTokenInteractive($Scopes).WithPrompt([Microsoft.Identity.Client.Prompt]::Never).WithLoginHint($MailboxName).ExecuteAsync().Result;

AppOnly Token


$ClientId = "9d5d77a6-fe09-473e-8931-958f15f1a96b"
$MailboxName = "gscales@datarumble.com"
$RedirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth"
$ClientSecret = "xxx";
$Scope = "https://outlook.office365.com/.default"
$TenantId = (Invoke-WebRequest https://login.windows.net/datarumble.com/v2.0/.well-known/openid-configuration | ConvertFrom-Json).token_endpoint.Split('/')[3]
$app =  [Microsoft.Identity.Client.ConfidentialClientApplicationBuilder]::Create($ClientId).WithClientSecret($ClientSecret).WithTenantId($TenantId).WithRedirectUri($RedirectUri).Build()
$Scopes = New-Object System.Collections.Generic.List[string]
$Scopes.Add($Scope)
$TokenResult = $app.AcquireTokenForClient($Scopes).ExecuteAsync().Result;





Friday, September 20, 2019

Using a System-assigned managed identity in an Azure VM with an Azure Key Vault to secure an AppOnly Certificate in a Microsoft Graph or EWS PowerShell Script

One common and long-standing security issue around automation is the physical storage of the credentials your script needs to get whatever task you're trying to automate done. The worst thing you can do from a security point of view is store them as plain text in the script (and there are still plenty of people out there doing this); a better option is to do some encryption (making sure you use the Windows Data Protection API) eg https://practical365.com/blog/saving-credentials-for-office-365-powershell-scripts-and-scheduled-tasks/ . Azure also offers some better options with the ability to secure the credentials and certificates in Runbooks, so it is just a few clicks in the GUI and some simple code to secure your credentials when using a Runbook.

In this post I'm going to look at the issues around storing and accessing SSL certificates associated with App only token authentication that you might be looking to use in automation scripts. This is more for when you can't take advantage of Azure Runbooks and need the flexibility of a VM.

In Exchange (and Exchange Online) EWS Impersonation has been around for quite some time; it offers the ability to have a set of credentials that can impersonate any Mailbox owner. With the Microsoft Graph you have App Only tokens, which offer a similar type of access with the additional functionality to limit the scope greatly to certain mailboxes and item types. With App Only tokens you don't have a username/password combination but an SSL certificate or application secret (the latter should be avoided in production). So instead of the concern around the physical security of the username and password combination, your concern now is around the security of the underlying certificate.

One of the most important points is that the time to start thinking about the security of the certificate is before you generate it. Eg just having a developer or Ops person generate it on their workstation, leaving copies of the certificate file anywhere, is the equivalent of the post-it note on the monitor. This is where Azure Key Vault (or AWS KMS) can be used to secure both the creation of the certificate and provide the ongoing storage and, importantly, access control and auditing. So from the point of creation of the AppOnly cert you should be able to have an audit trail of who created it and who accessed it. The other advantage of having the cert in a Key Vault is that it also makes it easy for you to have a short expiry on the certificate and automate the renewal process, which in turn makes your auth process more secure.

Once the authentication certificate is stored in the Key Vault, the weakest link can be the authentication method you then use to access the Key Vault. Eg a user account can be granted rights to the Key Vault (which then makes that user account the weakest link), or you could use another application secret or SSL certificate to secure access to the data plane of the Key Vault. At some point all of these become self-defeating from a security point of view as you're still storing a credential (especially if you then store that as plain text), and for a persistent threat actor this still leaves you exposed.

One way of getting rid of the storage of credentials is the use of Managed identities, where ("in my mind at least") you're trusting the infrastructure where the code is running. The simplest example would be: say you create an Azure compute function and give that function a Managed identity; you can then grant that identity access to the KeyVault and your function can now access the certificate from the KeyVault, authenticate to Azure and access every Mailbox in your tenant. So now you have placed the trust in the code in the function and the underlying security of the function (eg can the code be exploited, could somebody hack the deployment method being used and replace the code with their own etc). With a System-assigned managed identity in an Azure VM you're doing the same thing, but this time the point of trust is the Azure Virtual Machine. So now it's down to the physical security measures around the Azure VM, which becomes the weakest link. Still not infallible, but your security options around securing a VM are many, so with good planning and practice you should be able to get a balance between flexibility and security.
So let's look at a simple implementation of using a System-assigned managed identity in an Azure VM in a PowerShell script to get an SSL certificate from a KeyVault and then access the Microsoft Graph using an AppOnly token generated using that certificate.

  1. The first thing you need is an Azure Key Vault where you have enabled auditing https://docs.microsoft.com/en-us/azure/key-vault/key-vault-logging
  2. You need to create an application registration in Azure AD for the app that will be using the SSL cert from the Key Vault to generate an AppOnly token, and then grant it the Application permissions for the task you want it to perform. https://docs.microsoft.com/en-us/graph/auth-register-app-v2
  3. Make sure you then consent to the above application registration so it can be used in your tenant (the portal now makes this very straightforward)
  4. You need an Azure VM where you have enabled a System-assigned managed identity https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/qs-configure-portal-windows-vm
  5. Once your Azure VM has a System-assigned managed identity you should be able to grant that identity secret permissions so it can access the SSL certificate we are going to store in the KeyVault eg 
  6. The next step is to create the self-signed certificate in the Azure Key Vault using the Azure Portal
  7. At this step you should now be able to access the self-signed certificate from the Key Vault on your VM with some simple PowerShell code. All you will need is the URL of the Key Vault Secret Identifier eg

Then some code like

$KeyVaultURL = "https://xxx.vault.azure.net/secrets/App1AuthCert/xxx99c5d054f43698f39c51f24440xxx?api-version=7.0"
$SptokenResult = Invoke-WebRequest -Uri 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net' -Headers @{Metadata="true"}
$Sptoken = ConvertFrom-Json $SptokenResult.Content
$headers = @{
'Content-Type'  = 'application/json'
'Authorization' = 'Bearer ' + $Sptoken.access_token    
}

$Response = (Invoke-WebRequest -Uri $KeyVaultURL  -Headers $headers) 
$certResponse = ConvertFrom-Json $Response.Content
With the above example you’ll see the following hard-coded URI

http://169.254.169.254/metadata/identity/oauth2/token

This isn’t something you need to change as 169.254.169.254 is

The endpoint is available at a well-known non-routable IP address (169.254.169.254) that can be accessed only from within the VM
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/instance-metadata-service

So the above script uses this local Metadata Service to acquire the access token to access the Azure KeyVault (as the System-assigned managed identity). Once you have the certificate raw data from the KeyVault you can then load it into a typed certificate object eg

$base64Value = $certResponse.value
$Certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$Certificate.Import([System.Convert]::FromBase64String($base64Value))
The last step is that the certificate must be uploaded to the application registration created in step 2, or added to the application manifest either manually or programmatically. Eg the following is an example that produces the required manifest format described here https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-certificate-credentials

   
    $SptokenResult = Invoke-WebRequest -Uri 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net' -Headers @{Metadata="true"}
    $Sptoken = ConvertFrom-Json $SptokenResult.Content
    $KeyVaultURL = "https://gspskeys.vault.azure.net/secrets/App1AuthCert/xxx99c5d054f43698f39c51f24440xxx?api-version=7.0"
    $headers = @{
        'Content-Type'  = 'application/json'
        'Authorization' = 'Bearer ' + $Sptoken.access_token    
    }
    $Response = (Invoke-WebRequest -Uri $KeyVaultURL  -Headers $headers) 
    $certResponse = ConvertFrom-Json $Response.Content
    $base64Value = $certResponse.value
    $Certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
    $Certificate.Import([System.Convert]::FromBase64String($base64Value))
    $bin = $Certificate.GetCertHash()
    $base64Thumbprint = [System.Convert]::ToBase64String($bin)
    $keyid = [System.Guid]::NewGuid().ToString()
    $jsonObj = @{ customKeyIdentifier = $base64Thumbprint; keyId = $keyid; type = "AsymmetricX509Cert"; usage = "Verify"; value = $base64Value }
    $keyCredentials = ConvertTo-Json @($jsonObj) | Out-File "c:\temp\tmp.key"

This puts the certificate data into a file temporarily, which isn't great; you can actually use the Graph API to create an app registration and add the cert data directly, which means the cert data never needs to be exported/imported to a file.
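
As a rough sketch of that direct approach (assuming $AppObjectId is the object id of an existing app registration and $GraphToken is an access token with Application.ReadWrite.All), a PATCH against the Graph applications endpoint can set the keyCredentials without the certificate ever touching disk:

# Note: PATCHing keyCredentials replaces the whole collection, so include any existing certs you want to keep
$body = @{
    keyCredentials = @(@{
        type  = "AsymmetricX509Cert"
        usage = "Verify"
        key   = $base64Value
    })
} | ConvertTo-Json -Depth 4
Invoke-RestMethod -Method Patch -Uri ("https://graph.microsoft.com/v1.0/applications/" + $AppObjectId) -Headers @{ Authorization = "Bearer " + $GraphToken } -ContentType "application/json" -Body $body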

Authenticating with the SSL Certificate you retrieved from the KeyVault

Once you have the certificate loaded you can then use the ADAL library to perform the authentication and get the AppOnly access token, which you can use with either the Microsoft Graph or EWS eg

$ClientId = "12d09d34-c3a3-49fc-bdf7-e059801801ae"
$MailboxName = "gscales@datarumble.com"  
Import-Module .\Microsoft.IdentityModel.Clients.ActiveDirectory.dll -Force
$TenantId = (Invoke-WebRequest -Uri ('https://login.windows.net/' + $MailboxName.Split('@')[1] + '/.well-known/openid-configuration') | ConvertFrom-Json).authorization_endpoint.Split('/')[3]

The ClientId is the ClientId from the application registration in step 2. The following does the authentication using the configuration information from above and then makes a simple Graph request that uses the App Only access token that is returned.
$Context = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext("https://login.microsoftonline.com/" + $TenantId)
$clientCredential = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.ClientAssertionCertificate($ClientId,$Certificate)
$token = ($Context.AcquireTokenAsync("https://graph.microsoft.com", $clientCredential).Result)
$Header = @{
   'Content-Type'  = 'application/json'
   'Authorization' = $token.CreateAuthorizationHeader()
}
$UserResult = (Invoke-RestMethod -Headers $Header -Uri ("https://graph.microsoft.com/v1.0/users?`$filter=mail eq '" + $MailboxName + "'&`$Select=displayName,businessPhones,mobilePhone,mail,jobTitle,companyName") -Method Get -ContentType "Application/json").value
return $UserResult

A Gist of the full script can be found here

Wednesday, August 07, 2019

Email Header IpAddress GeoIP report Addin for Outlook and Outlook on the Web in Office365

Something that can be useful from time to time when looking at email delivery issues or email threats is being able to see the geographical regions that an email has traversed in its delivery. Usually this information gets stored in the Received headers of the email, but depending on the client and services being used the source IP address of the client and other intermediaries may get written to other properties.

Because I needed something last week to do this and couldn't find any other addins to do it, I created a pretty simple Outlook addin that

  • Gets the headers from a Message using the REST API in Office365
  • Uses a RegEx to get all the IPAddresses from that header
  • Uses a Set in JavaScript to then de-duplicate these IPAddresses
  • Then uses one of the many free GeoIP web services out there to query each of the returned IPAddresses from the RegEx matches and finally displays the result in a table back in Outlook (a rough PowerShell equivalent of this pipeline is sketched below)
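
The add-in itself is JavaScript, but a rough PowerShell equivalent of the same pipeline looks like this ($headerText and the ipapi.co endpoint are just assumptions for illustration; the add-in uses whichever free GeoIP service you wire it up to):

# Pull every IPv4 address out of the header text, de-duplicate, then geo-locate each one
$ipRegex = '\b(?:\d{1,3}\.){3}\d{1,3}\b'
$ipAddresses = [regex]::Matches($headerText, $ipRegex) | ForEach-Object { $_.Value } | Sort-Object -Unique
foreach ($ip in $ipAddresses) {
    Invoke-RestMethod -Uri ("https://ipapi.co/" + $ip + "/json/") | Select-Object ip, city, country_name, org
}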
For example, here is what it returns when run against a normal gray email that was forwarded to my Office365 Mailbox from Gmail

For a quick diagnostic this information is pretty useful, as it tells you where the email traversed, and the Org information generally will tell you the cloud providers being used. Doing this on emails in my junk email folder that were of a nefarious nature showed that these emails had transited through countries that are the usual suspects in this type of activity.

I've hosted the code up on GitHub here https://github.com/gscales/gscales.github.io/tree/master/MailGeoRoutes so the addin can be added straight from the repo if you want to try it (using My Add-ins - Custom Add-ins) and

https://gscales.github.io/MailGeoRoutes/MailGeoRoutes.xml

This uses REST to get the header so it will only work with Office365, but the same thing could also be done using EWS for on-prem servers from Exchange 2013 onward.


Wednesday, July 24, 2019

How to enable Dark mode in Outlook on Web in Office365 with EWS and PowerShell

Last year at Ignite Microsoft announced Dark mode for Outlook on the Web; while this seemed to excite a lot of people I never really caught the buzz. However, after taking the plunge this week after being nagged by Outlook's notifications, I've found it to be a nice addition, especially if your eyes aren't 100%.

When you enable Dark mode using the slider in Outlook on the Web

 
This changes/creates a setting called "isDarkModeTheme" in the OWA.UserOptions User Configuration object, which is held in the FAI collection (Folder Associated Items) in the Non_IPM_Root of the Mailbox. If you want to enable this setting for a user (or users) programmatically, or just want to take stock of who is using this, then you can use EWS to read and set the value in the OWA.UserOptions User Configuration object in a Mailbox. (If you want to do this in the Microsoft Graph you will need to cry into your beer at the moment, because the Microsoft Graph still doesn't support either user configuration objects or accessing FAI Items 😭😭😭). 

The code to enable dark mode is pretty easy: first you need the FolderId for the Non_IPM_Root folder of the Mailbox you want to work with, then bind to the UserConfiguration object, which will return the Dictionary from the underlying PR_ROAMING_DICTIONARY property. If Dark mode hasn't been enabled yet then the property shouldn't yet be in the Dictionary, but if it is there it will be set to either True or False depending on whether it's enabled or not. So to change this all we need is some simple code like the following 

        $folderid= new-object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Root,$MailboxName)   
        $UsrConfig = [Microsoft.Exchange.WebServices.Data.UserConfiguration]::Bind($service, "OWA.UserOptions", $folderid, [Microsoft.Exchange.WebServices.Data.UserConfigurationProperties]::All)
        if ($UsrConfig.Dictionary) {
            if($UsrConfig.Dictionary.ContainsKey("isDarkModeTheme")){
                if($Disable.IsPresent){
                    $UsrConfig.Dictionary["isDarkModeTheme"] = $false
                }else{
                    $UsrConfig.Dictionary["isDarkModeTheme"] = $true                    
                }
               
            }else{
                if(!$Disable.IsPresent){
                    $UsrConfig.Dictionary.Add("isDarkModeTheme",$true)
                }
            }
        }
        $UsrConfig.Update()

I've put together a simple script that wraps the above plus OAuth authentication and provides two cmdlets for getting and setting Dark mode for Outlook on the Web for a mailbox. Eg 

To Get the Current Dark Mode setting use

 Get-DarkModeSetting -MailboxName mailbox@domain.com

To Enable Dark Mode use

Set-DarkModeSetting -MailboxName mailbox@domain.com (will return Get-DarkModeSetting after the update)

To disable Dark Mode use

Set-DarkModeSetting -MailboxName mailbox@domain.com -Disable

I've put the script up on GitHub https://github.com/gscales/Powershell-Scripts/blob/master/DarkModeMod.ps1 

Thursday, July 11, 2019

Script to retrieve all the Office365 (Azure) Tenants an account is associated with

The Azure AD business-to-business (B2B) platform is the underlying tech that drives guest access in Microsoft Teams and Office365 Groups as well as other Azure resources. If you're someone who collaborates across a number of different organizations, or potentially different community or even school or university groups, you might find your MSA or Office365 account starts to accumulate Guest access to different tenants. What you see depends on what type of access you're accruing: eg if it's just Microsoft Teams access you will see your guest tenancies when logging on to Teams; another way is to log on to the Azure Portal and hit switch directory, which will give you a list of Azure directories your account has an association with. 

However, an easier way of doing this is using PowerShell along with the ADAL, Azure Resource Management and Graph APIs. I put together the following script that first uses the Azure Resource Management API to make a request that gets all the Tenants associated with your account (same as what you would see if you hit switch directory).


$UserResult = (Invoke-RestMethod -Headers $Header -Uri ("https://management.azure.com/tenants?api-version=2019-03-01&`$includeAllTenantCategories=true") -Method Get -ContentType "Application/json").value

This returns all the Tenants associated with your user and a lot of information about the domains for the tenants you're a guest in eg



The domains information was a little interesting, especially seeing all the other domains people had associated with their tenants where I was a Guest. While domains aren't private information, finding the bulk of the domains a particular organization is associated with isn't that easy (from an outside perspective). For the rest of the script I added some code that would authenticate as a Guest into any tenants my account was associated with, then use the Graph API to try to query any Teams or Office365 Groups this account is associated with, and then try to query for the last 2 conversation items in those Groups/Teams. The script adds information about the Groups to the Tenant object returned for each Tenant the account is associated with, and then the conversations to each of the Groups in the object returned, eg so I can do something like this


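The per-tenant Graph side of that is roughly the following (a sketch only; $GuestToken is assumed to be an access token acquired against the guest tenant, and the full script on GitHub handles the authentication and error cases):

$headers = @{ Authorization = "Bearer " + $GuestToken }
# Get the Office365 Groups/Teams the account is a member of in this tenant
$groups = (Invoke-RestMethod -Headers $headers -Uri "https://graph.microsoft.com/v1.0/me/memberOf/microsoft.graph.group?`$select=id,displayName").value
foreach ($group in $groups) {
    # Try to pull the last 2 conversations for each Group/Team
    $convUri = "https://graph.microsoft.com/v1.0/groups/" + $group.id + "/conversations?`$top=2"
    $group | Add-Member -NotePropertyName Conversations -NotePropertyValue (Invoke-RestMethod -Headers $headers -Uri $convUri).value
}
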
I've put the script up on GitHub at https://github.com/gscales/Powershell-Scripts/blob/master/Get-TenantsForUser.ps1 . This script requires the ADAL dlls for the authentication, which should be located in the same directory the script is being run from.


Thursday, May 30, 2019

Sending a Cloud Voice mail using the Microsoft Graph or EWS from Powershell

Back in February of this year Microsoft announced the retirement of Exchange UM services in favour of Cloud Voicemail, which has been around for a while. Both of these services use your Exchange mailbox to store voicemail messages; while Exchange UM had some nicer UI elements for playing and previewing voicemail messages and some different features, Cloud Voicemail just comes as a message with a standard Mp3 attachment in the Mailbox. If you're using Cloud Voicemail with Microsoft Teams you do get a play control like


Also, the Cloud Voicemail service has an audio transcription service which is reasonably accurate.

Accessing Voice Mails programmatically

Accessing these voice messages is pretty easy; they are all just messages in your mailbox (usually the Inbox, but they may have been moved to other folders) that have a MessageClass of IPM.Note.Microsoft.Voicemail.UM.CA, ref https://docs.microsoft.com/en-us/openspecs/exchange_server_protocols/ms-oxoum/102b3a8b-1aad-4f29-90a3-998262d9fa26 . By default in the Exchange Mailbox there is a WellKnownFolder (in this instance it's a Search Folder) called VoiceMail that can then be used in either the Microsoft Graph or EWS to return these voice mail messages. Eg for the Microsoft Graph your request should look something like

https://graph.microsoft.com/v1.0/users('gscales@datarumble.com')/mailFolders/voicemail/messages
?$expand=SingleValueExtendedProperties(
$filter=Id%20eq%20'Integer%20%7B00020328-0000-0000-C000-000000000046%7D%20Id%200x6801'%20
or%20Id%20eq%20'String%20%7B00020386-0000-0000-C000-000000000046%7D%20Name%20X-VoiceMessageConfidenceLevel'%20
or%20Id%20eq%20'String%20%7B00020386-0000-0000-C000-000000000046%7D%20Name%20X-VoiceMessageTranscription')&
$top=100&$select=Subject,From,Body,IsRead,Id,ReceivedDateTime 



In my request I'm including the following 3 extended properties that the Teams client also needs to have (actually, creating a voice message without these properties will break the voicemail section of the Teams client, which is a bug on the Microsoft side and one which 3rd party migration vendors will need to watch out for). The three properties are, firstly,

PidTagVoiceMessageDuration - https://docs.microsoft.com/en-us/openspecs/exchange_server_protocols/ms-oxoum/970d4c1c-dcc5-44d2-aab3-16e37805b953 which is the length of the voicemail in seconds. A quick PowerShell trick for getting the length of the voicemail from an Mp3 file is to use the Shell Application eg

        # Use the Shell.Application COM object to read the file's media details
        $shell = New-Object -COMObject Shell.Application
        $folder = Split-Path $Mp3FileName
        $file = Split-Path $Mp3FileName -Leaf
        $shellfolder = $shell.Namespace($folder)
        $shellfile = $shellfolder.ParseName($file) 
        # Column 27 is typically the Length (duration) column in Explorer, returned as HH:mm:ss
        $dt = [DateTime]::ParseExact($shellfolder.GetDetailsOf($shellfile, 27), "HH:mm:ss",[System.Globalization.CultureInfo]::InvariantCulture);
        $dt.TimeOfDay.TotalSeconds


The other two properties are Internet header properties, so they could either be returned by requesting all the Internet headers on the message or by using the extended property definitions

X-VoiceMessageTranscription - currently undocumented but seems to contain the transcription of the voice message (The transcription is also included in the Body of the Message)

and

X-VoiceMessageConfidenceLevel - currently undocumented

I've put together a script, posted here, that will return voicemail from a Mailbox including the above information, using the Microsoft Graph along with the ADAL library for authentication. The script is located at https://github.com/gscales/Powershell-Scripts/blob/master/VoiceMailGraph.ps1 eg in action it looks like

Get-VoiceMail -MailboxName e5tmp5@datarumble.com | select @{n="Sender";e={$_.from.emailAddress.address}},Subject,PidTagVoiceMessageDuration,x* |ft

Which will give an output like


To run this type of script requires an App registration with at least Mail.Read, or Mail.Read.Shared if you want to access mailboxes other than your own.
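
Incidentally, if you'd rather do the read side with EWS instead of the Graph, a minimal sketch (assuming $service is an authenticated ExchangeService and the EWS Managed API is loaded) is to bind to the VoiceMail well-known folder and list its items:

$voiceMailFolder = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service, [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::VoiceMail)
$findResults = $voiceMailFolder.FindItems((New-Object Microsoft.Exchange.WebServices.Data.ItemView(100)))
$findResults.Items | Select-Object Subject, DateTimeReceived, ItemClass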


Sending Voice Mails programmatically

To send a voicemail message using the Microsoft Graph or EWS is just a matter of sending a Message, setting the ItemClass to IPM.Note.Microsoft.Voicemail.UM.CA, attaching the MP3 file and then setting the 3 properties I mentioned above, as well as an additional property, PidTagVoiceMessageAttachmentOrder. Eg the following is an example of a Microsoft Graph request to do this.



POST https://graph.microsoft.com/v1.0/users('gscales@datarumble.com')/sendmail HTTP/1.1

{
  "Message" : {
"Subject": "Voice Mail (27 seconds)"
,"Sender":{
 "EmailAddress":{
  "Name":"gscales@datarumble.com",
  "Address":"gscales@datarumble.com"
}}
,"Body": {
"ContentType": "HTML",
"Content": "<html><head>...</html>"
}
,"ToRecipients": [ 
      { 
 "EmailAddress":{
  "Name":"e5tmp5@datarumble.com",
  "Address":"e5tmp5@datarumble.com"
}}
  ]
,  "Attachments": [ 
    {
     "@odata.type": "#Microsoft.OutlookServices.FileAttachment",
     "Name": "audio.mp3",
     "ContentBytes": "..."
    } 
  ]
,"SingleValueExtendedProperties": [
{
"id":"String 0x001A", 
"Value":"IPM.Note.Microsoft.Voicemail.UM.CA"
 } 
,{
"id":"Integer {00020328-0000-0000-c000-000000000046} Id 0x6801", 
"Value":"27"
 } 
,{
"id":"String {00020386-0000-0000-C000-000000000046} Name X-VoiceMessageConfidenceLevel", 
"Value":"high"
 } 
,{
"id":"String {00020386-0000-0000-C000-000000000046} Name X-VoiceMessageTranscription", 
"Value":"one two three"
 } 
,{
"id":"String 0x6805", 
"Value":"audio.mp3"
 }  
]
}   ,"SaveToSentItems": "true"
}

Depending on where you got the Mp3 you're sending as a voice message, you will need to do the transcription yourself, eg Azure Cognitive Services can be used to do this https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-to-text . Or, if you want to generate an MP3 from some text using PowerShell, you can use this cool script that Adam Bertram posted https://mcpmag.com/articles/2019/05/23/text-to-speech-azure-cognitive-services.aspx . I've created a script that you can use to send a voicemail using the Microsoft Graph API, using the ADAL library for authentication https://github.com/gscales/Powershell-Scripts/blob/master/VoiceMailGraph.ps1 . To use this try something like the following

Send-VoiceMail -MailboxName gscales@datarumble.com -ToAddress e5tmp5@datarumble.com -Mp3FileName C:\temp\SampleAudio_0.4mb.mp3 -Transcription "one two three"

When sending a voicemail it includes the following HTML table in the Message, which contains the sending user's personal information eg

To fill out this information, the script makes another Graph call to look at the user's information based on the email address (technically you could use /me if you're never going to send as another user). So the script also makes the following Graph query to get this information

https://graph.microsoft.com/v1.0/users?$filter=mail%20eq%20'gscales@datarumble.com'&$Select=displayName,businessPhones,mobilePhone,mail,jobTitle,companyName

To send a Message requires an App Registration with Mail.Send, as well as User.Read to be able to read the directory properties required to fill out the Body of the Message with the sending user's phone, title and company information.

Doing the same in EWS

You can do the same as above using EWS; instead of the user detail query, the same information can be retrieved using the ResolveName operation in EWS. I've put a sample module for getting and sending voicemail using EWS on GitHub here https://github.com/gscales/Powershell-Scripts/blob/master/VoiceMailEWS.ps1
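
For example, a hedged sketch of the ResolveName call (assuming $service is an authenticated ExchangeService) that pulls the same directory details EWS-side:

# Resolve the SMTP address against the directory and return the contact details
$resolved = $service.ResolveName("smtp:gscales@datarumble.com", [Microsoft.Exchange.WebServices.Data.ResolveNameSearchLocation]::DirectoryOnly, $true)
$resolved | ForEach-Object { $_.Contact | Select-Object DisplayName, CompanyName, JobTitle }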