Monday, December 09, 2019

Update to ExchangeContacts Module to support Modern Auth, exporting all Contacts to a VCF file (or CSV), NON_IPM root folder, hidden contact folders and dumpster exports

I've done some updating of my ExchangeContacts PowerShell module to support the following

  1. Modern Authentication in Office365 (distributing the ADAL dll with this module)
  2. Compiled and distributed the latest version of the EWS Managed API from GitHub with the module
  3. New cmdlet Export-EXCContacts that supports exporting all contacts in a Folder to a single VCF File
  4. Export-EXCContacts also supports exporting all contacts to a CSV file (this was already possible with the ExportFolder cmdlet but this is a slightly enhanced format)
  5. New cmdlet Export-EXCRootContacts lets you export the NON_IPM_Subtree folders that contain contacts (some of these are created by the Office365 substrate process), for example the mycontacts, AllContacts and ContactSearch folders. This cmdlet includes dedup code based on the contact's email address
  6. This was already supported, but I wanted to show how you can export the hidden contact folders like Recipient Cache, GAL Contacts and Organizational Contacts
  7. New cmdlet Get-EXCDumpsterContacts gets the contacts that are in the RecoverableItems Deletions or Purges folder
  8. New cmdlet Export-EXCDumpsterContacts exports the contacts that are in the RecoverableItems Deletions or Purges folder to a single VCF or CSV file

Using Modern Authentication

As Basic Authentication in EWS is going away soon in Office365, I've enabled Modern Auth for this module using the ADAL dll, which gets distributed via the bin directory in the module. I didn't enable it by default because it would cause issues with OnPrem Exchange, so to use Modern Auth you just need to use the -ModernAuth switch. You can still pass in a PSCredential object with the -ModernAuth switch and OAuth will still be used via the username and password grant to allow for silent auth. There is also provision to pass in your own client id for custom app registrations with the -ClientId parameter. A simple example of using ModernAuth is



Get-EXCContact -MailboxName gscales@datarumble.com -EmailAddress what@what.com -ModernAuth
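
You can combine this with a custom app registration and pre-supplied credentials for silent auth; a minimal sketch (the client id is a placeholder, and the -Credentials parameter name is an assumption based on the module's other cmdlets, so check Get-Help for the cmdlet you are using):

# Sketch only: replace the ClientId with your own app registration's id;
# the -Credentials parameter name is assumed rather than confirmed
$creds = Get-Credential
Get-EXCContact -MailboxName gscales@datarumble.com -EmailAddress what@what.com -ModernAuth -ClientId "11111111-2222-3333-4444-555555555555" -Credentials $creds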
Export-EXCContacts

Export-EXCContacts supports exporting all the contacts from any folder in a Mailbox to a single VCF file or a CSV file. (EWS provides the VCF natively for Mailbox contacts, so this cmdlet handles streaming them out to a single file.) Here are some examples.

Exporting to a single VCF

Export-EXCContacts -Folder "\Contacts" -MailboxName gscales@datarumble.com -ModernAuth -FileName c:\temp\exp.vcf
or to a CSV


Export-EXCContacts -Folder "\Contacts" -MailboxName gscales@datarumble.com -ModernAuth -FileName c:\temp\exp.csv -ExportAsCSV
Export-EXCRootContacts

Export-EXCRootContacts supports exporting contacts from the NON_IPM_Subtree folders in a Mailbox. Typically the folders here are created by either a client like Outlook or OWA, other Office365 substrate processes (eg Microsoft Teams) or third party apps that want the data to be hidden from the user. Examples of these folders are AllContacts, mycontacts etc. I've added this more for educational and diag purposes and included some dedup code to deduplicate exports based on the EmailAddress. An example of exporting the AllContacts folder to a CSV file:


Export-EXCRootContacts -MailboxName gscales@datarumble.com -FolderName AllContacts -FileName c:\temp\allContacts.csv -ExportAsCSV -ModernAuth

Get-EXCDumpsterContacts


This cmdlet will query either the RecoverableItemsDeletions or RecoverableItemsPurges folder in a Mailbox (the Dumpster v2 folders), get any contacts that exist in these folders and return them as EWSContact objects (you can then process them further, eg copy, move etc).

eg for the Purges folder


Get-EXCDumpsterContacts -MailboxName gscales@datarumble.com -ModernAuth -Purges
or the Deletions folder (the default)




Get-EXCDumpsterContacts -MailboxName gscales@datarumble.com -ModernAuth 
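
Because the cmdlet returns the contacts as EWS objects, you can act on them directly; for example, a minimal sketch (assuming the returned objects are EWS Managed API Microsoft.Exchange.WebServices.Data.Contact items) that copies any recovered contacts back into the Contacts folder:

# Sketch only: assumes Get-EXCDumpsterContacts returns EWS Managed API Contact objects
$recovered = Get-EXCDumpsterContacts -MailboxName gscales@datarumble.com -ModernAuth
foreach ($contact in $recovered) {
    # Copy each recovered contact back into the default Contacts folder
    $contact.Copy([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Contacts) | Out-Null
}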
Export-EXCDumpsterContacts

This cmdlet builds on Get-EXCDumpsterContacts and allows you to export what is returned to either a single VCF file or a CSV file. (same logic as Export-EXCContacts)



Export-EXCDumpsterContacts -MailboxName gscales@datarumble.com -ExportAsCSV -FileName c:\temp\dumpsterDeletions.csv
or the Purges folder


Export-EXCDumpsterContacts -MailboxName gscales@datarumble.com -Purges -ExportAsCSV -FileName c:\temp\dumpsterPurges.csv


Exporting hidden Contacts folders

One last thing I wanted to demonstrate with this module is the ability to export the hidden contact folders in your mailbox. If you have ever peeked at the Contacts folder subfolder hierarchy in a MAPI editor like MFCmapi, there are a number of hidden folders, eg


Folders like Recipient Cache, GAL Contacts and Organizational Contacts all serve different client specific tasks (that do go wrong sometimes). So you can use this module to export the contacts in these folders to a CSV for any troubleshooting, migration or personal interest needs.

Here are some examples of exporting contacts from those folders to a csv file


Export-EXCContacts -Folder "\Contacts\Organizational Contacts" -MailboxName gscales@datarumble.com -ModernAuth -FileName c:\temp\exp.csv -ExportAsCSV

The new module can be found on the PowerShell Gallery at https://www.powershellgallery.com/packages/ExchangeContacts/1.6.0.0 and the source is available on GitHub at https://github.com/gscales/Powershell-Scripts/tree/master/EWSContacts/Module

Thursday, November 28, 2019

Creating a year at a glance Calendar (in Excel) from aggregated Shared Calendars in Exchange Online within a Microsoft Teams Tab app using the Microsoft Graph

Calendaring formats in messaging clients tend to all follow much the same approach, whether it's Outlook, Outlook on the Web or mobile, Gmail or Microsoft Teams. Like email data, calendar data can be random and complex in its volume and type (recurring appointments etc), so a simple year at a glance calendar is hard for someone designing a mass market client to do well for all the data types and volumes that could be encountered. Therefore it's not something you see in mail clients by default (let's face it, who wants to support that).

In the absence of year at a glance calendars I was surprised to see people using Excel to create yearly aggregated calendars in Microsoft Teams for events (for data that already existed in shared calendars). More surprisingly, it actually worked quite well when there wasn't a lot of data that needed to be shown. The thing that sprang to my mind was that if you could automate this, it would be really good for people who use the Birthday calendar feature in Outlook, simple company events calendars and also public holiday calendars, especially when you want to aggregate multiple countries' public holidays into a simple spreadsheet (to help people like me who work across multiple regions) and then share that within a team.

So I set out to build a simple Microsoft Teams Tab application that could create an aggregated spreadsheet of events from any calendar (or calendars) in an Office365 Mailbox that was shared to a particular Microsoft Teams (Group), using the Graph API to get the calendar data from the Mailboxes and also using the Graph API's workbook functionality to build the Excel workbook. The result is then stored in a OneDrive file and presented back to the user in an iFrame as an embedded Excel Online spreadsheet. The end result looks something like this (this is the result of having a Shared Mailbox with the Holiday calendars added/imported for Australia, the US and the UK, and that Mailbox being shared to the Group/Team):


How it works

Like the other Teams Tab apps I've written, it takes advantage of the Teams tab silent Auth method documented here. Once the code has acquired an Access Token to access the Graph it can get to work.

Configuration 

For this application to work I needed to be able to store the configuration of the calendars I wanted to aggregate. As the app is written in JS, the easiest form of config file was a straight JSON file like the following:
{
    "Calendars": [
        {
            "CalendarEmailAddress": "mb1@datarumble.com",
            "CalendarName": "Australia holidays",
            "CalendarDisplayName": "Australia"
        },
        {
            "CalendarEmailAddress": "mb1@datarumble.com",
            "CalendarName": "United States holidays",
            "CalendarDisplayName": "United States"
        }
    ]
}


I then just required a way of storing and retrieving the file (a todo would be to create a nice form to allow people to create and edit the config, but if I had time ...). The Teams client SDK (and tab apps) don't have any provision for storing custom configuration, properties or pretty much anything configuration related, so I just went for putting the file in the Channel document library as a starting point. Next I needed some Graph code to grab the contents of that file. In JS the easiest way I found to do this was as follows.

From the Teams Context interface you can get the GroupId and ChannelName where your tab is executing, so you can then construct the following URL to use in a GET against the MS Graph:

"https://graph.microsoft.com/v1.0/groups/" + GroupId + "/drive/root:/" + channelName + "/ExcelCalendarConfig.json"

The Graph documentation points to using the /content endpoint to download the contents of a file. I have used this before in .NET (and node.js) and it works okay; it returns a 302 response with a Location header that can be followed to the SharePoint site. In client side JS it's a lot messier, so I found it easier to do this:

// Get the DriveItem metadata for the config file (GenericGraphGet wraps an authenticated GET against the Graph)
CCDriveItem = await GenericGraphGet(Token, CalendarConfigURL);
// Fetch the file content directly via the pre-authenticated download URL
var CCFetch = await fetch(CCDriveItem["@microsoft.graph.downloadUrl"]);

The @microsoft.graph.downloadUrl is a short-lived URL for the file that doesn't need authentication, so it's easy to just do a GET for the DriveItem and then fetch that URL to return the JSON back to the code, and I don't have to wade through a bunch of URL-following and CORS issues with ajax and fetch.
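
If you want to do the same thing outside the tab app, eg from a script, the equivalent pattern in PowerShell looks roughly like this (a sketch; the token, group id and channel name are placeholders you supply):

# Sketch only: $AccessToken, $GroupId and $ChannelName are placeholders
$headers = @{ Authorization = "Bearer $AccessToken" }
$itemUrl = "https://graph.microsoft.com/v1.0/groups/$GroupId/drive/root:/$ChannelName/ExcelCalendarConfig.json"
# First get the DriveItem metadata, which includes the pre-authenticated download URL
$driveItem = Invoke-RestMethod -Uri $itemUrl -Headers $headers
# Then download the config JSON directly; no auth header is needed on this URL
$config = Invoke-RestMethod -Uri $driveItem.'@microsoft.graph.downloadUrl'
$config.Calendars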

Template

One of the things that the Graph API can't do is create a new Excel file from scratch, so you either have to have an existing file to create a session with, or use one of the libraries people recommend for creating the file. An easy solution for me was to create a blank Excel file with no metadata and include it with the web files, so I could just copy it to OneDrive as a template file (overwriting any existing older file that may have been there) and then use that.

Storing the Result File 

One other problem for this project was where to store the end result file. At first I just used the SharePoint library associated with the Teams Channel, but there were problems around the file becoming locked if two people ran it simultaneously. I also wanted to be able to run this with the least amount of permissions possible, so the user's App Folder (for this Tab app) seemed like the best spot as a starting point, which is what the following code handles.


let AppDrive = await GenericGraphGet(Token,"https://graph.microsoft.com/v1.0/me/drive/special/approot");
let FileData = await ReadTemplate();
var fileName = "Calendars.xlsx";
var UploadURL = "https://graph.microsoft.com/v1.0/me/drive/special/approot:/" + fileName + ":/content";
let NewFile = await CreateOneDriveFile(Token,UploadURL,FileData);    

Getting the Calendars

Getting the calendars was probably the easiest task: from the config file, the CalendarName property is used to find the folder in the Mailbox you want to access the data from. The query of the calendar is then done for the current year's data using a CalendarView (which will expand any recurring calendar appointments). To aggregate the calendar data that was retrieved into orderable lists I used multiple Map objects in JS, loop iterations and arrays, so I end up with an ordered list of events aggregated first by month and then by day within the month.
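
For reference, the underlying Graph calls look roughly like this in PowerShell (a sketch; the token, mailbox and calendar name are placeholders, and I'm assuming permissions that allow reading the shared calendar):

# Sketch only: $AccessToken is a placeholder; the mailbox/calendar match the sample config above
$headers = @{ Authorization = "Bearer $AccessToken" }
$mailbox = "mb1@datarumble.com"
# Find the calendar folder whose name matches the config entry
$calendars = Invoke-RestMethod -Headers $headers -Uri "https://graph.microsoft.com/v1.0/users/$mailbox/calendars"
$calendar = $calendars.value | Where-Object { $_.name -eq "Australia holidays" }
# CalendarView expands recurring appointments between the two dates
$viewUrl = "https://graph.microsoft.com/v1.0/users/$mailbox/calendars/$($calendar.id)/calendarView" +
           "?startDateTime=2019-01-01T00:00:00Z&endDateTime=2019-12-31T23:59:59Z&`$top=500"
$events = (Invoke-RestMethod -Headers $headers -Uri $viewUrl).value
$events | Select-Object subject, @{n='start';e={$_.start.dateTime}}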

Building the Spreadsheet 

To build the spreadsheet in the output format that I wanted (which mirrored what I saw users doing manually), I had to first insert the data, then merge the month rows so there was only one row per month, then format the merge so the text was aligned correctly and had the correct formatting, and lastly autofit the columns so the spreadsheet displayed correctly. This required a lot of separate requests to the Graph API, which at first ran a little slowly. Then came batching.

Batching

Batching really is a godsend when it comes to performance with a task like this. For example, my original code made around 40-50 individual requests to get the data and formatting done, and with batching it was reduced to around 6 (and I was being a little conservative and could have reduced this further). The big tip for using batching with the workbook endpoint is that you need to make sure you include the workbook-session-id with every request inside the batch (not just the outer batch request). If you don't, you will get a lot of EditModeCannotAcquireLockTooManyRequests errors, which the documentation, the error (and the internet in general) aren't really helpful in explaining.
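
As an illustration, a batch that writes some values and then autofits the columns might look roughly like this in PowerShell (a sketch; the token and the Calendars.xlsx drive item id are placeholders, and the workbook session is created first via the createSession endpoint):

# Sketch only: $AccessToken and $ItemId (the Calendars.xlsx DriveItem id) are placeholders
$headers = @{ Authorization = "Bearer $AccessToken"; "Content-Type" = "application/json" }
$wbBase = "https://graph.microsoft.com/v1.0/me/drive/items/$ItemId/workbook"
# Create a persisted workbook session and reuse its id on every sub-request
$session = Invoke-RestMethod -Method Post -Headers $headers -Uri "$wbBase/createSession" -Body '{"persistChanges": true}'
$batch = @{
    requests = @(
        @{ id = "1"; method = "PATCH"
           url = "/me/drive/items/$ItemId/workbook/worksheets/Sheet1/range(address='A1:B1')"
           headers = @{ "workbook-session-id" = $session.id; "Content-Type" = "application/json" }
           body = @{ values = @(,@("Month","Event")) } }
        @{ id = "2"; method = "POST"; dependsOn = @("1")
           url = "/me/drive/items/$ItemId/workbook/worksheets/Sheet1/range(address='A1:B1')/format/autofitColumns"
           headers = @{ "workbook-session-id" = $session.id } }
    )
} | ConvertTo-Json -Depth 10
Invoke-RestMethod -Method Post -Headers $headers -Uri "https://graph.microsoft.com/v1.0/`$batch" -Body $batch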

Displaying it back to the Teams tab

This turned out to be one of the hardest problems to solve and is one of the outstanding issues with this in Teams. I used an iFrame and generated an embed link (which is what you get when you use Share > Embed in Excel Online). This works okay in the browser as long as you already have a login to your personal OneDrive (token in the cache), else you will be prompted to log on to SharePoint. In the Desktop client this logon is a problem, so instead of opening within the tab, if the code detects the Desktop client it launches a new browser tab (which you may or may not need to log on to SharePoint to view). This was a little disappointing but probably something I'll have a fix for soon (if anybody has any suggestions I'm all ears).

GitHub Repo for this App

I have a hosted version of this Tab App on my GitHub pages at https://gscales.github.io/TeamsExcelCalendar/ and there is a repo version inside the aggregation engine repo https://github.com/gscales/ExcelCalendarAggregate/tree/master/TeamsExcelCalendar with a Readme that details the installation process.

Building on the Aggregation engine

Because I kind of enjoy taking things and running with them, I have some plans for using the Calendar to Excel aggregation engine in a few different formats. The first will be a simple PowerShell script so you can do the same thing from within an automation context, so if you're interested in this but don't want a Teams tab app, watch this space.


   

Wednesday, October 30, 2019

How to test SMTP using Opportunistic TLS with Powershell and grab the public certificate a SMTP server is using

Most email services these days employ Opportunistic TLS when trying to send messages, which means that wherever possible the messages will be encrypted rather than sent in the plain text legacy format of SMTP. This method was defined in RFC 3207 "SMTP Service Extension for Secure SMTP over Transport Layer Security" and there's quite a good explanation of Opportunistic TLS on Wikipedia https://en.wikipedia.org/wiki/Opportunistic_TLS . This is used for both server to server (eg MTA to MTA) and client to server (eg a message client like Outlook acting as an MSA), the latter generally being authenticated.

Basically it allows you to have a normal plain text SMTP conversation that is then upgraded to TLS using the STARTTLS verb. Not all servers will support this verb, so if it's not supported the message is just sent as plain text. TLS relies on PKI certificates and the administrative issues that come with certificate management, like expired certificates, which is why I wrote this script. Essentially I wanted to see the public certificate that was in use by the recipient SMTP server and couldn't find any easy to use method to get it. In a web browser you can always view a certificate to check its authenticity, but with SMTP there aren't a lot of good tools around for this. You can use Telnet to test an SMTP server in plain text, but it's not easy to retrieve the TLS public certificate from the server for inspection over Telnet (or using something like putty etc).
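
For context, the plain text part of the conversation before the TLS upgrade looks roughly like this (an indicative transcript rather than output captured from a real server):

220 mail.example.com ESMTP ready
EHLO yourdomain.com
250-mail.example.com Hello
250 STARTTLS
STARTTLS
220 2.0.0 Ready to start TLS
(the TLS handshake happens at this point and the rest of the SMTP conversation continues encrypted)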

In PowerShell this is pretty easy. Back in 2006 I wrote a plain text SMTP test script https://gsexdev.blogspot.com/2006/09/doing-smtp-telnet-test-with-powershell.html and a variant to do alerting on verbs https://gsexdev.blogspot.com/2007/06/testing-smtp-verbs-and-sending-alert.html, so this is more just a modern version of those with the addition of the System.Net.Security.SslStream class, which supports creating the TLS connection and also allows you to easily export the recipient server's public cert, eg

        # Send the STARTTLS verb over the existing plain text connection
        Write-Host("STARTTLS") -ForegroundColor Green
        $streamWriter.WriteLine("STARTTLS");
        $startTLSResponse = $streamReader.ReadLine();
        Write-Host($startTLSResponse)
        # Upgrade the connection to TLS 1.2 and capture the remote server's certificate
        $ccCol = New-Object System.Security.Cryptography.X509Certificates.X509CertificateCollection
        $sslStream.AuthenticateAsClient($ServerName,$ccCol,[System.Security.Authentication.SslProtocols]::Tls12,$false);
        # Export the server's public certificate to a .cer file for inspection
        $Cert = $sslStream.RemoteCertificate.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert);
        [System.IO.File]::WriteAllBytes($CertificateFilePath, $Cert);

I've created a small PowerShell script module that has two cmdlets. The first is Get-SMTPTLSCert, which can be used to get the public cert being used by the SMTP endpoint, eg for Gmail you could use

Get-SMTPTLSCert -ServerName smtp.gmail.com -Sendingdomain youdomain.com -CertificateFilePath c:\temp\gmailpubCer.cer

By default this uses the client submission port 587 SMTP-MSA (port 25 is often blocked from most locations), so it's testing client (Message Submission Agent) to server, rather than server to server between two SMTP Message Transfer Agents. The sending domain is required because most SMTP servers don't allow an empty helo/ehlo statement. I've also included a cmdlet "Invoke-TestSMTPTLS" that does a test of the SMTP server and does authentication if necessary; usually to use port 587 you need to authenticate on the SMTP server. In this instance, once you have used the STARTTLS verb to upgrade the plain text conversation to TLS, you can then use the AUTH LOGIN verb to submit the username and password as base64 strings to authenticate, eg this is what the auth looks like in the script
            $command = "AUTH LOGIN" 
            write-host -foregroundcolor DarkGreen $command
            $SSLstreamWriter.WriteLine($command) 
            $AuthLoginResponse = $SSLstreamReader.ReadLine()
            write-host ($AuthLoginResponse)
            $Bytes = [System.Text.Encoding]::ASCII.GetBytes($Credentials.UserName)
            $Base64UserName =  [Convert]::ToBase64String($Bytes)              
            $SSLstreamWriter.WriteLine($Base64UserName)
            $UserNameResponse = $SSLstreamReader.ReadLine()
            write-host ($UserNameResponse)
            $Bytes = [System.Text.Encoding]::ASCII.GetBytes($Credentials.GetNetworkCredential().password.ToString())
            $Base64Password = [Convert]::ToBase64String($Bytes)     
            $SSLstreamWriter.WriteLine($Base64Password)
            $PassWordResponse = $SSLstreamReader.ReadLine()
            write-host $PassWordResponse    

So the above code takes a PSCredential object and changes that into the necessary SMTP verbs to authenticate. Run against Office365 this looks like


(The base64 values above decode to Username: and Password: )
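
If you want to verify those prompts yourself, the server's 334 responses are just base64-encoded ASCII, eg

# Decode the standard AUTH LOGIN prompts the server sends back
[System.Text.Encoding]::ASCII.GetString([Convert]::FromBase64String("VXNlcm5hbWU6"))   # Username:
[System.Text.Encoding]::ASCII.GetString([Convert]::FromBase64String("UGFzc3dvcmQ6"))   # Password: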

This cmdlet doesn't actually send a message, it just invokes the envelope verbs and no DATA verb is sent (which is where the MIME message would go). I've put a copy of the script on GitHub here https://github.com/gscales/Powershell-Scripts/blob/master/TLS-SMTPMod.ps1


Thursday, October 17, 2019

Doing Mailbox Change discovery with an EWS PowerShell Script

Mailbox Change discovery is the process of looking at any folders or items that are new or have been modified recently in a Mailbox. It's useful in a number of different ways, including (but not limited to)

  • Looking at what objects a third party Addin is creating or modifying in your mailbox
  • Helping to work out which FAI (Folder Associated Item) is being modified when changes are made to the configuration in Outlook or Outlook on the Web (this can be useful if you then want to automate those changes in your own scripts)
  • Fixing client issues caused by corrupt or bad items (eg if you've ever used MFCMapi to delete an item that's causing a particular client function not to work correctly)
  • Getting an understanding of how the backend scaffolding of new features works in Outlook on the Web (eg looking at what the substrate is doing in Office365)
If you have ever looked recently at the Non_IPM root folder of any Office365 Mailbox, you can see from the large number of folders used by various apps, substrate processes and new client features that there is a lot going on. So this script can also help give a bit of insight into what's happening in the background when you activate or use particular features (or potentially point you to the location in the Mailbox to look at when problems occur with certain features).
I'll go through a specific use case later looking at the contact favourite feature (which I struggled to even find the UI documentation for), which is what prompted me to write this script.

What this script does

The script has three main functions

  1. It enumerates every folder in the Mailbox (both the IPM and NON_IPM_Subtree) as well as Search Folders and looks at the created and modified date of each folder. If they were created or modified within the lookback time, it adds them to the report
  2. It then does a scan of the items in each folder (excluding Search Folders) and if it finds any items that were modified or created after the lookback time, it adds these to the report
  3. It then does a scan of the FAI Items (Folder Associated Items) in each folder and again, if the items were modified or created after the lookback time, it adds these to the report
The Output of the Report then contains information about what folders, Items and FAI Items have either been created or modified in the Mailbox in the last x number of seconds.
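
To give a feel for the kind of query involved (this isn't the script itself, just a minimal sketch using the EWS Managed API), finding recently modified items in a folder looks roughly like this:

# Sketch only: $service is an already-connected ExchangeService and $secondstolookback a number
$lookback = (Get-Date).AddSeconds(-$secondstolookback)
$filter = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsGreaterThan(
    [Microsoft.Exchange.WebServices.Data.ItemSchema]::LastModifiedTime, $lookback)
$view = New-Object Microsoft.Exchange.WebServices.Data.ItemView(100)
$found = $service.FindItems([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox, $filter, $view)
$found.Items | Select-Object Subject, ItemClass, LastModifiedTime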

An Example

The best way to demonstrate this is with the example that was the reason I wrote the script. The Contact Favourite feature in Outlook on the Web gives you the ability to click the star next to a contact's name in OWA, which then creates a favourite shortcut, eg


So I wanted to know, when you did this, where the favourite item gets created, what information it was storing and what other changes were happening. This is where the script comes in handy: all I needed to do was favourite a contact and then run the script immediately after to look at the items which changed in the Mailbox in the last 60 seconds. A run of the script after I made the above change yielded a report that looked like



So from the above report you can see that firstly a new Search Folder was created 
\FavoritePersonas\Glen Scales_7d09f835-0028-4bd7-bed9-59535127bbe1 New SearchFolder



under the \FavoritePersonas\ directory and also an object of type SDS.32d4b5e5-7d33-4e7f-b073-f8cffbbb47a1.OutlookFavoriteItem was created under the folder

\ApplicationDataRoot\32d4b5e5-7d33-4e7f-b073-f8cffbbb47a1\outlookfavorites

The other thing that I included in the report was the EntryId of the item found, so in the above case I can take the EntryId for the outlookfavorite and open the item in a MAPI editor like OutlookSpy or MFCMAPI, eg


And you can then see all the MAPI properties on the Item (or delete/export etc)



That's it, it's relatively simple to use, eg

Invoke-MailboxChangeDiscovery -MailboxName mailbox@domain -secondstolookback 60

I've put this script up on GitHub here https://github.com/gscales/Powershell-Scripts/blob/master/ChangeDiscovery.ps1

Friday, October 04, 2019

Using the MSAL (Microsoft Authentication Library) in EWS with Office365

Last July Microsoft announced here that they would be disabling basic authentication in EWS on October 13, 2020, which is now a little over a year away. Given the amount of time that has passed since the announcement, any line of business applications or third party applications that you use that had been using Basic authentication should have been modified or upgraded to support oAuth. If this isn't the case, the time to take action is now.

When you need to migrate a .NET app or script you have using EWS and basic Authentication you have two Authentication libraries you can choose from

  1. ADAL - Azure AD Authentication Library (uses the v1 Azure AD Endpoint)
  2. MSAL - Microsoft Authentication Library (uses the v2 Microsoft Identity Platform Endpoint)
The most common library you will come across in use is ADAL, because it's been around the longest, has good support across a number of languages and allows complex authentication scenarios with support for SAML etc. MSAL is the latest and greatest in terms of its support for oAuth2 standards and is where Microsoft are investing their future development efforts. A good primer for understanding the difference in terms of the tokens that both of these endpoints generate is https://docs.microsoft.com/en-us/azure/active-directory/develop/access-tokens

So which should you choose? If you're using PowerShell then ADAL is the easiest to use and there are a lot of good examples for this. However, from a long term point of view the MSAL library can be a better choice as it's going to offer more supportability (new features etc) going forward, as long as you don't fall into one of the restrictions described in https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/wiki/Adal-to-Msal

In this post I'm going to look at using the MSAL library with EWS to access Mailboxes in Exchange Online. 

Scopes

One of the biggest differences when it comes to coding between the libraries is that with ADAL you specify the resource you're going to use, eg "https://outlook.office365.com", while with MSAL you specify the scopes you are going to use. With EWS it's relatively simple in that there are only two scopes (EWS doesn't allow you to constrain your access to different mailbox item types), which you would first need to allow in your application registration. These can be found in the Supported Legacy APIs section of the application registration (make sure you scroll right to the bottom).

Delegated Permissions


Application Permissions (where you're going to use AppOnly Tokens)

.default Scope

For v1 apps you can get all the static scopes configured in an application using the .default scope, so for EWS that would look something like https://outlook.office365.com/.default . When you're using App Only tokens this becomes important.


App registration and Consent

One of the advantages of the MSAL library is dynamic consent, which for EWS doesn't have much use because in practice you're only going to be using one scope. However, if you're also going to be using other workloads you may be able to take advantage of that feature. For the app registration you need to use the v2 endpoint registration process (which is the default now in the Azure portal), see https://docs.microsoft.com/en-us/graph/auth-register-app-v2. This also makes it easy to handle the consent within a tenant.

Getting down to coding

In ADAL there was only a single class called AuthenticationContext, which you used to request tokens. In MSAL you have PublicClientApplication (which you use for standard user authentication) and ConfidentialClientApplication, which gets used for AppOnly tokens and the On-Behalf-Of flow.

Endpoints 

With the v2 Endpoint you have the option of allowing
  1. common
  2. organizations
  3. consumers
  4. Tenant specific (Guid or Name)
For EWS you generally always want to use the tenant specific endpoint, which means it's best to either dynamically get the TenantId for the tenant you're targeting or hard code it. Eg you can get the TenantId you need with 3 lines of C#:


string domainName = "datarumble.com";
HttpClient Client = new HttpClient();
var TenantId = ((dynamic)JsonConvert.DeserializeObject(Client.GetAsync("https://login.microsoftonline.com/" + domainName + "/v2.0/.well-known/openid-configuration")
     .Result.Content.ReadAsStringAsync().Result))
    .authorization_endpoint.ToString().Split('/')[3];
In PowerShell you can do it with

$TenantId = (Invoke-WebRequest https://login.windows.net/datarumble.com/v2.0/.well-known/openid-configuration | ConvertFrom-Json).token_endpoint.Split('/')[3]

Delegate Authentication in EWS with MSAL and the EWS Managed API

This is generally the most common way of using EWS, where you're authenticating as a standard user and then accessing a Mailbox. If it's a shared Mailbox, then access will need to have been granted via Add-MailboxFolderPermission, or you can use EWS Impersonation.
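
For example, granting a user access to a folder in a shared mailbox could look like this (an indicative example of the Exchange Online cmdlet mentioned above; adjust the mailbox, folder and rights to suit):

# Indicative example: grant gscales Editor rights on the shared mailbox's Contacts folder
Add-MailboxFolderPermission -Identity "sharedmb@datarumble.com:\Contacts" -User gscales@datarumble.com -AccessRights Editor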

This is the simplest C# example of an auth using the MSAL library in a console app, logging on as the currently logged on user.

 string MailboxName = "gscales@datarumble.com";
 string scope = "https://outlook.office.com/EWS.AccessAsUser.All";
 string redirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth";
 string domainName = "datarumble.com";

 HttpClient Client = new HttpClient();
 var TenantId = ((dynamic)JsonConvert.DeserializeObject(Client.GetAsync("https://login.microsoftonline.com/" + domainName + "/v2.0/.well-known/openid-configuration")
  .Result.Content.ReadAsStringAsync().Result))
  .authorization_endpoint.ToString().Split('/')[3];

 PublicClientApplicationBuilder pcaConfig = PublicClientApplicationBuilder.Create("9d5d77a6-fe09-473e-8931-958f15f1a96b")      
  .WithTenantId(TenantId);
   
 pcaConfig.WithRedirectUri(redirectUri);
 var TokenResult = pcaConfig.Build().AcquireTokenInteractive(new[] { scope })
  .WithPrompt(Prompt.Never)
  .WithLoginHint(MailboxName).ExecuteAsync().Result;

 ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2016);
 service.Url = new Uri("https://outlook.office365.com/ews/exchange.asmx");
 service.Credentials = new OAuthCredentials(TokenResult.AccessToken);
 service.HttpHeaders.Add("X-AnchorMailbox", MailboxName);

 Folder Inbox = Folder.Bind(service, WellKnownFolderName.Inbox);


AppOnly Tokens

This is where your application is authenticating using an App Secret or SSL certificate; after this your app will get full access to all Mailboxes in a tenant (it's important to note that the scoping feature https://docs.microsoft.com/en-us/graph/auth-limit-mailbox-access doesn't work with EWS, so if you need that you should be using the Graph or Outlook API).

 string clientId = "9d5d77a6-fe09-473e-8931-958f15f1a96b";
 string clientSecret = "xxxx";
 string mailboxName = "gscales@datarumble.com";
 string redirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth";
 string domainName = "datarumble.com";
 string scope = "https://outlook.office365.com/.default";

 HttpClient Client = new HttpClient();
 var TenantId = ((dynamic)JsonConvert.DeserializeObject(Client.GetAsync("https://login.microsoftonline.com/" + domainName + "/v2.0/.well-known/openid-configuration")
  .Result.Content.ReadAsStringAsync().Result))
  .authorization_endpoint.ToString().Split('/')[3];

 IConfidentialClientApplication app = ConfidentialClientApplicationBuilder.Create(clientId)
  .WithClientSecret(clientSecret)
  .WithTenantId(TenantId)
  .WithRedirectUri(redirectUri)
  .Build();

  
 var TokenResult = app.AcquireTokenForClient(new[] { scope }).ExecuteAsync().Result;
 ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2016);
 service.Url = new Uri("https://outlook.office365.com/ews/exchange.asmx");
 service.Credentials = new OAuthCredentials(TokenResult.AccessToken);
 service.HttpHeaders.Add("X-AnchorMailbox", mailboxName);
 service.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, mailboxName);
 Folder Inbox = Folder.Bind(service, new FolderId(WellKnownFolderName.Inbox, mailboxName));


Token Refresh

One of the big things missing in the EWS Managed API is a callback before each request that checks for an expired Access Token. Because tokens are only valid for one hour, if you have a long running process like a migration/export or data analysis, you need to make sure that you have some provision in your code to track the expiry of the access token and then refresh the token when needed.
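
A minimal sketch of what that provision might look like (assuming the $app and $Scopes objects from the AppOnly PowerShell example below, and an EWS Managed API ExchangeService in $service that is being reused in a long running loop):

# Sketch only: $app/$Scopes come from the MSAL examples below, $service is a reused ExchangeService
if ($TokenResult.ExpiresOn.UtcDateTime -le (Get-Date).ToUniversalTime().AddMinutes(5)) {
    # AcquireTokenForClient uses MSAL's token cache, so this is cheap when the token is still valid
    $TokenResult = $app.AcquireTokenForClient($Scopes).ExecuteAsync().Result
    $service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.OAuthCredentials($TokenResult.AccessToken)
}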

Doing this in PowerShell

If you're using PowerShell you can use the same code as above, as long as you import the MSAL library DLL (Microsoft.Identity.Client.dll) into your session.
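
Eg (the path is just an example of wherever you keep the DLL):

# Load the MSAL library into the PowerShell session (the path is an example)
Add-Type -Path "C:\Scripts\Microsoft.Identity.Client.dll"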

Some simple auth examples for this would be

Delegate Authentication


$MailboxName = "gscales@datarumble.com";
$ClientId = "9d5d77a6-fe09-473e-8931-958f15f1a96b"
$scope = "https://outlook.office.com/EWS.AccessAsUser.All";
$redirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth";
$domainName = "datarumble.com";
$Scopes = New-Object System.Collections.Generic.List[string]
$Scopes.Add($Scope)
$TenantId = (Invoke-WebRequest https://login.windows.net/datarumble.com/v2.0/.well-known/openid-configuration | ConvertFrom-Json).token_endpoint.Split('/')[3]
$pcaConfig = [Microsoft.Identity.Client.PublicClientApplicationBuilder]::Create($ClientId).WithTenantId($TenantId).WithRedirectUri($redirectUri)
$TokenResult = $pcaConfig.Build().AcquireTokenInteractive($Scopes).WithPrompt([Microsoft.Identity.Client.Prompt]::Never).WithLoginHint($MailboxName).ExecuteAsync().Result;

AppOnly Token


$ClientId = "9d5d77a6-fe09-473e-8931-958f15f1a96b"
$MailboxName = "gscales@datarumble.com"
$RedirectUri = "msal9d5d77a6-fe09-473e-8931-958f15f1a96b://auth"
$ClientSecret = "xxx";
$Scope = "https://outlook.office365.com/.default"
$TenantId = (Invoke-WebRequest https://login.windows.net/datarumble.com/v2.0/.well-known/openid-configuration | ConvertFrom-Json).token_endpoint.Split('/')[3]
$app =  [Microsoft.Identity.Client.ConfidentialClientApplicationBuilder]::Create($ClientId).WithClientSecret($ClientSecret).WithTenantId($TenantId).WithRedirectUri($RedirectUri).Build()
$Scopes = New-Object System.Collections.Generic.List[string]
$Scopes.Add($Scope)
$TokenResult = $app.AcquireTokenForClient($Scopes).ExecuteAsync().Result;
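
From there, using the token with the EWS Managed API mirrors the C# examples above; a sketch (assuming the EWS Managed API DLL has already been loaded into the session):

# Sketch only: assumes Microsoft.Exchange.WebServices.dll has been loaded via Add-Type
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2016)
$service.Url = New-Object Uri("https://outlook.office365.com/ews/exchange.asmx")
$service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.OAuthCredentials($TokenResult.AccessToken)
$service.HttpHeaders.Add("X-AnchorMailbox", $MailboxName)
# For an AppOnly token, impersonate the mailbox you want to open
$service.ImpersonatedUserId = New-Object Microsoft.Exchange.WebServices.Data.ImpersonatedUserId([Microsoft.Exchange.WebServices.Data.ConnectingIdType]::SmtpAddress, $MailboxName)
$Inbox = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service, [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox)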