Splunking Microsoft Cloud Data: Part 3

This is Part 3 in a series of step-by-step guides for accessing, configuring, and retrieving all the valuable intel from Microsoft cloud services.

Part 1 & Part 2 stepped us through all the inputs of the Add-on for Microsoft Cloud Services. Today we’re going to look at message tracking logs from Microsoft Exchange Online (EOL).

Exchange message tracking logs record email message activity as messages flow through the transport pipeline on Exchange mail servers. They are particularly helpful not only for Exchange troubleshooting and diagnostics, but also from a security operations perspective:

  • Troubleshoot a message that was sent by a user to a specific recipient.
  • Find out if a transport rule acted on a message.
  • Find out if a message sent from an Internet sender made it into your Exchange organization.
  • Correlate sender domains against threat intelligence or look for non-standard senders.
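As a quick sketch of that last use case, the domain correlation can be done with a few lines of Python. The field name SenderAddress mirrors the message trace output, and the watchlist is a made-up example:

```python
# Sketch: extract sender domains from message-trace records and flag any
# that appear on a threat-intel watchlist. The SenderAddress field name
# mirrors the message trace output; the records and watchlist are examples.
def sender_domain(address):
    """Return the domain portion of an SMTP address, lowercased."""
    return address.rsplit("@", 1)[-1].lower()

def flag_suspicious(records, watchlist):
    """Return records whose sender domain appears on the watchlist."""
    bad = {d.lower() for d in watchlist}
    return [r for r in records if sender_domain(r["SenderAddress"]) in bad]

records = [
    {"SenderAddress": "ceo@contoso.com", "Subject": "Q3 numbers"},
    {"SenderAddress": "billing@evil-domain.biz", "Subject": "Invoice overdue"},
]
hits = flag_suspicious(records, ["evil-domain.biz"])
print([r["SenderAddress"] for r in hits])  # → ['billing@evil-domain.biz']
```

In practice you would do this correlation in Splunk itself (e.g. with a lookup), but the same logic applies.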

Now, when I first started writing this month’s post, I came up with a few different options for ingesting the logs. These included PowerShell scripts, Azure Runbooks, and some other funky ideas.

However, to make all of our lives easier, some very clever Splunkers have created the Microsoft Office 365 Reporting Add-on for Splunk.

With this in mind, we’ll cover two options:

  1. Microsoft Office365 Reporting Add-on
  2. Azure Runbook automation using Splunk and the HTTP Event Collector

Option 1: Microsoft Office365 Reporting Add-on

This is the easiest and by far the quickest way to ingest Exchange tracking logs.

1) Install the Microsoft Office 365 Reporting Add-on for Splunk.

2) In the add-on, select Configuration, then select Add.

3) Enter Name, Username, and Password. Select Add.

4) Select the Inputs tab, select Create New Input.

5) Enter Name and Interval, select Index and Office365 Account, enter Start date/time, and select Add.

Note: Depending on the size of the environment, you may run into issues with Azure limits when trying to retrieve too many historical events. If historical data is not essential, set the Start date/time to the current day.

6) Once the input is configured, data should be populated in the ms:o365:reporting:messagetrace sourcetype.

Easy, right?!


The add-on authenticates via Microsoft sign-in, so ensure your Splunk host can communicate with the usual Microsoft sign-in endpoints.

If data is not appearing, enable debug logging under Configuration > Logging and run the following search to review the add-on’s internal logs:

index=_internal sourcetype="ta_ms_o365_reporting_ms_o365_message_trace-2"

Option 2: Azure runbook automation using Splunk HTTP Event Collector

If we’re being honest, the only reason this option is here is because I thought it was damn cool, and because the guys made the reporting add-on so simple to use that this post would be rather short otherwise... BUT this process will come in handy for future posts. :)

1) Log in to your Azure account through the Azure Portal.

2) Select More services, in the search bar type automation. Select Automation Accounts.

3) Select Add, enter Name, select Subscription, choose to Create new Resource group or Use existing. Select Location and select Yes to create Azure Run As account. Select Create.

4) Once the automation account has been created, we need to provide credentials for our scripts to connect to Exchange Online. Select Credentials and select Add a credential.

5) The user must have sufficient rights to access Exchange message trace functions. Enter Name, Description, Username, Password, confirm the Password, and select Create.

Note: If you are unsure whether your account has sufficient rights, log in to the Exchange Admin Center. If you can run a manual message trace, you should have enough rights.

6) Select Runbooks and select Add a runbook.

7) Select Create a new runbook. Enter Name, select PowerShell as the Runbook type, enter a Description and select Create.

8) Select Modules and select Browse gallery.

9) Search for msonline. Select the MSOnline module.

Note: We require this module to authenticate with Azure AD.

10) Select Import.

11) Select OK on the confirmation window. Wait until you receive a success notification.

For this example, we’re going to store a copy of the logs in a storage blob before sending them to Splunk using HEC. Ignore the references to storage blobs and containers if you don’t want to store them.

12) Select Storage accounts, select Add, enter Name, then select Blob storage, Replication, Access tier, Secure transfer method, and Subscription. Select your Resource group from Step 3 and a Location. When finished, select Create.

13) Select Storage accounts, select your account and select + Container.

14) Enter Name, select Blob, and select OK. Write down the Container name; we’ll need it shortly.

15) Select Access keys, and write down the Key and Storage account name. We’ll need these shortly.

16) In Splunk, select Settings > Data Inputs > HTTP Event Collector. Select New Token.

17) Enter Name, Source Name and Description. Select Next.

18) Select Automatic sourcetype, specify the Indexes HEC can send to and select the Default Index. Select Review.

19) Review options and select Submit.

20) Write down the Token Value. We'll need it shortly.

Note: Ensure port 8088 is accessible on your Splunk server.
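Before wiring the runbook up, it helps to understand the shape of a HEC request: a POST to the /services/collector endpoint, a Splunk token in the Authorization header, and a JSON body wrapping the event. A minimal Python sketch (the hostname and token below are placeholders, and HEC is typically served over HTTPS, so adjust the scheme to match your setup):

```python
import json

def build_hec_request(host, port, token, event):
    """Build the URL, headers, and JSON body for a Splunk HEC POST.
    Host and token here are placeholders; substitute your own values."""
    url = "https://{}:{}/services/collector".format(host, port)
    headers = {"Authorization": "Splunk " + token}
    body = json.dumps({"event": event})
    return url, headers, body

url, headers, body = build_hec_request(
    "splunk.example.com", 8088,
    "00000000-0000-0000-0000-000000000000",
    {"MessageId": "abc123", "Status": "Delivered"})
print(url)  # → https://splunk.example.com:8088/services/collector
```

You can sanity-check a new token the same way with curl before touching the runbook.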

21) Select More services, in the search bar type automation. Select Automation Accounts.

22) Select your Automation account.

23) Select Runbooks, search for the runbook name from Step 7, and select the runbook.

24) Select Edit.

$containerName = "XXXX"
$storageAccountName = "XXXX"
$storageKey = "XXXX"
$SplunkHost = "XXXX"
$SplunkEventCollectorToken = "XXXX"
$SplunkEventCollectorPort = "8088"
$testFile = "test_file" + $(Get-Date -f MM-dd-yyyy_HH_mm_ss) + ".csv"

# Clean up any existing remote sessions
Get-PSSession | ?{$_.ComputerName -like "*"} | Remove-PSSession | Out-Null
$TenantCredentials = Get-AutomationPSCredential -Name "XXXX"

# Connect to Exchange Online remote PowerShell
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://outlook.office365.com/powershell-liveid/" -Credential $TenantCredentials -Authentication Basic -AllowRedirection
Import-PSSession $Session -DisableNameChecking -AllowClobber | Out-Null

# Retrieve the message trace and export a CSV copy
$messages = Get-MessageTrace -StartDate 09/01/2017 -EndDate 09/26/2017
$messages | Export-Csv $testFile -NoTypeInformation
Get-PSSession | ?{$_.ComputerName -like "*"} | Remove-PSSession | Out-Null

# Upload the CSV copy to the storage container
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey
Set-AzureStorageBlobContent -Container $containerName -File $testFile -Context $ctx -Force

# Send each message to Splunk via the HTTP Event Collector
$uri = "http://" + $SplunkHost + ":" + $SplunkEventCollectorPort + "/services/collector"
$header = @{"Authorization" = "Splunk " + $SplunkEventCollectorToken}
foreach ($m in $messages) {
    $body = @{
        event = (ConvertTo-Json $m)
    }
    Invoke-RestMethod -Method Post -Uri $uri -Body (ConvertTo-Json $body) -Header $header
}

25) Edit Container Name, Storage Account Name, Storage Key, Splunk Host, HEC Token and Credential Name (Step 5). Modify Dates, select Save and select Publish. Select Yes when prompted.

Note: If running on a schedule, the dates can be calculated at runtime, e.g. Get-Date -f MM/dd/yyyy.
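One more note on the script: the loop above sends one HTTP request per message. HEC also accepts several events concatenated back-to-back in a single request body, which cuts round trips considerably on large traces. A Python sketch of that batching (the MessageId records are illustrative):

```python
import json

def batch_hec_bodies(events, chunk_size=100):
    """Yield HEC request bodies, each containing up to chunk_size events.
    HEC accepts multiple {"event": ...} JSON objects concatenated in one body."""
    for i in range(0, len(events), chunk_size):
        chunk = events[i:i + chunk_size]
        yield "".join(json.dumps({"event": e}) for e in chunk)

# Illustrative records: 250 events become 3 request bodies (100 + 100 + 50)
messages = [{"MessageId": str(n)} for n in range(250)]
bodies = list(batch_hec_bodies(messages))
print(len(bodies))  # → 3
```

The same idea carries over to the PowerShell runbook: build up a string of concatenated event objects and make one Invoke-RestMethod call per batch.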

26) Select Start and select Yes.

27) Select the Output tile.

28) Monitor the Output window until the runbook completes successfully.

29) If configured correctly, data should be populated in the default HEC index (Step 18).

Now, this process obviously seems much more complicated than Option 1. However, the main reason for detailing it is that it opens the door to ingesting basically anything in Azure that you can script through Runbooks. This will come in handy in future posts, where we’ll target billing data, EOP reports, threat intelligence, and more.

Finally, big thanks to Matthew Erbs for the HEC code snippets and for imparting his PowerShell wisdom to me.

Got a data source you’d like to see covered? Leave a comment or email me at

Stay tuned for Part 4!

Happy Splunking!

Posted by Ryan Lait