This is Part 2 in a series of step-by-step guides for accessing, configuring and retrieving all the valuable intel from Microsoft Cloud Services.
In Part 1 of "Splunking Microsoft Cloud Data," we configured the O365 Management and Azure Audit logs using the Splunk Add-on for Microsoft Cloud Services.
Part 2: Azure storage tables, storage blobs, resources and virtual machine logs
Today, we're going to tackle four more data sources!
Super simple, super easy, super valuable. Let's go!
Hold up, Ryan. What's the difference between table and blob storage?
As the name suggests, table storage holds structured/semi-structured information in a schema-based format (think CSV files). Blob storage is free-form: use it to store raw log files, VHDs, .json exports, etc.
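For reference, every table storage entity carries three system properties (PartitionKey, RowKey and Timestamp) plus whatever custom columns the writer added. Here's a sketch of a single entity; the two counter properties are made up for illustration:

```json
{
  "PartitionKey": "0000000000000000001",
  "RowKey": "web-vm-01",
  "Timestamp": "2019-05-01T09:30:00Z",
  "CounterName": "\\Processor\\% Processor Time",
  "CounterValue": 12.5
}
```

Blobs, by contrast, have no schema at all: a blob is just named bytes in a container.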
Okay, where were we?
First, we're going to need our Azure Storage Account details.
1) Log in to your Azure account through the Azure portal.
2) Select Storage accounts, then select the storage account you want to configure. Select Access keys, then copy the Storage account name and one of the Default keys (we'll need these shortly!).
Optional: Use a SAS token if more granular permissions are required.
3) In the add-on, select Configuration, select Azure Storage Account, select Add Azure Storage Account.
4) Enter Name, Account Name (Storage Account Name), Account Secret (Access Key). Select Add.
5) Select Inputs, Select Create New Input, Select Azure Storage Table.
6) Enter Name, select Storage Account, enter Table Name(s) (use * as a wildcard), modify Start Time and Interval if required, select Index and select Add.
7) Once inputs are configured, data should be populated in the mscs:storage:table sourcetype.
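Once the input has run, a quick sanity-check search should confirm events are arriving (the index below is a placeholder; use whichever index you selected in Step 6):

```
index=azure sourcetype=mscs:storage:table
| stats count by source
```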
The process to ingest storage blob data is essentially identical to the storage table process.
8) Repeat Steps 1-4 if your blob store resides in a different storage account.
9) In the Azure portal, select Storage accounts, select your storage account, select Containers, copy Container Name. We'll need this shortly!
Optional: Copy Blob names if you only need to ingest specific files from the container.
10) Select Inputs, select Create New Input, select Azure Storage Blob.
11) Enter Name, select Storage Account, enter Container Name, specify Blob list (use * as a wildcard), and modify Exclusions, Decoding and Interval if required. Select Index and select Add.
12) Once inputs are configured, data should be populated in the mscs:storage:blob sourcetype.
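As before, a quick search will verify the blob data is landing (substitute your own index from Step 11):

```
index=azure sourcetype=mscs:storage:blob
| stats count by source
```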
Azure resource data is incredibly valuable, particularly in organisations with large Azure footprints. Use Resource information to keep track of online/offline VMs, IP address changes, network reconfigurations, etc.
The Azure Resource input requires an Azure App Account to be configured. So if you've made it all the way here without checking out Part 1, head over there and run through Steps 1-29...or the whole thing. ;)
13) Select Inputs, select Create New Input, select Azure Resource.
14) Enter Name, select App Account, enter Subscription ID, select Resource Type, modify Resource Group List and Interval if required. Select Index and select Add.
15) Repeat Steps 13-14 to add the other Resource Types.
16) Once inputs are configured, data should be populated in the mscs:resource:* sourcetypes.
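To see which resource types are coming in, wildcard the sourcetype and break the count down (again, swap in your own index):

```
index=azure sourcetype=mscs:resource:*
| stats count by sourcetype
```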
Guest-level monitoring must be enabled on each of the VMs you want to capture data from. If this is not enabled already, check with your Azure admin. Alternatively, be a cowboy and follow the steps below.
17) In the Azure Portal, select Virtual Machines, select your VM, select Diagnostic Settings, select Enable guest-level monitoring.
18) Wait until you receive the "Successfully updated diagnostic settings" notification. Modify Performance counters and Event logging parameters if required.
19) Select Virtual Machines, select your VM, select Diagnostic Settings, note the Storage Account the logs are being sent to.
20) If data is being logged into a new storage account, you'll need to add the Azure Storage Account in Splunk. Repeat Steps 1-4.
21) Select Inputs, select Create New Input, select Azure Virtual Machine Metrics.
22) Enter Name, select Azure Storage Account, Index and select Add.
23) Once inputs are configured, data should be populated in the mscs:vm:metrics sourcetype.
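A quick check that the VM metrics are flowing (index is a placeholder; use the one from Step 22):

```
index=azure sourcetype=mscs:vm:metrics
| stats count by source
```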
You'll notice in Step 22 that the Table List is greyed out and can't be modified. As a result, this input won't ingest the event logs from the server.
24) Create a new Azure Storage Table input. Enter Name and select the Azure Storage Account containing the metrics logs. Enter WADWindowsEventLogsTable in the Table List. Modify Start Time and Interval if required. Select Index, enter a sourcetype and select Add.
Note: WADWindowsEventLogsTable is the name of the storage table created when guest-level monitoring is enabled and event logging configured.
25) Once inputs are configured, data should be populated in the mscs:storage:table:evtx sourcetype.
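And one last verification search, assuming you entered mscs:storage:table:evtx as the sourcetype in Step 24 (substitute your own index and sourcetype if you chose differently):

```
index=azure sourcetype=mscs:storage:table:evtx
| stats count by source
```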
And there we have it. We've configured Azure storage tables, blobs, VM resources and resource metrics.
Stay tuned for Part 3 where we'll dive into Exchange Online and email logging!
Got a Microsoft Cloud data source you'd like to see covered? Comment below or hit me at ry@splunk.com.
Happy Splunking!