CLOUD

Serving It Up with AWS and Splunk: AWS Serverless Application Repository Now Available

In November, at AWS re:Invent 2017, we launched new serverless applications for the AWS Serverless Application Repository. As of February 20, 2018, the AWS Serverless Application Repository is no longer in private preview! To celebrate, and to help you see how easily you can ingest AWS data into Splunk, let’s review the benefits.

So…Why a Serverless Application?

Each of Splunk’s serverless applications is designed to enable you to deploy all the AWS infrastructure needed to start streaming a variety of AWS data sources into Splunk in a scalable, straightforward and automated manner.

To break it down further, let’s look at one of Splunk’s serverless applications in particular—splunk-kinesis-stream-processor.

Below is the full list of applications that are currently available:

  1. splunk-logging: Log events from AWS Lambda itself to Splunk’s HTTP event collector
  2. splunk-dynamodb-stream-processor: Stream Amazon DynamoDB events to Splunk’s HTTP event collector
  3. splunk-elb-application-access-logs-processor: Stream Application Load Balancer access logs from Amazon Simple Storage Service (Amazon S3) to Splunk’s HTTP event collector
  4. splunk-elb-classic-access-logs-processor: Stream Classic Load Balancer access logs from Amazon S3 to Splunk’s HTTP event collector
  5. splunk-iot-processor: Stream AWS IoT Core events to Splunk’s HTTP event collector
  6. splunk-kinesis-stream-processor: Stream events from Amazon Kinesis Stream to Splunk’s HTTP event collector
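
Each of these processors follows the same basic pattern: Lambda hands the function a batch of records, the function decodes each one, and the result is forwarded to Splunk’s HTTP event collector. The sketch below illustrates just the decode step for the Kinesis case; the function name, the sourcetype value, and the sample payload are illustrative, not the apps’ actual code.

```python
import base64
import json

def kinesis_records_to_hec_events(event, sourcetype="aws:cloudwatch:guardduty"):
    """Decode a Lambda Kinesis event into HEC-shaped event payloads.

    `event` has the shape Lambda passes to a Kinesis-triggered function:
    {"Records": [{"kinesis": {"data": "<base64>"}}, ...]}
    The sourcetype default here is illustrative.
    """
    events = []
    for record in event["Records"]:
        raw = base64.b64decode(record["kinesis"]["data"])
        events.append({"event": json.loads(raw), "sourcetype": sourcetype})
    return events

# Example: one Kinesis record carrying a (heavily truncated) GuardDuty-style payload
payload = base64.b64encode(json.dumps({"source": "aws.guardduty"}).encode()).decode()
batch = {"Records": [{"kinesis": {"data": payload}}]}
print(kinesis_records_to_hec_events(batch))
```

The serverless applications handle batching, retries, and delivery for you; this is only the conceptual core of what they automate.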

Show Me!

Let’s review how to use one of our serverless applications to quickly set up AWS data ingestion into Splunk using AWS Kinesis Stream. By the end of this blog post, we’ll be streaming security findings from AWS GuardDuty into Splunk and visualizing them in the AWS GuardDuty Add-on for Splunk. Even if you aren’t familiar with or don’t use AWS GuardDuty, you’ll experience a streamlined setup for streaming your AWS data into Splunk using a serverless application.

Let’s use the visual above to help you identify the best way to ingest AWS data into Splunk. As you can see, the data flow from AWS GuardDuty to Splunk is set up almost entirely by the serverless application. Although this demonstration uses GuardDuty, you can follow the same steps to stream data into Splunk from any AWS service that uses Kinesis Stream as the ingestion mechanism.

Ok, Let’s Begin…

Let’s start with “dessert” first, which is deploying the serverless app and watching a scalable data streaming architecture build itself while you sip on a beverage of your choice. A serverless app from the Serverless Application Repository is an AWS CloudFormation template that leverages the Serverless Application Model (SAM) to make development and readability a breeze. The Serverless Application Repository integrates with Lambda, so you can launch serverless apps from the AWS Lambda console. Just create a function and select “Serverless Application Repository” from the different template options. Then, search for “Splunk” in the search bar and select the “splunk-kinesis-stream-processor” serverless application.
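
To make the “CloudFormation template that leverages SAM” part concrete, here is a minimal sketch of the shape such a template takes. This is illustrative only, not the actual splunk-kinesis-stream-processor template; the resource name, runtime, and stream ARN are assumptions.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31   # the Transform line is what makes this a SAM template
Resources:
  SplunkKinesisProcessor:               # hypothetical resource name
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs8.10               # illustrative runtime
      Events:
        KinesisTrigger:
          Type: Kinesis
          Properties:
            Stream: arn:aws:kinesis:us-east-1:123456789012:stream/my-stream  # illustrative ARN
            StartingPosition: LATEST
```

The `AWS::Serverless::Function` resource with a `Kinesis` event source is the SAM shorthand that expands into the Lambda function, event source mapping, and permissions you would otherwise write out by hand.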

But First…

There are a couple of prerequisites. Before deploying the serverless application, you will need to fill out its parameters, which call for an AWS Kinesis Stream and an AWS CloudWatch Event rule to capture events from AWS GuardDuty; we’ll create both in a moment. If you want to visualize the GuardDuty findings in Splunk, you will need to install the AWS GuardDuty Add-on. Finally, you will need to create an HTTP Event Collector (HEC) token on your Splunk server with a default sourcetype of “aws:cloudwatch:guardduty”. If you installed the GuardDuty Add-on, this sourcetype will already exist; otherwise, you will need to create it.

Note: You should check to make sure HEC and the newly-created HEC token are enabled on the Splunk server!
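
For reference, HEC accepts events POSTed to the `/services/collector/event` endpoint with an `Authorization: Splunk <token>` header. A minimal sketch of assembling such a request (the hostname, token, and sourcetype values are illustrative; sending is left to whatever HTTP client you prefer):

```python
import json

def build_hec_request(host, token, event, sourcetype):
    """Build the URL, headers, and body for a Splunk HEC POST.

    Returns the pieces to hand to an HTTP client; nothing is sent here.
    """
    return {
        "url": f"https://{host}:8088/services/collector/event",  # 8088 is HEC's default port
        "headers": {"Authorization": f"Splunk {token}"},
        "body": json.dumps({"event": event, "sourcetype": sourcetype}),
    }

req = build_hec_request(
    "splunk.example.com",                     # illustrative hostname
    "11111111-2222-3333-4444-555555555555",   # illustrative HEC token
    {"hello": "world"},
    "aws:cloudwatch:guardduty",
)
print(req["url"])
```

This is exactly the request shape the serverless application sends on your behalf, which is why the token and sourcetype are the only Splunk-side configuration it needs.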

Now that the prerequisites are finished, what’s next? We are using AWS Kinesis Stream to transport the data, so the first step is to create the stream itself. It’s a simple process (Kinesis console > “Create data stream”), but remember the name you choose for this stream because you will need it later.

To capture findings from GuardDuty, you will need a CloudWatch Event rule, which you can configure via the CloudWatch console (CloudWatch console > “Events” > “Rules” > “Create Rule”). For the target of this rule, use the Kinesis Stream you created earlier.
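
Under the hood, the rule is just an event pattern plus a target. GuardDuty findings arrive as CloudWatch events with the source `aws.guardduty`, so a pattern matching on that source captures every finding. A sketch of the two pieces (the target Id and stream ARN are illustrative):

```python
import json

# Event pattern matching every GuardDuty finding delivered via CloudWatch Events.
event_pattern = {"source": ["aws.guardduty"]}

# The rule's target is the Kinesis Stream created earlier (Id and ARN illustrative).
target = {
    "Id": "guardduty-to-kinesis",
    "Arn": "arn:aws:kinesis:us-east-1:123456789012:stream/my-guardduty-stream",
}

print(json.dumps(event_pattern))
```

The console builds both of these for you; seeing them spelled out makes it clear that nothing GuardDuty-specific lives in the serverless application itself, only in this rule.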

Let’s go back to where we started and finish deploying the serverless application. With the Kinesis Stream and your HEC token created, you have everything needed to fill out the parameters and deploy!

Remember how we started with “dessert”? Turns out we’re finishing up with some more dessert! Now that we have everything set up, let’s have a look at the events streaming in from GuardDuty in the Splunk GuardDuty Add-on. Other than installing the add-on and creating the HEC token, no additional configuration is required for Splunk. You can search for GuardDuty findings using Splunk search, or head over to the starter dashboards provided with the add-on.

Try generating some sample findings from the settings page in the AWS console for GuardDuty. After these sample findings are received by Splunk, you should see a nightmare of a dashboard in the GuardDuty Add-on warning you of rogue Bitcoin activity, communication with command-and-control servers, and other nefarious activity across your AWS environment. Don’t worry though, hopefully these are just the sample findings!
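
Each finding arrives wrapped in a CloudWatch event whose `detail` field holds the finding itself, including fields like `type` and `severity` that the dashboards key on. A sketch of pulling out those headline fields (the sample below is heavily abbreviated; real findings carry many more fields):

```python
def summarize_finding(cloudwatch_event):
    """Pull the headline fields out of a GuardDuty finding event."""
    finding = cloudwatch_event["detail"]
    return {"type": finding["type"], "severity": finding["severity"]}

# Abbreviated event shaped like a GuardDuty sample finding.
sample = {
    "source": "aws.guardduty",
    "detail-type": "GuardDuty Finding",
    "detail": {
        "type": "CryptoCurrency:EC2/BitcoinTool.B!DNS",
        "severity": 8,
    },
}
print(summarize_finding(sample))
```

Because the serverless application forwards the event as-is, these are the same field names you can reference in Splunk search once the data lands.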

The Splunk GuardDuty Add-on is not fully featured without the Splunk App for AWS and the Splunk Add-on for AWS installed and configured; however, you can still visualize and search the GuardDuty data without them.

So, there you go—in just a couple minutes, you're streaming and visualizing live data from AWS in Splunk. This is only one of the several Splunk serverless applications that exist, so give them a try to see how efficiently you can visualize AWS data in Splunk!

Posted by

Nic Stone

Nic Stone works as a Solutions Architect at Splunk for the Global Strategic Alliances partner integration team. He has built a number of integrations and tools for Splunk partners and customers and has a background in security and computer architecture research. He holds a B.S. in Computer Science (sometimes he puts it down too).
