
Ready, Set, Stream with the Kinesis Firehose and Splunk Integration

It's official! The Kinesis Firehose integration with Splunk is now generally available. With this launch, you can stream data from a wide range of AWS services directly into Splunk, reliably and at scale, all from the AWS console.

This integration complements the existing data ingestion capabilities of the Splunk Add-on for Amazon Web Services and the Lambda blueprints for Splunk, and brings a range of additional advantages:

  • Fully managed service with serverless architecture: no need to manage and scale your own data collection nodes; the service handles this for you.
  • Greater reliability and scalability: leverages HTTP Event Collector indexer acknowledgement and natively handles back pressure by persisting undelivered data to an S3 bucket.
  • Well integrated with AWS data sources such as VPC Flow Logs, CloudWatch Logs, CloudWatch Events, and AWS IoT.
  • Easy to use, with no programming required.
  • Ability to transform raw data before sending it to Splunk: use a Lambda function as needed to normalize the data prior to indexing it in Splunk.
  • Simplified architecture: stream your data directly to Splunk indexers, with no forwarders to set up and manage.
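The Lambda-based transformation mentioned in the list above follows Firehose's standard record-transformation contract: the function receives base64-encoded records and must return each record with a result of "Ok", "Dropped", or "ProcessingFailed". A minimal Python sketch; the key-lowercasing step is a hypothetical normalization, stand in your own logic as needed:

```python
import base64
import json

def handler(event, context):
    """Kinesis Firehose data-transformation Lambda: decode each record,
    normalize it, and re-encode it. Records marked "Ok" continue on to
    Splunk; records marked "Dropped" are discarded."""
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))
        # Hypothetical normalization: lowercase field names before indexing.
        normalized = {key.lower(): value for key, value in payload.items()}
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(json.dumps(normalized).encode()).decode(),
        })
    return {"records": output}
```

Firehose invokes the function with a batch of records and matches results back by `recordId`, so every incoming record must appear exactly once in the returned list.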


Architecture Highlights

This is fully managed ingestion: there is no operational overhead of setting up and running data collection nodes. To scale out, simply add more HTTP Event Collector nodes behind a load balancer. This architecture streams data directly to your Splunk indexing tier, whether you are using Splunk Cloud or Splunk Enterprise, and it leverages HTTP Event Collector indexer acknowledgement for greater reliability.
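For illustration, the Splunk destination of a delivery stream can also be defined programmatically. A sketch of the `SplunkDestinationConfiguration` accepted by boto3's `create_delivery_stream`; the endpoint, token, and ARNs are placeholders, and the backup mode and acknowledgement timeout correspond to the back-pressure handling and indexer acknowledgement described above:

```python
def splunk_destination_config(hec_endpoint, hec_token,
                              backup_role_arn, backup_bucket_arn):
    """Build the SplunkDestinationConfiguration passed to
    firehose.create_delivery_stream(). Undelivered events are persisted
    to the S3 bucket; the acknowledgement timeout bounds how long
    Firehose waits for the HEC indexer acknowledgement."""
    return {
        "HECEndpoint": hec_endpoint,          # e.g. https://hec.example.com:8088
        "HECEndpointType": "Raw",
        "HECToken": hec_token,
        "HECAcknowledgmentTimeoutInSeconds": 180,
        "RetryOptions": {"DurationInSeconds": 300},
        "S3BackupMode": "FailedEventsOnly",   # persist only undelivered data
        "S3Configuration": {
            "RoleARN": backup_role_arn,
            "BucketARN": backup_bucket_arn,
        },
    }

# Hypothetical usage (requires AWS credentials and pre-created IAM/S3 resources):
# import boto3
# firehose = boto3.client("firehose")
# firehose.create_delivery_stream(
#     DeliveryStreamName="splunk-delivery-stream",
#     DeliveryStreamType="DirectPut",
#     SplunkDestinationConfiguration=splunk_destination_config(...),
# )
```

Setting `S3BackupMode` to `"AllEvents"` instead would persist a copy of every record to S3, not just the ones Splunk failed to acknowledge.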


How can I get started?

The integration is already available in your AWS console. Whether you are on Splunk Cloud or Splunk Enterprise, just download the Splunk Add-on for Kinesis Firehose, deploy it on your Splunk cluster, and you're ready to start configuring.
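Once a delivery stream is configured, sending data into it is a single API call. A hedged sketch with boto3; the stream name and event fields are placeholders:

```python
import json

def build_record(event_dict):
    """Serialize an event as newline-terminated JSON, one event per
    Firehose record, ready for put_record()."""
    return {"Data": (json.dumps(event_dict) + "\n").encode("utf-8")}

# Hypothetical usage (requires AWS credentials and an existing stream):
# import boto3
# firehose = boto3.client("firehose")
# firehose.put_record(
#     DeliveryStreamName="splunk-delivery-stream",
#     Record=build_record({"source": "test", "message": "hello Splunk"}),
# )
```

In practice most data arrives from the integrated AWS sources listed earlier rather than from direct `put_record` calls, but this is a quick way to verify the pipeline end to end.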

For more details on configuration steps, please refer to the documentation for the Splunk Add-on for Kinesis Firehose.

Please give this integration a try. We're happy to hear your feedback, and happy Splunking!

Posted by Elias Haddad

Elias is an Emerging Market Presales Architect working out of the Dubai office. Prior to that, he was a Product Manager responsible for Splunk data ingestion and held various pre-sales, post-sales and business development positions. Elias lives in Dubai and graduated from Purdue University with a master’s degree in computer engineering.