Splunk Delivers Real-Time Operational Intelligence to Newly Announced AWS IoT Service

The big news of the day is that AWS is officially in the Internet of Things business. This announcement is bound to make a major impact on the IoT, and it is exciting news for those building IoT solutions: they can now take advantage of the benefits AWS provides in time-to-value, security, and scalability for device-to-cloud and cloud-to-device infrastructure.

This is also great news for Splunk’s IoT team. Our existing partnership with AWS provided a great starting point for making Splunk Enterprise AMIs and Splunk Cloud an easy-to-deploy, easy-to-use solution for the massive amounts of machine data bound to be created by AWS IoT applications. We’ve worked with the AWS IoT team to make sure that Splunk software is ready to go for your AWS IoT environment.

We think this partnership is great news for the IoT as well. As we announced at .conf2015 last month, Splunk Enterprise 6.3 is our best platform yet for bringing value to data generated by the IoT. Splunk’s core ability to turn machine data into valuable insights will add great value to applications built in AWS IoT for both developers and end-users.

[Image: Splunk machine data flowchart]

We’ll be putting out some technical content around the “how-to” with AWS IoT, but the goal here is to explain what Splunk software plus AWS IoT can bring to your IoT solution, and to show you just some of the exciting analytics and visualizations we bring to the table for any data from any source. The dashboards and data we’ll be using as examples here were built together with the AWS IoT team, and are being demonstrated live at AWS re:Invent. If you are there, be sure to stop by the Splunk booth, where the team can direct you to the nearest demonstration!

Diagnostic, Transaction and Sensor data (oh, my!)

The first thing to understand is that our integration with AWS IoT is built upon our new HTTP Event Collector, which allows high-velocity (millions of events per second) streaming of all kinds of structured, semi-structured, and even unstructured data to Splunk from many (potentially millions of) connections. There’s a great overview of the HTTP Event Collector on our blog. The gist is that you can now deliver token-authenticated events, messages, and key-value pairs of data directly to Splunk. This new technology allows scalable and secure data collection from AWS IoT applications as well as from many other data sources like Xively, Octoblu, and Docker, or even your own custom applications. By integrating the HTTP Event Collector with AWS Lambda, we’ve made available a plug-and-play pathway to deliver real-time streaming data from AWS IoT: devices and shadow objects in AWS IoT run rules that pipe data to Lambda, and that data can now easily be forwarded on to Splunk. So what to send?
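As a rough sketch of what that pathway can look like, here is a minimal AWS Lambda handler in Python that wraps an AWS IoT rule payload in the HTTP Event Collector envelope and posts it to Splunk. The endpoint URL, token, and field names are placeholders for illustration, not values from the demo.

```python
import json
import urllib.request

# Placeholder values -- substitute your own Splunk HEC endpoint and token.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def build_hec_event(record):
    """Wrap a raw AWS IoT message in the HEC event envelope."""
    return {
        "time": record.get("time"),            # epoch timestamp from the device
        "host": record.get("sensorId", "unknown"),
        "sourcetype": "aws:iot",
        "event": record,                       # full payload becomes the event body
    }

def lambda_handler(event, context):
    """Entry point invoked by an AWS IoT rule action via Lambda."""
    payload = json.dumps(build_hec_event(event)).encode("utf-8")
    req = urllib.request.Request(
        HEC_URL,
        data=payload,
        headers={"Authorization": "Splunk " + HEC_TOKEN},
    )
    with urllib.request.urlopen(req) as resp:
        return {"status": resp.status}
```

The token-based `Authorization` header is what makes the collection both scalable and secure: no Splunk credentials ever live on the device side.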

Diagnostics: Your AWS IoT devices and applications will be running code. Lots of code. And developing, testing, deploying, and maintaining large-scale, highly distributed, and interconnected applications is extremely challenging, especially when they are running on thousands or even millions of widely deployed devices. Instrument your AWS IoT code and deliver the logs to Splunk, however, and you can start to better understand how your applications are operating in the real world. Take a look at what Splunk has to offer for application delivery and developer operations, and imagine what those capabilities can bring to a complex IoT environment. But don’t take my word for it: check out this session presented at .conf2015 by Orion Labs on this very subject.
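As a hypothetical illustration of that kind of instrumentation, a device application might emit diagnostics as structured, time-stamped JSON so Splunk can extract the fields automatically at search time. The field names and values below are invented for the example.

```python
import json
import time

def diagnostic_event(device_id, level, message, **fields):
    """Build a structured, time-stamped diagnostic log line for a device app."""
    record = {
        "time": time.time(),   # epoch seconds, so Splunk can timestamp the event
        "device": device_id,
        "level": level,
        "message": message,
    }
    record.update(fields)      # extra context, e.g. battery level, firmware version
    return json.dumps(record)

# Example: log a low-battery warning with extra context fields.
line = diagnostic_event("SENSOR167", "WARN", "battery low",
                        battery=12, firmware="1.4.2")
```

Because every field is a key-value pair, searches like "all WARN events from devices below 20% battery" need no custom parsing.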

Transactions: If it’s the IoT, people are going to be interacting with your devices, your devices may interact with each other, and ultimately (hopefully in the friendly-robot, not scary-robot sense) your devices will be interacting with humans and the rest of the physical world. Automation process steps, purchases, keypad entries, and more can create a machine data event at the moment they occur. Each of these interactions can be logged to Splunk, and the more information that is included in the event, the better the understanding you can build of the real-world behaviors of your devices and customers. Hear more about how Coca-Cola uses Splunk software to analyze transactional data coming from vending machines and Freestyle machines in this clip from .conf2014…

… and if you didn’t happen to click the “friendly robot” link above, here’s what Target is doing with Splunk to analyze the process transactions from their distribution center automation and robotics systems. (Don’t click the “scary robot” link).
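To make the idea of transaction events concrete, here is a hypothetical sketch of how a connected vending machine might render an interaction as a key=value machine data event at the moment it occurs. The machine ID, fields, and values are invented for illustration.

```python
import time

def transaction_event(machine_id, action, **details):
    """Render a device interaction as a key=value machine data event string."""
    fields = {"time": round(time.time(), 3), "machine": machine_id, "action": action}
    fields.update(details)     # whatever context the interaction carries
    # key=value pairs are picked up by Splunk's automatic field extraction
    return " ".join(f"{k}={v}" for k, v in fields.items())

# Example: a purchase logged the instant it happens.
event = transaction_event("VEND-042", "purchase",
                          item="cola", price=1.50, payment="card")
```

The richer the event (item, price, payment method), the more real-world behavior you can reconstruct later from the data alone.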

Sensor Data: IoT devices interact with the physical world, using sensors to gather information such as temperature, pressure, humidity, and light, and even sound and electrical measurements such as decibels, watts, and amps. Delivering real-time sensor data to Splunk software allows that data to be used in both real-time and historical statistical and time-series analytics, and sensor readings can be used to detect anomalies in environmental conditions and alert or take action. Use Splunk’s powerful search processing language to enrich sensor data with metadata from sources outside of AWS IoT, and run ad-hoc correlations of sensor data with other unstructured and semi-structured data in your environment. You can now perform powerful real-time statistical and time-series analytics on data across all of your applications and services, including AWS IoT. Need advanced analytics? Take a look at the new Machine Learning Toolkit and Showcase App on Splunkbase to see how you can easily integrate powerful ML libraries with SPL to analyze the sensor data generated by your applications and devices. Several customers and partners, including Denver Water, Robotron, and Infigo, presented at .conf2015 on how Splunk software has helped them tame the sensor data in their industrial and IoT environments; check out their presentations for real-world examples of the value Splunk software can bring to the sensor data landscape.
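To make the anomaly-detection idea concrete, here is a small sketch (plain Python, not Splunk's ML Toolkit) that flags sensor readings deviating sharply from a trailing baseline. The window size and sigma threshold are arbitrary illustrative choices.

```python
from collections import deque
from statistics import mean, stdev

def find_anomalies(readings, window=5, sigmas=3.0):
    """Flag readings more than `sigmas` std deviations from the trailing window mean."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sd = mean(history), stdev(history)
            if sd > 0 and abs(value - mu) > sigmas * sd:
                anomalies.append((i, value))
        history.append(value)
    return anomalies

# A stable temperature series with one sudden spike (think: a flashpoint).
temps = [21.0, 21.2, 20.9, 21.1, 21.0, 21.2, 175.0, 21.1]
spikes = find_anomalies(temps)   # the spike at index 6 is flagged
```

In production you would run this kind of logic as a real-time alert over the incoming stream rather than over a finished list, but the statistical idea is the same.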

Finally, let’s take a look at the demo we built with the AWS IoT team for AWS re:Invent to see how this all comes together. The demo shows how Splunk software handles the collection, delivery, storage, analysis, and visualization of data coming from hypothetical AWS IoT-connected wildfire sensors deployed in a fictional forest fire zone.

The raw data coming from AWS IoT is perfect for analyzing with Splunk software, as it meets two important requirements: it’s time-stamped and human readable. Let’s take a look at a raw AWS IoT event from this demo ingested by Splunk Cloud:

{
    "battery": 75,
    "gas": 220,
    "humidity": 40,
    "lat": 36.3408156345,
    "lon": -118.481066513,
    "sensorId": "SENSOR167",
    "state": 0,
    "temp": 333.85,
    "time": 1442425044.13
}

Important information delivered includes a timestamp (the time the event was generated and the sensor data was collected), a device identifier (which device sent the information), a number of sensor readings like gas, humidity, and temperature, and even some diagnostic information like battery level and state. In this one simple event we capture a solid picture of the operations of a device in the field at a particular moment in time. And in case you thought I was going to skip right over it, some of the most valuable information contained in the event is in those lat and lon fields. This is the IoT, folks, and data has not only a time dimension but also a location dimension. Stats over time is cool, but stats by location is amazing, and can power some striking visualizations like this burn map created with the new geo-lookup and choropleth mapping features we just released in Splunk Enterprise 6.3. To find out more about how this works, check out this blog by Michael Porath.
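As a rough illustration of "stats by location", here is a plain-Python sketch that bins events into a lat/lon grid and averages temperature per cell, which is essentially the aggregation behind a burn or heat map. The grid size and the sample events are invented for the example.

```python
from collections import defaultdict

def stats_by_location(events, cell_deg=0.1):
    """Average the `temp` field per lat/lon grid cell of `cell_deg` degrees."""
    cells = defaultdict(list)
    for e in events:
        # Snap each coordinate down to its grid-cell corner.
        key = (round(e["lat"] // cell_deg * cell_deg, 4),
               round(e["lon"] // cell_deg * cell_deg, 4))
        cells[key].append(e["temp"])
    return {cell: sum(t) / len(t) for cell, t in cells.items()}

# Hypothetical readings: two nearby sensors plus one farther away.
events = [
    {"sensorId": "SENSOR167", "lat": 36.3408, "lon": -118.4811, "temp": 333.85},
    {"sensorId": "SENSOR168", "lat": 36.3455, "lon": -118.4790, "temp": 290.15},
    {"sensorId": "SENSOR201", "lat": 36.8120, "lon": -118.9031, "temp": 294.05},
]
grid = stats_by_location(events)   # two cells: one hot, one cooler
```

A choropleth map is then just this per-cell aggregate painted onto region shapes, with color intensity driven by the averaged value.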

[Image: Splunk Enterprise wildfire intensity map]

If you happen to be at AWS re:Invent, you likely saw one of the most amazing visualizations of all – sensor data by location over time – and here’s a beautiful (albeit frightening) playback of this sensor data: produced and delivered by AWS IoT; ingested, stored, enriched, and analyzed by Splunk; and visualized using CartoDB. If you are interested in using CartoDB visualizations right from Splunk software, be sure to check out CartoDB’s Torque App for Splunk.

And if you’re not at AWS re:Invent, here are some additional views into how it all came together:

[Image: Using machine data to monitor fictional wildfire progression]

[Image: Individual sensor readings for hundreds of sensors]

[Image: Flashpoint geo-analytics with Splunk (first sensor out of hundreds to hit 175ºC)]

[Image: Full operational view into device and sensor operations]

So take a look at AWS IoT and fire up (no pun intended) a Splunk 6.3 AMI. I’d love to hear about what you accomplish with AWS IoT and Splunk, and as always, feel free to reach out to me. All contact info below.



Posted by Brian Gilmore

Brian currently focuses on inspiring and enabling Splunk’s partners and 18,000+ worldwide customers to use the data from connected devices and assets to improve availability, performance, and security in their businesses. With a career path that has spanned music, hospital administration, marine biology, and industrial automation, he’s really happy he hasn’t run out of industries that want to use data more effectively to improve both business outcomes and the human condition.
