How to Splunk Data from a Raspberry Pi – Three Easy Steps!

Last week we announced a release of our Universal Forwarder for Linux ARM (Raspberry Pi), which greatly simplifies the process of taking any data from a Raspberry Pi, including GPIO or other connected sensors and attached shields, and sending that data to Splunk. While this task was not impossible before, the process involved manually streaming data to the Splunk API. Using a Universal Forwarder, you can take advantage of Splunk’s end-to-end data distribution mechanisms with no need to understand or develop against Splunk’s API.

You can also take advantage of Splunk’s built-in remote management and maintenance through its distributed deployment features. With these features you could update inputs and manage scripts on thousands of distributed Pis, collecting all of their data remotely and forwarding it to a central Splunk indexer. The potential is limited only by your imagination, and we will cover scenarios like this in future posts. For now, we will focus on building a super simple data connection that sends CPU data from a single network-connected Pi to a single instance of Splunk running on a network-connected laptop or desktop.

Step 1: Install and configure Splunk on your laptop or desktop.

First, download Splunk 6 from our download site. Install it using the documentation and default settings. Once Splunk is installed, open a browser and go to http://localhost:8000. You will be presented with a login page. For the first login, use the username “admin” and password “changeme”.

Once you login, you will see a popup describing new features in Splunk 6. Close this out and you will be ready to configure!

In the upper right, click the “Settings” dropdown. Under Data, click “Forwarding and receiving”, and you will be taken to the configuration page where you can set Splunk to listen for data from your Pi.

Click “Configure receiving”, and you will be taken to the receive data configuration page. Assuming this is a brand new installation of Splunk, you will have no configurations yet. Click “New” and you will be taken to the new configuration wizard. For now, we will just add a new listener on port 9997 and click Save.

Splunk is now listening for data from a Splunk Universal Forwarder on port 9997. You will need to make sure that the machine your Splunk instance is running on has the appropriate firewall ports open. Congratulations, your Splunk instance is ready to receive data from your Pi!
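As a sketch, on a Linux indexer the firewall rule and a CLI alternative to the web UI setup might look like this (ufw is an assumption here; use whatever firewall manager your OS actually provides):

```shell
sudo ufw allow 9997/tcp                      # open the receiving port in the firewall
$SPLUNK_HOME/bin/splunk enable listen 9997   # CLI alternative to "Configure receiving"
```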

Step 2: Install the Splunk Universal Forwarder on your Raspberry Pi.

Download the Universal Forwarder from https://splunkbase.splunk.com/app/1611/ to your Pi. Despite the “installation instructions” mentioned during the download process, this distribution is actually a Splunk installable, and NOT a Splunk app. For my Pi, running Raspbian, I followed http://docs.Splunk.com/Documentation/Storm/Storm/User/Setupauniversalforwarderonnix word for word and it worked out great. One important thing to know when installing the Universal Forwarder on *nix is that the default install does NOT autorun on boot. You can set it to autostart by running the following as root: $SPLUNK_HOME/bin/splunk enable boot-start, but please see the documentation at http://docs.Splunk.com/Documentation/Splunk/6.0/Admin/ConfigureSplunktostartatboottime for more detailed instructions.

There are all kinds of tweaks you can apply to the forwarder, but for simplicity’s sake we will leave everything at its defaults. To start Splunk on your forwarder, navigate to $SPLUNK_HOME/bin/ and run ./splunk start. You’ll see the standard startup output. At the next prompt, run ./splunk version, and you should see the version output for ARM Linux. Congratulations, the Splunk Universal Forwarder is running on your Raspberry Pi!
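Assuming the default install location of /opt/splunkforwarder (your path may differ), the whole start-and-verify sequence looks roughly like this:

```shell
cd /opt/splunkforwarder/bin
./splunk start --accept-license   # first start; accepts the license prompt non-interactively
./splunk version                  # should report an ARM Linux build
sudo ./splunk enable boot-start   # optional: start the forwarder automatically at boot
```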

Step 3: Configure scripts and Splunk Universal Forwarder .conf files to send data to Splunk.

IMPORTANT: These directions are a quick start, not best practices. For complete information, including best practices, please refer to the complete documentation at http://docs.Splunk.com/Documentation/Splunk/6.0/Data/Setupcustominputs#Add_a_scripted_input_via_inputs.conf.

Configuring the Pi to send data to your laptop is as easy as adding a script to a directory and editing two configuration files. That’s really it. For this example I’ll demonstrate how I wrote a simple bash script to query the system for CPU temperature and voltage. One thing I made sure of when writing the script is that I formatted the data in a way that lets Splunk automatically extract timestamp and field information. While this is not strictly necessary, because I’m writing this script specifically to send data to Splunk, it’s really a no-brainer, as it saves me additional configuration file edits and field extractions on the indexer. For more information on this type of logging, please see our logging best practices at http://dev.splunk.com/view/logging-best-practices/SP-CAAADP6.

So here is my simple script (cpu.sh); I created it in $SPLUNK_HOME/bin/scripts:

#!/bin/sh
# Sample CPU temperature and voltage once per second for 60 seconds.
for c in $(seq 1 60)
do
    TIME=$(date +"%D %T.%3N %Z")
    TEMP=$(vcgencmd measure_temp)
    VOLTS=$(vcgencmd measure_volts)
    # vcgencmd returns strings like temp=48.7'C and volt=1.2000V;
    # strip the trailing unit so Splunk sees clean key=value pairs.
    CPU_TEMP=$(echo "$TEMP" | cut -d \' -f 1)
    VOLTS=$(echo "$VOLTS" | cut -d V -f 1)
    echo "time=$TIME|$VOLTS|$CPU_TEMP"
    sleep 1
done

Complex, huh? Not really. Here’s the thing, though: this script can be anything, as simple or as complex as you need. As long as it runs from a command line (the universal forwarder uses environment variables to run the scripts), any information printed to standard output will be sent to the Splunk indexer (after we configure the .conf files to schedule the script and send the data). So echo is all we need in this case to push ANY data from the Pi to our indexer!
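To sanity-check the field-stripping logic without a Pi at hand, you can replay canned vcgencmd-style strings through the same cut pipeline the script uses (the sample readings here are made up):

```shell
# Hypothetical sample readings in the format vcgencmd emits
TEMP="temp=48.7'C"
VOLTS="volt=1.2000V"

# Strip the trailing unit characters, exactly as cpu.sh does
CPU_TEMP=$(echo "$TEMP" | cut -d \' -f 1)
VOLTS=$(echo "$VOLTS" | cut -d V -f 1)

echo "$VOLTS|$CPU_TEMP"
```

This prints the pipe-delimited key=value payload that Splunk can extract fields from automatically.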

We now need to schedule our script and configure the forwarder to forward our data. In $SPLUNK_HOME/etc/system/local/inputs.conf, I’ve edited the file to read (based on my Splunk install):

[default]
host = raspberrypi
[script:///opt/splunkforwarder/bin/scripts/cpu.sh]
index = main
interval = 60
source = cpu_details
sourcetype = pi_system_scripts

For detailed information, please see http://docs.Splunk.com/Documentation/Splunk/6.0/Data/Setupcustominputs#Add_a_scripted_input_via_inputs.conf. Basically, we have directed the universal forwarder to create an input of type script, pointed the input at the specific script, and then set a few parameters: directions for the indexer (which index to store the data in), the frequency at which to rerun the script (60 seconds), and predefined values for the sourcetype and source.
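To confirm the forwarder actually picked up the stanza, btool will print the merged inputs configuration (the path here assumes the default install location of /opt/splunkforwarder):

```shell
/opt/splunkforwarder/bin/splunk btool inputs list script --debug
```

The --debug flag shows which .conf file each setting came from, which is handy when a stanza in another directory is overriding yours.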

Finally, we need to configure outputs.conf to send the data to our specific indexer. I’ve edited the outputs.conf file in $SPLUNK_HOME/etc/apps/SplunkUniversalForwarder/default, but again, definitely take a look at the documentation for best practices.

#   Version 6.0
[tcpout]
defaultGroup=my_indexers
[tcpout:my_indexers]
server=XX.XXX.XX.XXX:9997

Where XX… is the IP address of my indexer. So with a quick restart of the universal forwarder on the Pi with the command ./splunk restart, we are getting data on our indexer. That’s all it takes!
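If data doesn’t show up right away, the forwarder can report whether its connection to the indexer is active (again assuming the default /opt/splunkforwarder install path):

```shell
cd /opt/splunkforwarder/bin
./splunk restart
./splunk list forward-server   # shows active and configured forward targets
```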

Hope this helps you get your Pi-to-Splunk project up and running. Please experiment. Keep in mind that you can use this same process to send any data from many Pis to a single indexer; scalability and flexibility are what Splunk’s Universal Forwarder is all about. If you have any questions or comments, please send me a tweet at @BrianMGilmore, and keep an eye out for future blogs on working with Splunk and data from the Raspberry Pi!

----------------------------------------------------
Thanks!
Brian Gilmore
