Splunking F1: Part One

Here at Splunk, we are always on the lookout for new and exciting sources of data to get our hands on. When an opportunity to demonstrate Splunk to a prominent Formula One team came along, we went searching for a relevant data set that would tailor the demonstration to their world. The suggestion of Formula One racing simulators came from a conversation with an exemplary individual whom I will refer to as Dave. Dave, a keen Formula One enthusiast, had identified a new capability in the F1 2016 PS4 game: after discovering that telemetry data could be sent via UDP to third-party applications, he had embarked on a personal project to consume and analyse this data in Splunk.

How it works

Racing simulators have evolved considerably in recent years, adding real-world variables such as fuel usage, damage, tyre properties, suspension settings and more. F1 2016 introduced the ability to expose these metrics via UDP to external devices such as D-BOX, motion platforms, steering wheels and LED devices. The game can be configured to broadcast real-time telemetry every tenth of a second - a rate equivalent to that of a real-world F1 car - either to the local network subnet or to a specific host and port. Each UDP packet contains a char array carrying the telemetry data in binary format. Splunk, as a machine data platform, is well equipped to take advantage of the plethora of data on offer, providing the basis for an exciting new analytics project.
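To give a flavour of what arrives on the wire, here is a minimal Python sketch that listens for those UDP datagrams and unpacks each one as an array of little-endian 32-bit floats. The port number and the field positions mentioned in the comments are illustrative assumptions, not the definitive F1 2016 packet layout.

    import socket
    import struct

    UDP_PORT = 20777  # assumed port; set it to whatever the game is configured to send to

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", UDP_PORT))  # accept broadcast or directed traffic on this host

    while True:
        packet, _addr = sock.recvfrom(2048)      # one telemetry frame per datagram
        n_floats = len(packet) // 4              # the payload is an array of 32-bit floats
        values = struct.unpack("<{}f".format(n_floats), packet[:n_floats * 4])
        print(values[:8])                        # first few fields, e.g. session time and speed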

Any data can be brought into Splunk, but it needs to be in a textual, human-readable format for us to comprehend it. To intercept and decode the UDP traffic, we implemented a simple Splunk modular input to listen on a socket, unpack the char array, reformat the data as CSV, and write it to Splunk via the Python SDK. CSV is particularly efficient here: it minimises the raw event size, and Splunk can easily learn the structure of the dataset.
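For readers curious how those pieces fit together, below is a hedged sketch of such a modular input built on the Splunk Python SDK (splunklib): it receives a datagram, unpacks the char array, joins the values into a single CSV line and hands the event to Splunk. The input name, port and sourcetype are illustrative placeholders; the published TA may well differ.

    import socket
    import struct
    import sys

    from splunklib.modularinput import Event, Scheme, Script


    class F1TelemetryInput(Script):
        def get_scheme(self):
            scheme = Scheme("f1_telemetry")  # illustrative input name
            scheme.description = "Indexes F1 2016 UDP telemetry as CSV events"
            return scheme

        def stream_events(self, inputs, ew):
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            sock.bind(("0.0.0.0", 20777))  # assumed port
            while True:
                packet, _addr = sock.recvfrom(2048)
                n = len(packet) // 4
                values = struct.unpack("<{}f".format(n), packet[:n * 4])
                # A single CSV line keeps the raw event small and easy for Splunk to parse.
                line = ",".join("{:.4f}".format(v) for v in values)
                ew.write_event(Event(data=line, sourcetype="f1:telemetry"))


    if __name__ == "__main__":
        sys.exit(F1TelemetryInput().run(sys.argv))

In practice, the Add-on Builder mentioned below generates much of this scaffolding for you.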

We saved significant time and effort by using the Splunk Add-on Builder. The tool helps developers configure data inputs, create a setup page, and adhere to best practices, rather than editing and managing Splunk configuration files by hand. When building modular inputs, it also provides helper classes that further simplify the work involved.

All in all, including the copious amounts of "testing" of the F1 2016 game, we completed the data ingestion component of the project within a day. We will be publishing the TA on Splunkbase in the near future; in the meantime, the source is available on GitHub.

SplunkLive! F1 Challenge London

As with many types of data in Splunk, you typically find that the same data can be used in a variety of different ways, and for different audiences - each use case defined by the lens we place on the data. Our project commenced as a straightforward demonstration of real-time ingestion of the F1 telemetry data, with a sequence of dashboards to analyse the race data. The opportunity then presented itself to use the F1 data for a different purpose at this year's SplunkLive! London and Paris events.

Stay tuned for part two of this blog to discover how the data unravelled the unlikely event of a tie at SplunkLive! London.

[Image: SplunkLive! F1 Challenge leaderboard]

----------------------------------------------------
Thanks!
Jon Varley
