Splunking F1: Part One

Here at Splunk, we are always on the lookout for new and exciting sources of data to get our hands on. When an opportunity arose to demonstrate Splunk to a prominent Formula One team, it motivated us to search for a data set we could use to tailor a demonstration of Splunk's value. The suggestion of Formula One racing simulators came from a conversation with an exemplary individual whom I will refer to as Dave. Dave, a keen Formula One enthusiast, had identified a new capability in the F1 2016 PS4 game: after discovering that telemetry data could be sent via UDP to third-party applications, he had embarked on a personal project to consume and analyse this data in Splunk.

How it works

Racing simulators have evolved considerably in recent years, adding real-world variables such as fuel usage, damage, tyre properties, suspension settings and more. F1 2016 introduced the feature to expose such metrics via UDP to external devices such as D-BOX, motion platforms, steering wheels and LED devices. The game can be configured to broadcast real-time telemetry data every tenth of a second - equivalent to that of a real-world F1 car - to the local network subnet, or to send UDP traffic to a specific host and port. Each UDP packet sent includes a char array containing the telemetry data in binary format. Splunk as a machine data platform is well equipped to take advantage of the plethora of data on offer, thus providing the basis for an exciting new analytics project.
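As a minimal sketch of the receiving side described above, the listener only needs a bound UDP socket; each datagram that arrives is one complete telemetry packet. Note that 20777 as the default port is a community-reported value for the F1 games and the port is configurable in-game, so treat it as an assumption:

```python
import socket

# 20777 is the UDP port commonly reported for the F1 games' telemetry
# output -- an assumption here, as the port is configurable in-game.
F1_UDP_PORT = 20777

def open_telemetry_socket(port: int = F1_UDP_PORT) -> socket.socket:
    """Bind a UDP socket able to receive the game's telemetry packets."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Allow the listener to rebind quickly after a restart.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    # "" binds all interfaces, so subnet broadcasts are received too.
    sock.bind(("", port))
    return sock

def receive_packet(sock: socket.socket, bufsize: int = 2048) -> bytes:
    """Block until one telemetry datagram arrives; return its raw bytes."""
    data, _addr = sock.recvfrom(bufsize)
    return data
```

Because UDP is connectionless, the game needs no knowledge of the listener when broadcasting to the subnet; the socket simply picks up whatever datagrams arrive on the port.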

Any data can be brought into Splunk, but it needs to be in a textual, human-readable format for us to comprehend it. To intercept and decode the UDP traffic, we implemented a simple Splunk modular input that listens on a socket, unpacks the char array, reformats the data as CSV, and writes it to Splunk via the Python SDK. CSV is particularly efficient as it minimises the raw event size and allows Splunk to easily learn the structure of the dataset.
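The unpack-and-reformat step can be sketched with the standard-library `struct` module. The F1 2016 packet is a flat block of little-endian 32-bit floats; the field names and ordering below follow the community-documented layout of the first few fields and are illustrative assumptions, not the full packet specification:

```python
import struct

# Illustrative field names for the leading floats of an F1 2016
# telemetry packet (assumed from community documentation, not official).
FIELDS = ["time", "lap_time", "lap_distance", "total_distance", "speed"]

def csv_header() -> str:
    """Return the CSV header row matching packet_to_csv()."""
    return ",".join(FIELDS)

def packet_to_csv(packet: bytes) -> str:
    """Unpack the leading little-endian floats of a packet into one CSV line."""
    values = struct.unpack_from("<%df" % len(FIELDS), packet, 0)
    return ",".join("%.3f" % v for v in values)
```

A modular input would call `packet_to_csv()` on each datagram and hand the resulting line to the Python SDK's event writer, keeping each raw event compact.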

We were able to save significant time and effort by using the Splunk Add-on Builder. The tool helps developers configure data inputs, create a setup page, and ensure adherence to best practices, rather than having to manually edit and manage Splunk configuration files. When building modular inputs, it provides a series of helper classes which further simplify the effort involved.

All in all, including the copious amounts of "testing" of the F1 2016 game, we completed the data ingestion component of the project within a day. We will be publishing the TA on Splunkbase in the near future; in the meantime, the source is available on GitHub.

SplunkLive! F1 Challenge London

As with many types of data in Splunk, you typically find that the same data can be used in a variety of different ways, and for different audiences - each use case defined by the lens we place on the data. Our project commenced as a straightforward demonstration of real-time ingestion of the F1 telemetry data, with a sequence of dashboards to analyse the race data. The opportunity then presented itself to use the F1 data for a different purpose at this year's SplunkLive! London and Paris events.

Stay tuned for part two of this blog to discover how the data unravelled the unlikely event of a tie at SplunkLive! London.

[Image: SplunkLive! F1 Challenge leaderboard]

----------------------------------------------------
Thanks!
Jon Varley
