Welcome to the Future of Data Search & Exploration

You have more data coming at you than ever before. Over the next five years, the total amount of digital data created will be more than twice the amount created since the advent of digital storage.

With your company's success often determined by how well you anticipate and respond to threats and capitalize on meaningful insights, you need the ability to quickly search your data and find those insights, despite this growing deluge of information.

Simultaneously, you need to empower more users than ever before to access and find insights in their data, regardless of their skill level or technical acumen.

And despite more data, more users, and faster queries, you need tools that are ever more cost efficient, to meet IT budgets that seldom scale with data volume and demand.

Introducing a Revolutionized Search Experience

Splunk is uniquely positioned to ride this data wave, with users performing billions of monthly ad-hoc queries against data of all structures and sizes.

Working with numerous customers and drawing on more than 20 years of customer feedback, Splunk is pleased to announce the Public Preview of a complete redesign of its core Search experience, accelerating the data-to-insight workflow and bringing the power of Splunk to everyone.

Faster Time-to-Insight for Advanced Investigations

Robust IDE-Style Code Editor

With the new Search experience, you’re no longer confined to a small search box to write your queries. With a resizable IDE-style editor, you have all the space you need to write complex searches.

Updated and More Powerful Search Language (SPL2)

The new editor supports our updated Search Processing Language (SPL2), a more concise and powerful version of our search language. SPL2 is designed for users coming from a range of query languages, including SPL and SQL, and keeps a familiar syntax. It also introduces a range of powerful capabilities, such as built-in functions, types, and support for comments.
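As a hedged illustration of what that can look like (the dataset and field names below are hypothetical, not from the product), a short SPL2 search might read:

```spl2
// Count server errors by host (hypothetical dataset and fields)
$web_errors = from sample_web_logs
    | where status >= 500             // typed numeric comparison
    | eval host_label = lower(host)   // built-in string function
    | stats count() by host_label
```

The `//` comments, the `lower()` built-in, and the typed comparison in `where` sketch the capabilities mentioned above.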

If you’ve never written SPL searches before – or occasionally forget which command to use or the correct syntax – auto-completion and in-line documentation make it fast and easy to start writing searches.

Thanks to the power of SPL2, you can now run multiple searches within a single Search Module. It’s extremely common to build searches iteratively, or even to branch an investigation. For instance, after preparing a view of the data, an analyst may want to run one set of queries for one analysis and a different set of queries against the same data for another. Previously, this meant a proliferation of browser tabs. With the new Search experience, you can take the output of one search and use it as the starting point for another, all within the same experience!
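A minimal sketch of that branching pattern in a single module (again with hypothetical dataset and field names) might look like:

```spl2
// One module, several statements: later searches start from earlier results
$base = from sample_web_logs
    | where status >= 400

// Branch 1: error volume per host
$errors_by_host = from $base
    | stats count() by host

// Branch 2: a different analysis over the same prepared data
$slowest_requests = from $base
    | sort - response_time
    | head 10
```

Each branch reuses `$base` rather than re-running the preparation steps, which is what replaces the old tab-per-variation workflow.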

Reusable Searches

Every search query you write can be exported as a reusable dataset, making it easy to share specific views of your data.

Integrated Charting

The new Search experience also supports multiple charts. Instead of having to delete and re-write different commands to see various visualizations of your data, it’s easy to point-and-click and create as many charts as you’d like from the same base search.

The Power of Splunk for Everyone

Point-and-Click Search Creation

The ability to search through structured and completely unstructured data is now available to everyone! Users can create complex queries on the fly using easy point-and-click interactivity. Everything from creating filters, to adding fields to display, to building charts, can be done directly through the interface, without ever needing to write SPL2.

Bi-Directional Editing

While Splunk has offered point-and-click tools in the past, such as our extremely popular Table Views, this new Search experience is completely bi-directional. A first-time user can generate a query using point-and-click actions, while an advanced user can edit that query using the code editor. Both users can seamlessly collaborate on the same query.

Save and Share

Once a user has finished searching their data, the entire multi-statement investigation, including charts, can be saved as a Search Module for easy reuse and collaboration. Search Modules can be saved to a user's private workspace, or to a shared workspace where they can collaborate with others.

Get Started Today!

In the words of Aaron Dobbins, President of Ellenby Technologies, “The new Search experience for Splunk Cloud helps us to iterate through questions we have hidden in our data more efficiently than ever before, all without needing deep knowledge of the SPL language.”

Splunk’s new Search experience is now available in public preview, at no extra cost, to Splunk Cloud customers in AWS US-based data centers (with more regions to follow). Register your interest here!
