What is Splunk Virtual Compute (SVC)?

A Splunk Virtual Compute (SVC) unit is a powerful component of our workload pricing model. Historically, we priced purely on the amount of data sent into Splunk, which led some customers to limit data ingestion to avoid paying for high-volume data with low reporting requirements. With Splunk workload pricing, you now have ultimate flexibility and control over your data and cost: you pay for the value you receive from your data, based on the workloads performed against the data ingested into Splunk.

Let’s take this opportunity to get specific on what makes up workload pricing, what SVCs are, and how to size, monitor, and manage workload as your needs grow with Splunk. 

The Workload Based Pricing Components of Splunk Cloud Platform

The Splunk Cloud Platform Workload Pricing model comprises two components: 

  1. Splunk Virtual Compute units (SVCs): the compute and related resources required to support search and ingest workloads
  2. Storage Blocks: the number of terabytes of storage required to meet your data retention policies 

More About SVC

In the workload pricing model, the price you pay is a function of the resources needed to drive the workloads you want to accomplish, as well as the storage needed for the data you want to analyze. Splunk Virtual Compute (SVC) is the unit of measure for the first part of that equation: the resources needed to drive your workloads. 

Examples of workloads are compliance, data lake, basic reporting, ad-hoc investigations, and continuous monitoring. Each of these workloads draws on two primary factors — search and ingest — and each workload has its own profile of the two. See the graphic below for some of the most popular workload profiles:  

By splitting out the search and ingest factors, you now have far more flexibility in how you utilize your resources. You can see your SVC utilization in the Splunk Cloud Monitoring Console (CMC), which provides full visibility into resource consumption.

More About Storage Blocks

With workload pricing we’ve decoupled the compute and storage components, and you can now tailor each independent of the other. Storage blocks are purchased for however long you’d like to store your data. 

Splunk Cloud offers the following storage types in 500 GB blocks to account for a diverse set of use cases and retention schemes:

  • DDAS (Dynamic Data Active Searchable): readily searchable data in Splunk Cloud.
  • DDAA (Dynamic Data Active Archive): a Splunk-managed data archive that automates rehydrating data back into Splunk Cloud. 
  • DDSS (Dynamic Data Self-Storage): customer-managed data storage in which you must rehydrate data into your own environment to search it.  

See this blog, Dynamic Data: Data Retention Options in Splunk Cloud, to learn more about DDAA and DDSS.

How Does Splunk Deliver SVCs?

An SVC is a unit of compute and related resources that provides a consistent level of search and ingest equal to the SVC performance benchmark. This Splunk-created benchmark ensures that SVCs continue to provide the same or better levels of performance as underlying infrastructure or software configurations evolve. One big benefit to our customers is the flexibility this provides to take advantage of Splunk Cloud Platform performance improvements and new cloud-based platform services.

How is SVC Consumption Measured?

SVC is a unit of cloud compute, IO, and memory resources of the Indexers and Search Heads in your Splunk environment. As of Splunk Cloud Platform 8.2.2106 with CMC version 2.3.3, we capture SVC utilization measurements for each machine every few seconds. We calculate SVC usage for your Splunk Cloud Platform environment by aggregating these utilization measurements across all the machines for each hour. When aggregating the granular measurements into hourly numbers, we take care to remove the effects of unexpected outliers. You can view your hourly SVC consumption anytime on the Cloud Monitoring Console.
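Splunk performs this aggregation internally, but the idea can be sketched in a few lines. This is an illustrative sketch only — the trimming approach and percentage are assumptions for illustration, not Splunk's actual implementation:

```python
# Illustrative sketch only: Splunk's actual SVC aggregation is internal.
# Assumption: outliers are removed by trimming the extreme samples
# before averaging per-machine measurements into an hourly number.

def hourly_svc_usage(samples, trim_fraction=0.2):
    """Aggregate fine-grained SVC utilization samples into one hourly
    value, trimming the top/bottom extremes as unexpected outliers."""
    ordered = sorted(samples)
    k = int(len(ordered) * trim_fraction)
    trimmed = ordered[k:len(ordered) - k] if k else ordered
    return sum(trimmed) / len(trimmed)

# Example: one hour of per-machine samples, with one spike on machine A.
machine_a = [4.0, 4.2, 4.1, 3.9, 25.0, 4.0]   # transient spike at 25.0
machine_b = [2.0, 2.1, 1.9, 2.0, 2.2, 2.0]

# Environment-wide hourly usage = sum of per-machine hourly values.
total = hourly_svc_usage(machine_a) + hourly_svc_usage(machine_b)
```

The trimming step is what keeps a single transient spike (the 25.0 sample above) from inflating the hourly figure.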

SVC Sizing: How Many Do I Need?

The total number of SVCs you will need is equal to the maximum compute resources used during your peak window of usage. 

If you have high bursts of usage throughout the day that culminate in a very high moment of use but are followed by many low-use periods, don't sweat it! For example, the minute after midnight tends to be a burst of usage, since daily, hourly, and minute-by-minute scheduled searches often all overlap at that time, as in the image below.

You have full control to manage your workload by spreading out any workloads that do not necessarily need to overlap. You can also optimize other factors that drive workload, such as search usage, apps, and the number of users. 
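As an illustration of spreading out workload, here is a hypothetical sketch that staggers a set of scheduled searches that would otherwise all fire at the top of the hour. The search names and cron strings are placeholders for illustration, not a Splunk API:

```python
# Hypothetical sketch: stagger scheduled searches so they don't all
# fire at minute 0. Names and cron strings are placeholders only.

def stagger_schedules(search_names, window_minutes=60):
    """Assign each search an evenly spaced minute offset within the hour,
    returning cron-style schedules instead of everything at '0 * * * *'."""
    step = max(1, window_minutes // max(1, len(search_names)))
    return {name: f"{(i * step) % window_minutes} * * * *"
            for i, name in enumerate(search_names)}

searches = ["error_rate_hourly", "license_usage", "auth_failures", "disk_check"]
# Each of the four searches now runs at a different minute: 0, 15, 30, 45.
schedules = stagger_schedules(searches)
```

Spreading the same searches across the hour leaves total work unchanged but lowers the peak, which is what drives your SVC requirement.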

When your sustained workload consistently approaches your SVC entitlement throughout the day, you need more SVCs to handle that load. 

Workload Pricing and Sizing for New Customers

For those brand new to Splunk, we recommend you begin by identifying a range of SVCs based on your anticipated workloads. Remember, each profile is based on two primary factors: search and ingest. The combination of those factors is what drives your SVC usage. Historically speaking, your search profile will likely be the biggest driver of SVC usage. Below is a customer example showing the types of searches they execute for different use cases and the ingest volume range in relation to SVCs needed to accomplish each. We include a range because SVC usage may vary based on the complexity of the data ingested and searches executed.

Customer example: SVC usage by use case

You can use these example ranges as rough guidelines for sizing your SVC needs. The Splunk sales team can help you estimate the appropriate GB/Day per SVC for your workloads. We use these estimates to help size your SVC purchase. 

As an example, if your primary use case is compliance storage, you may bring in around 35 to 45 GB/day per SVC. 

You can use this formula to calculate the number of SVCs you’ll need based on how efficiently you believe you can operate Splunk:

Total Volume in GB / GB per SVC Ratio = Number of SVCs

For example, if you have 1500 GB of ingest, of which 800 GB is used for compliance storage and 700 GB for continuous monitoring, the SVC sizing calculations would be as follows:

  • Compliance Storage: 800 GB ÷ (35–45 GB/day per SVC) = 18–23 SVCs
  • Continuous Monitoring: 700 GB ÷ (10–20 GB/day per SVC) = 35–70 SVCs
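The arithmetic above can be sketched as a small helper. The ingest volumes and GB/day-per-SVC ranges are the illustrative examples from this article, not guaranteed ratings — confirm actual ratios with your Splunk account team:

```python
# Sketch of the sizing formula: Total Volume in GB / GB per SVC Ratio = SVCs.
# The GB/day-per-SVC ranges below are illustrative example ranges only.
import math

def svc_range(ingest_gb, gb_per_svc_low, gb_per_svc_high):
    """Return the (min, max) number of SVCs needed for a daily ingest volume,
    rounding up since you can't buy a fraction of an SVC."""
    low = math.ceil(ingest_gb / gb_per_svc_high)   # efficient end of the range
    high = math.ceil(ingest_gb / gb_per_svc_low)   # conservative end
    return low, high

compliance = svc_range(800, 35, 45)   # -> (18, 23)
monitoring = svc_range(700, 10, 20)   # -> (35, 70)
```

Note that the more efficiently you operate (more GB/day handled per SVC), the fewer SVCs you need for the same ingest volume.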

Existing Splunk Enterprise Customer Sizing

For existing on-premises or Bring Your Own License (BYOL) Splunk Enterprise customers, we have published a Splunk Cloud Migration Assessment App that will help you collect data points and automate your assessment. This is extremely useful if you already have a Splunk deployment that addresses several different use cases.

Existing Splunk Cloud Platform Ingest Customer Sizing

Existing Splunk Cloud Platform ingest customers looking to migrate to workload can work directly with their Splunk account team. Splunk Cloud Platform already has all of the metrics needed to assist your team with the recommendation.

To Sum It Up...

It’s easy to see how the workload pricing model allows you to truly unlock the power of the Splunk Cloud Platform. This is because your investments in Splunk are more aligned with the compute power you use to deliver value. Finally, it gives you the flexibility to ingest a lot more data upfront. Now you can go about exploring different use cases without worrying about having to pay per GB — all with complete control to monitor your usage through the Cloud Monitoring Console (CMC).

Read Workload Pricing and SVCs: What You Can See and Control for additional details on what you can see and control in the CMC.

Posted by

Anna Mensing

Anna is the Director of Product Marketing for Splunk’s Platform and Machine Learning products. She works closely with customers to help them understand how their data can reveal insights across Security, Observability and more. She has 12+ years of experience bringing to market SaaS and software solutions in technology and public sector industries. Anna holds an MBA from Duke University and a degree in Systems and Information Engineering from the University of Virginia. Outside of work, Anna enjoys traveling, reading science fiction, trying out new cooking recipes, hiking and exploring the Washington, DC area!
