Splunk > Clara-fication: Dashboarding Best Practices

So you want to build a better dashboard, do you? Well good, you’ve come to the right place! 

Splunk dashboards are amazing. They are incredibly versatile and customizable. Creating a dashboard is incredibly simple and can be done entirely through the UI. If more in-depth customization is required, that can be done in the SimpleXML using HTML panels and in-line CSS, by installing an app from Splunkbase, or by uploading custom JS/CSS. If running Splunk Enterprise version 8.x or above, there’s even a cool Splunk Dashboards app to check out that allows for even more features and utilizes JSON instead of SimpleXML. 

We’re going to dive into some important topics regarding the best practices when creating and maintaining a dashboard. Let’s hop to it!

Naming Standards

Dashboards have both names and ids. It is best to have a naming convention for the id and, if possible, the name. 

One naming standard could be to begin the id and name with a unique identifier, such as the team, customer, or tool the dashboard is for. Implementing a naming convention can make it much easier to find dashboards, and to add dashboards to the navigation app bar. 

Another thing to keep in mind when cloning a dashboard: remember to keep the id unique. Either keep _clone at the end of the id or replace it with _v2, your username, or something similar. If multiple dashboards exist with the same id, even in apps with differing titles, an issue arises where changes made in either dashboard are saved in both dashboards. This is most definitely not a best practice! Make sure you keep your ids unique!

When querying dashboard configurations through the REST API, the endpoint uses the id, not the dashboard title, so keep that in mind when naming, as well.
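As a quick sketch of why the id matters here, the splunkd REST endpoint for a dashboard’s source is addressed by its id. The host, port, owner, app, and the id team_a_cpu_overview below are hypothetical placeholders:

```python
# Build the REST URL for a dashboard's Simple XML source.
# Note that the path ends with the dashboard *id*, never its display title.
from urllib.parse import quote


def dashboard_view_url(host: str, owner: str, app: str, dashboard_id: str) -> str:
    """Return the splunkd data/ui/views endpoint URL for one dashboard."""
    return (
        f"https://{host}:8089/servicesNS/{quote(owner)}/{quote(app)}"
        f"/data/ui/views/{quote(dashboard_id)}?output_mode=json"
    )


# Example: fetch this URL with your preferred HTTP client and credentials.
print(dashboard_view_url("localhost", "admin", "search", "team_a_cpu_overview"))
```

A consistent id naming convention makes these URLs predictable, which is handy when scripting against many dashboards.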

Design and Layout

It would be amazing to have a standard dashboard template for various teams or use cases; however, no two use cases are ever really the same. Just follow these layout and design guidelines and everyone should get along just fine. 

The dashboard should be built for the use case, and nothing more. A dashboard only needs to display the information necessary to its use case. Keep the most important information at the top and related information together. 

If there are any custom visualizations being used, it’s a good idea to open the dashboard on various browsers and change the screen size to ensure that the layout doesn’t change. If the visualization seems buggy, it’s good to either fix the issues or make it known that it might not render properly on certain browsers.

Titles, descriptions, labels, etc. are all a user’s best friend when trying to understand a dashboard. Providing additional documentation with references, such as contacts and field definitions, can be extremely helpful; these can be added in an HTML panel within the SimpleXML dashboard. My personal preference is to create the necessary documentation at an external location and add the link to the dashboard. 

On the topic of adding titles, it is recommended to use title case for dashboard, panel, or element titles (i.e., Capitalize the Important Words). Titles should also be short and to the point.

It is recommended to use sentence case for descriptions and labels (i.e., These are some good looking dashboards!).


  • Don’t have multiple panels displaying the same thing, such as pie charts displaying Last Month and This Month data followed by column charts with the exact same information. 
  • Don’t build a dashboard that is essentially different visualizations of the same data.
  • Don’t keep two separate tables that could very easily be displayed as one.
  • Do give every panel a title and labels, and use legends when appropriate.


  • Combine any duplicated information and remove redundant visualizations. Consolidate the remaining panels and reorganize the dashboard structure to remove excess white space. 
  • Add descriptions under single values if they add value or context.
  • If adding legends to visualizations, adjust their placement and keep the legend keys short, so that they display without ellipses. 

When choosing a visualization for a panel, pick the one that best displays the data. For example, if the data is trending over time, line charts or column charts work well. Use a pie chart if displaying the entire composition for a defined field. There are many types of visualizations available out of the box with Splunk, and even more that can be installed via Splunkbase, so think about how best to display the data and find the right visualization for it.

Dashboard Pointers

Key takeaways from this old fake-beard:

  • With all knowledge objects, version control is so important. One never knows when something might break or when changes will need to be reverted. Use products like Git to help version and peer review code. This doesn’t mean that no development or changes should be done in Splunk, but it really helps to save the final product somewhere, just in case it needs to be referred back to later. 

  • The above leads me to code reviews. With both small and large dashboards, there is so much room for inefficiency. It’s important to have peers review the code to ensure it is built efficiently and works properly, and it never hurts to have others check that the searches are accurate and free of typos. If there is automation in place for this, that is great! If not, it’s not a big deal to do everything live in Splunk and treat these reviews as a backstop.

  • I cannot stress this enough: it’s all about that base. Base searches can increase the efficiency of a dashboard tremendously. If there are two or more panels (including form inputs) that use the same set of data, create a base search that those panels (and/or form inputs) can search from. 
    • Adding just a little more stress from the above, when creating base searches, make sure they transform! Base searches should not, I repeat, should not, be raw events. The base searches should include a transforming command, such as stats, chart, or timechart. This is because there is a limit on the number of events a base search will return when a transforming command is not used. 
    • A caveat with base searches is that you cannot export data from the panels - you would need to open the panel in search and export from there. 
    • Another thing to remember about base searches is that any post-process search utilizing the base search is limited to only the fields and results that the base search produces. 
    • Try running the following search to identify dashboards whose searches might benefit from base searches.
index=_audit host=<host> action=search info=completed provenance=*Dashboard*
| rex field=_raw ", search=\'(?<thesearch>.*)"
| stats dc(search_id) as search_count sum(total_run_time) as total_run_time earliest(thesearch) as search by app provenance savedsearch_name
| cluster showcount=true t=.7 labelonly=true
| fields - _raw
| sort - cluster_count cluster_label
| eventstats sum(total_run_time) as cluster_run_time by cluster_label
| eventstats sum(total_run_time) as stack_total_run_time
| eval cluster_pct=cluster_run_time*100/stack_total_run_time
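To make the base-search advice concrete, here is a sketch of a transforming base search feeding two post-process panels. The index, queries, and panel titles are made up for illustration:

```xml
<search id="base_errors">
  <query>index=web sourcetype=access_combined status>=500 | stats count by host, status</query>
  <earliest>-24h@h</earliest>
  <latest>now</latest>
</search>
<row>
  <panel>
    <title>Errors by Host</title>
    <chart>
      <search base="base_errors">
        <query>| stats sum(count) as errors by host</query>
      </search>
    </chart>
  </panel>
  <panel>
    <title>Errors by Status Code</title>
    <table>
      <search base="base_errors">
        <query>| stats sum(count) as errors by status</query>
      </search>
    </table>
  </panel>
</row>
```

Note that the base search ends in a transforming stats command, and each post-process search references only fields the base search returns.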
  • When creating searches for the dashboard, follow Splunk Search Best Practices.

  • Dashboards do not need to run in real time. Real-time searches consume Splunk resources that could be utilized by other searches. If recent data is needed, another method is to set the latest time to one minute ago and have the panel refresh every minute using the <refresh> option, for instance. 

  • The above leads me to refresh options. Both the entire dashboard as well as individual panels can be refreshed automatically. Use this only when needed. These options are great for when a dashboard is always on a monitoring display and needs to be refreshed automatically, not for dashboards that will be opened once or twice a day by a handful of people to check ad-hoc. Also, please consider using the refresh setting at the search level instead of at the dashboard level, as this can also be impactful to the entire Splunk environment. The refresh intervals should also be set to a reasonable interval. If the data behind the dashboard is indexed every hour, it would not make sense to set the refresh interval for every minute.

Instead of

<form refresh="300">

or

<dashboard refresh="300">

consider setting the refresh on just the searches that need it, for example:

<search>
  <query>...</query>
  <refresh>5m</refresh>
</search>
  • Dashboards should not house everything and the kitchen sink. A dashboard with too much in it can start to load slowly, and a large source file becomes hard to read and navigate. It’s best to create drilldowns or hyperlinks to other dashboards or reports for related content, instead of having everything on the same dashboard. 

  • You can also comment out one line or multiple lines of dashboard code! Isn’t that great! This is incredibly useful when maintaining a dashboard. Just use <!-- at the beginning of the comment and --> at the end of the comment, no matter how many lines! The only caveat is that you cannot have a comment within a comment, or even just a -- within the comment. It should be noted that once edits are made in the UI, any comments will be wiped out from the source code.
<!-- This is an example comment that would explain what my below panel and search might be doing -->
  <title>Panel 1</title>
  • If you have saved searches that need to be in a dashboard panel, you can reference them by either ref, | loadjob, or | savedsearch.

    <search ref="savedsearch_name"></search>
    <query>|loadjob savedsearch="owner:app:savedsearch_name"</query>
    <query>|loadjob savedsearch="owner:app:savedsearch_name" ignore_running=false</query>
    <query>|savedsearch savedsearch_name</query>
  • Both ref and | loadjob will load the most recent results from the saved search, if they exist. If no results exist for the scheduled saved search, ref will run the search and | loadjob will display an error. | loadjob will also display an error if the job is currently running; however, that can be fixed by adding ignore_running=false to the command. | savedsearch will always run the saved search.

  • Form inputs can load very slowly if they have thousands of values. If this is the case, my suggestion is to create an Input of Inputs, where one input filters down the next, either by the alphabet or by another input that already exists. 

Example: If you have an input of employee names, perhaps first create an input of employee last initial, to filter the employee names. Excuse the random run-anywhere code. 

<search id="name_base">
    <query>|makeresults|eval alphabet="A B C D E F G H I J K L M N O P Q R S T U V W X Y Z"|makemv alphabet|mvexpand alphabet</query>
</search>
<input type="dropdown" token="eli">
    <label>Employee Last Initial</label>
    <search base="name_base"></search>
    <fieldForLabel>alphabet</fieldForLabel>
    <fieldForValue>alphabet</fieldForValue>
</input>
<input type="dropdown" token="en">
    <label>Employee Name</label>
    <search base="name_base">
        <query>|eval name="test ".alphabet."foo"|table name|eval last_name=mvindex(split(name," "),1)|search last_name=$eli$*</query>
    </search>
    <fieldForLabel>name</fieldForLabel>
    <fieldForValue>name</fieldForValue>
</input>

  • A good practice is not to use All Time or any other “All” inputs as defaults, as this is resource intensive. Set the default values to the primary use case of the dashboard. Add a “Submit” button if there are multiselect inputs, so that searches don’t re-run on every selection change. 
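For instance (input names and choice values are made up), a submit button is enabled on the fieldset, and autoRun="false" keeps the searches from firing until Submit is clicked:

```xml
<fieldset submitButton="true" autoRun="false">
  <input type="multiselect" token="host_tok">
    <label>Hosts</label>
    <choice value="web01">web01</choice>
    <choice value="web02">web02</choice>
    <default>web01</default>
  </input>
</fieldset>
```

Note the default is a specific value, not “All”, in keeping with the advice above.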

  • Dashboard tokens can be added to the panel titles and descriptions to add dynamic context, such as time range or the value passed through a drilldown. 

  • It also helps to display date and time ranges with the timezone to add context if sharing an image of the dashboard or exporting a PDF.
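As a sketch (token and index names are hypothetical), a panel title can surface both a drilldown token and the selected time range, with the timezone spelled out:

```xml
<panel>
  <title>Tickets for $env_tok$ ($time_tok.earliest$ to $time_tok.latest$ UTC)</title>
  <table>
    <search>
      <query>index=tickets env=$env_tok$ | stats count by status</query>
      <earliest>$time_tok.earliest$</earliest>
      <latest>$time_tok.latest$</latest>
    </search>
  </table>
</panel>
```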

  • One thing I wasn’t aware of until somewhat recently is that searches can take the depends attribute, just like panels that are hidden or shown, so that a search only runs when its depends tokens are set. This can be so helpful if there are drilldown searches that don’t need to run when the dashboard is loaded, for instance.
<dashboard theme="dark">
  <init>
    <unset token="dont_run"></unset>
    <set token="run">true</set>
  </init>
  <row>
    <panel>
      <title>Unset Token</title>
      <single>
        <search depends="$dont_run$" id="dont_run">
          <query>|makeresults|eval data="Can you see me?!"</query>
        </search>
        <option name="drilldown">none</option>
        <option name="rangeColors">["0x53a051","0x0877a6","0xf8be34","0xf1813f","0xdc4e41"]</option>
        <option name="refresh.display">progressbar</option>
        <option name="underLabel"># tickets closed last month</option>
      </single>
    </panel>
    <panel>
      <title>Set Token</title>
      <single>
        <search depends="$run$" id="run">
          <query>|makeresults|eval data="Can you see me?!"</query>
        </search>
        <option name="drilldown">none</option>
        <option name="rangeColors">["0x53a051","0x0877a6","0xf8be34","0xf1813f","0xdc4e41"]</option>
        <option name="refresh.display">progressbar</option>
        <option name="underLabel"># tickets closed last month</option>
      </single>
    </panel>
  </row>
</dashboard>

  • Last, but certainly not least, set an id to each search! Whenever there comes a time that a dashboard search needs to be found in the audit logs, if the search has an id attached to it, it becomes much easier to find. Troubleshooting made easy...well, easier.
<search id="base_search1_dashboard1">

Will show up in the audit logs with savedsearch_name=base_search1_dashboard1.
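For instance, with that id in place, a quick audit-log search (using the same index and fields as the earlier audit example) finds every run of the search:

```
index=_audit action=search info=completed savedsearch_name=base_search1_dashboard1
| table _time user total_run_time
```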

Ready, Set, Dash!

This post did not go into many details, because every environment is different, and everyone should create their own best practices. This is meant to be used as a starting point of things to think about when creating dashboards. Most points highlighted here can also be applied to the Splunk Dashboards App, even though my examples were focused around SimpleXML.

Other Resources

March 2021, Splunk Enterprise v8.1.3

Clara Merriman is a Senior Splunk Engineer on the Splunk@Splunk team. She began using Splunk back in 2013 for SONIFI Solutions, Inc. as a Business Intelligence Engineer. Her passion really showed for utilizing Splunk to answer questions for more than just IT and Security. She joined Splunk in 2018 to spread her knowledge and her ideas from the source! She's also a huge advocate for the Splunk community and has been part of the SplunkTrust since 2018.
