Syncing Lookups Using Pure SPL

If you have read my previous post “Upgrading Linux Forwarders Using the Deployment Server”, you know that I love figuring out how to do unconventional tasks using Splunk. I was working with a customer a couple of weeks ago who has several search heads and wanted a way to sync lookup files without relying on third-party tools such as rsync. Since Splunk is a very open platform, I knew this could be accomplished using a custom REST endpoint. However, I wanted to use pure SPL so the solution would be completely portable and usable without installing additional apps.

Knowing that Splunk can search a specific search peer using the splunk_server parameter, I added the source search head to the destination search head as a peer. I was hoping the inputlookup command supported splunk_server, but it doesn't. I looked through the existing REST endpoints and realized there was not one that would retrieve the contents of a lookup file, so I needed to figure out a way to run the inputlookup command remotely. I knew I could run a curl command from the operating system, execute any search, and retrieve the contents of a lookup using Splunk's robust REST API. Then I realized I could do the same thing from a search head using the rest command.
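As an illustration, the curl approach looks something like this against the export endpoint — the hostname, management port, and credentials here are placeholders, not values from my environment:

curl -k -u admin:changeme "https://sh1:8089/services/search/jobs/export" -d search="| inputlookup demo_assets.csv" -d output_mode=csv

This returns the lookup contents as CSV, but it requires shelling out of Splunk, which is exactly what I wanted to avoid.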

I set up two search heads in my lab environment: sh1 with a “demo_assets.csv” lookup and sh2 without it.

SH1 - Original Lookup

SH2 - No Lookup

I then added sh1 as a search peer to sh2.

SH2 - Search Peers

Using the following search, I could retrieve the contents of the lookup file named “demo_assets.csv” from sh1:

| rest splunk_server=sh1 /services/search/jobs/export search="| inputlookup demo_assets.csv" output_mode=csv | fields value

If you run this search, you will notice that the entire contents of the lookup come back merged into a single value field.

SH2 - Lookup Contents Merged into a Single Value

I created a macro with some SPL magic that retrieves the lookup and reformats the contents into a table.

This output can then be piped to the outputlookup command and written to a local file.
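The exact SPL lives in the app's macro, but conceptually the retrieve-and-reformat step looks something like the sketch below. The column names ip and host are illustrative assumptions about what demo_assets.csv contains, and urldecode("%0A") is simply a trick for getting a newline literal into SPL:

| rest splunk_server=sh1 /services/search/jobs/export search="| inputlookup demo_assets.csv" output_mode=csv | fields value | eval row=split(value, urldecode("%0A")) | eval header=mvindex(row, 0) | mvexpand row | where row!=header AND row!="" | eval col=split(row, ",") | eval ip=mvindex(col, 0), host=mvindex(col, 1) | table ip host | outputlookup demo_assets.csv

Note that a naive comma split like this will break on quoted values that contain commas; treat it as a sketch of the idea rather than a drop-in replacement for the macro.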

Automating this transfer is now as simple as creating a scheduled search.
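For example, a savedsearches.conf stanza along these lines would pull the lookup every 15 minutes — the stanza name and schedule are arbitrary, and the reformatting SPL is elided for brevity:

[Sync demo_assets.csv from sh1]
enableSched = 1
cron_schedule = */15 * * * *
dispatch.earliest_time = -1m
dispatch.latest_time = now
search = | rest splunk_server=sh1 /services/search/jobs/export search="| inputlookup demo_assets.csv" output_mode=csv | fields value ... | outputlookup demo_assets.csv

The same thing can be configured entirely from the UI by saving the search as a scheduled report.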

Syncing lookups between your development and production search heads, or between your Enterprise Security and ad-hoc search heads, is no longer a problem! Feel free to install the SA or simply copy and paste the SPL from the macro as needed.

You can view the complete app on GitHub. Use at your own risk.

Posted by Luke Netto

Luke is a Staff Professional Services Consultant at Splunk with experience in data analytics, security, networking, systems, wireless integration, and software development. Luke has been an adjunct professor at the University of Denver teaching courses in SQL, Python, and data analytics. He holds a Master of Science in Telecommunications and graduate certificate in Energy Communication Networks from the University of Colorado Boulder, a Master of Science in Telecommunications Engineering Technology from Rochester Institute of Technology, and a Master of Business Administration from Clarkson University. Luke enjoys the challenge of onboarding new data sources and enabling organizations to become data-driven using Splunk.