The Splunk App for Stream – Tracking Open Ports for Security and Compliance – Part 2

In Part 1 of this post we used the Splunk App for Stream to find open ports on your networked systems.  (Hint: Follow the ACK packets.)  This post looks at how to keep track of those open ports, and how to detect when a NEW port starts listening.


Of course, Splunk is an extensible tool that gives you the ability to solve problems like this in a number of different ways.  The method I’ve chosen for this case is the Splunk Key Value Store.  This is a new feature in Splunk 6.2 that lets you read and write data within a Splunk app, allowing you to maintain state in that application.  Think of storing user metadata, or caching results from a search query against Splunk or an external data store.  In this case, think of maintaining a list of host IP addresses, the open ports on each IP, and when each was last seen.  These three data points alone should be enough for us to discover a NEW listening port.
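To make that idea concrete, here is a rough Python sketch (not Splunk code; the names `port_state` and `observe` are made up for illustration) of the state we'll keep: one record per ip/port pair with the time it was last seen, plus a check for whether a pair is new:

```python
import time

# Illustrative stand-in for the KV Store collection:
# maps (dest_ip, dest_port) -> last_seen epoch time.
port_state = {}

def observe(dest_ip, dest_port, seen_at):
    """Record a sighting; return True if this ip:port pair is new."""
    key = (dest_ip, dest_port)
    is_new = key not in port_state
    port_state[key] = seen_at   # update "last seen" either way
    return is_new

print(observe("10.0.0.5", 22, time.time()))   # True: first sighting
print(observe("10.0.0.5", 22, time.time()))   # False: already known
```

The KV Store gives you exactly this kind of keyed, updatable state, but shared across searches and persisted by Splunk.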


The Splunk Key Value Store or KV Store works like a CSV lookup, but it does so much more.  You can do Create-Read-Update-Delete (CRUD) operations on individual records within a collection, you can define field acceleration to improve search performance, and you have the option of data type enforcement when writing data.  Plus the KV Store is built for performance when handling larger data sets with frequent lookups.


KV Store data is managed per app within Splunk.  I recommend creating a new bare bones app just for testing things out.
  1. On the home page of Splunk Web on your Search Head, click the gear icon next to Apps.
  2. Click Create App.
  3. On the Add new page, fill out the properties of the new app:
    1. For Name, enter “Port Status”.
    2. For Folder name, enter “port_status”.
    3. For Template, select “barebones”.
  4. Click Save.


Now create a couple of config files for the KV Store in this app's local directory ($SPLUNK_HOME/etc/apps/port_status/local).

In collections.conf, define the collection itself:

[port_lookup]

In transforms.conf, define a lookup that points at that collection (this is the name you'll use with the lookup commands later):

[kvstore_lookup]
external_type = kvstore
collection = port_lookup
fields_list = _key, dest_ip, dest_port, _time


Now restart Splunk.
You can manage the data in your KV Store with three search commands:


     inputlookup – reads data from a KV Store collection into the search pipeline
     outputlookup – writes search results from the search pipeline into a specific KV Store collection
     lookup – matches event data from earlier in the search pipeline against data in a KV Store collection


To put data into the KV Store, prepare it as a table and then pipe it to the outputlookup command.  It’s important to point out here that the list of ports that you’re going to check against doesn’t have to be limited to a list of open ports detected by a port scanner as outlined above.  You could just as easily populate the KV Store with a table of approved ports for each IP address according to your network policy.  In fact, why not create two lookups, one to maintain port status and one to maintain a list of whitelisted or otherwise acceptable ports?  Follow these same procedures to determine whether a new listening port is on your acceptable list of services.


Here’s how to populate the KV Store with the scanner data.  Basically, run a search to find open ports and pipe it out to a table:


sourcetype=stream:tcp src_ip= dest_ip= ack_packets_in!=0
| transaction dest_ip dest_port mvlist=t
| stats last(timestamp) as time_last_seen by dest_port, dest_ip
| eval _time=strptime(time_last_seen, "%Y-%m-%dT%H:%M:%S.%fZ")
| table dest_ip dest_port _time
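The eval in that search uses strptime to turn Stream's ISO-8601 timestamp string into epoch seconds.  Python's datetime.strptime accepts the same format string, so you can sanity-check the conversion outside Splunk (the timestamp below is just an example value):

```python
from datetime import datetime, timezone

# Same format string as the SPL eval above.
ts = "2015-03-02T18:15:30.123456Z"
epoch = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ") \
            .replace(tzinfo=timezone.utc).timestamp()
print(epoch)  # epoch seconds, fractional part preserved
```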




If you’re satisfied with the way things look, go ahead and append the following to that search to pipe the data into the KV Store:


| outputlookup kvstore_lookup


Want to take a look at the data in the KV Store?  Use the inputlookup command:


| inputlookup kvstore_lookup
Once the KV Store is populated with your baseline data, either from an acceptable use policy or from port scanner output, you can start running searches to compare current state versus desired or previous state.


Using the same searches as above, we can look for ACK packets being sent to IP addresses and find ip/port combinations that have NOT been seen before.  The following search does that, and also creates a new field named "starttime" whose value will be passed to the workflow action.


sourcetype=stream:tcp src_ip= dest_ip= ack_packets_in!=0
| convert mktime(_time) as starttime
| transaction dest_ip dest_port mvlist=t
| search NOT [ | inputlookup kvstore_lookup | fields dest_ip dest_port ]
| table dest_ip dest_port _time starttime
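The NOT [ | inputlookup ... ] subsearch is doing set subtraction: keep only the ip/port combinations that aren't already in the lookup.  In Python terms, with made-up sample data:

```python
# Sketch of the subsearch logic: observed traffic minus the baseline
# already stored in the KV Store leaves only the NEW listeners.
baseline = {("10.0.0.5", 22), ("10.0.0.5", 443)}    # from the KV Store
observed = {("10.0.0.5", 22), ("10.0.0.9", 3306)}   # from stream:tcp events

new_listeners = observed - baseline
print(sorted(new_listeners))  # [('10.0.0.9', 3306)]
```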


Save this search as a dashboard panel and call it something like “New Listening Ports”.




Now, if you drill down on this row of data, you’ll be brought to the event itself.  I’ve created a workflow action so that I can add this to my list of known/acceptable ports for this host with a few mouse clicks.




Wanna be cool too?  Use these files, again in the app's local directory.

In eventtypes.conf, define an eventtype for the scanner traffic.  The stanza name must match the eventtypes setting in the workflow action below:

[scanner_traffic]
search = sourcetype=stream:tcp src_ip= dest_ip= ack_packets_in!=0

In workflow_actions.conf, define the workflow action itself (the stanza name is up to you; add_to_acceptable_list is just an example):

[add_to_acceptable_list]
display_location = both
eventtypes = scanner_traffic
fields = *
label = Add $dest_ip$:$dest_port$ to Known/Acceptable List
search.app = port_status
search.preserve_timerange = 0
search.search_string = sourcetype=stream:tcp src_ip= dest_ip=$dest_ip$ dest_port=$dest_port$ ack_packets_in!=0 now=$starttime$ earliest=-1m latest=+1m | table dest_ip dest_port _time | outputlookup kvstore_lookup append=true
search.target = blank
type = search
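The now/earliest/latest trick in that workflow search pins the lookup-update search to a one-minute window around the clicked event.  Here's the same window arithmetic on an epoch timestamp in Python (the starttime value is just an example):

```python
# now=$starttime$ earliest=-1m latest=+1m means: search one minute
# either side of the event the user clicked.
starttime = 1425320130                       # example epoch value of $starttime$
earliest, latest = starttime - 60, starttime + 60
print(earliest, latest)  # 1425320070 1425320190
```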
Posted by Bert Hayes