First bite of the Cherry(py)
I didn’t always work at Splunk. In fact, many moons ago I used to be a Splunk customer. At the time, we were simply looking for a means to better consolidate our enterprise’s numerous sources of log data into a centralized repository. A colleague of mine mentioned this product called Splunk, and hence the journey began. Like many, this started with getting some log files indexed into Splunk and creating some trivial searches and Simple XML dashboards. This very quickly led to more data sources and more elaborate dashboards. Then the bloke sitting next to me saw what I was doing and wanted in on the action, then the adjacent team, and then the floor. This internal viral growth required setting up a simple Splunk cluster under my desk with user/role access controls, data retention policies, backups, etc. But more importantly, this was when I really started to become a Splunk developer.
Appetite for construction
I’m a coder. I like making things. And when it comes to working with other products, platforms & frameworks, I am naturally drawn to those that allow me to open them up and extend, customize and augment them to my specific needs. This doesn’t necessarily mean open source, but they need to have an open architecture with as many developer “hooks” as possible. This openness is also what leads to and drives community and collaboration. A thriving community is the bedrock of a platform’s developer ecosystem, and within this realm great ideas are seeded and emerge.
Even though my developer journey started with Splunk several versions ago, there were already enough developer hooks in the product to satisfy my needs and allow me to shape Splunk to suit the dynamics of the environment in which I was using it. Proprietary closed platforms force you to adapt to their way. It should be the opposite: you should be able to adapt the platform to your way, your data, your requirements. And this adaptation should be as simple and timely as possible to accomplish. Splunk ticked all my boxes.
All you can eat buffet
The Splunk Developer landscape today is vast and growing, with hooks into numerous areas of the platform.
What follows is an overview of all the main areas where you can currently develop atop the Splunk platform.
In many ways it is great to be spoiled for choice; it adds to the agility of the platform. But sometimes so much choice can be overwhelming, especially to the newbie developer. So I will provide a brief overview of each development hook, with links to more substantial documentation, as well as my take on why you might consider that particular development option.
CLI (Splunk’s Command Line Interface)
You can use the Splunk CLI to monitor, configure, and search Splunk via a terminal/shell interface, or by wrapping the commands in a shell script.
Consider using this if you are not able to program to the REST API or use a language SDK and are more suited to simple shell commands.
Furthermore, the CLI has some functionality over and above what you can do with REST, e.g., starting/stopping the server, cleaning indexes, additional troubleshooting tools, and more.
REST API
You can interact with most of the functionality of Splunk via the exposed REST API. This will typically be managing, searching, and inputting data. Just think of your experience using Splunk Web: what you can do there, you can also do programmatically via REST.
You can perform this interaction from the command line using a program such as curl, or programmatically in your own code.
You might want to use the REST API directly if you are unable to use our language SDKs, if you are using a language that we don’t currently have an SDK for (e.g., R or Perl), or if you are hitting a custom REST endpoint that isn’t available via an SDK.
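As an example, creating a search job comes down to a single authenticated POST. Here is a minimal sketch using only the Python standard library, assuming a Splunk server on the default management port 8089 and a session key already obtained from the /services/auth/login endpoint:

```python
import urllib.parse
import urllib.request


def build_search_request(base_url, session_key, query):
    """Build (but don't send) a POST that creates a Splunk search job."""
    data = urllib.parse.urlencode({"search": "search " + query}).encode()
    return urllib.request.Request(
        base_url + "/services/search/jobs",
        data=data,
        headers={"Authorization": "Splunk " + session_key},
    )


# No network call happens until urlopen() is invoked on the request.
req = build_search_request("https://localhost:8089", "MY_SESSION_KEY", "error | head 5")
print(req.full_url)      # https://localhost:8089/services/search/jobs
print(req.get_method())  # POST
```

Passing the request to urlopen() returns the job ID, which you then poll for results; the same pattern applies to most management endpoints.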
SDKs (Software Development Kits)
You’ll typically want to use an SDK to leverage your existing developer language skills against the Splunk REST API and code solutions that integrate with Splunk, be it a data-only integration, a custom user interface, or perhaps a standalone big data solution built on top of the core Splunk platform. The reality is, the potential use cases are numerous.
Splunkbase Apps and Add-ons
At a very simple level, Splunk apps and add-ons are a packaging up of the various configurations, searches, knowledge objects, UI components and customizations, inputs, role definitions, field extractions, etc. that you might typically create via Splunk Web. An app will typically have a user interface that sits atop multiple other Splunk objects. An add-on typically serves a single reusable purpose to extend the Splunk platform (e.g., a custom input) and won’t have a UI or a setup screen. Apps are typically themed around a specific use case (e.g., Splunk for Active Directory), whereas an add-on will generally be generic and reusable across many diverse use cases (e.g., the SNMP Modular Input).
You’ll want to create an app or add-on if you have something you want to share with the community (free or charged) on Splunkbase.
The power of the community working together is a great thing: you might contribute an app, and conversely someone else might have contributed an app that you can benefit from. Everyone wins. Furthermore, it promotes modularity and reuse, all good things for Splunk productivity. Also, many Splunk partners create apps to integrate with their products, leveraging these connected communities for the market.
You might also choose to create apps at your organization and share them internally. Building out your internal Splunk apps not only promotes modularity but also makes it easier to secure access in a multi-user, multi-departmental environment via Splunk’s user/role-based permissions and access policies.
You can also bundle up code that isn’t a traditional Splunkbase app by definition but is something you want to share on Splunkbase, e.g., a code library or perhaps something you have created using an SDK.
Scripted inputs
Out of the box, Splunk has simple generic input options available for getting data from a file or receiving data over TCP/UDP.
But you can also create your own custom scripts, in any language, to obtain data from any source. These are called scripted inputs.
You can then bundle these up as Splunk add-ons, share them on Splunkbase and also browse Splunkbase for any scripted inputs others have created.
You’ll want to create a scripted input when the data is not available by file or TCP/UDP, or perhaps when you want to perform some additional pre-processing of the raw data before sending it to Splunk. Scripted inputs have been largely superseded since Splunk 5.0 by Modular Inputs, but their bare-bones simplicity and speed of development still make them an effective tool in the developer’s arsenal. And if you have a Splunk environment pre 5.0, then Modular Inputs are obviously not available.
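At its core, a scripted input is just a program whose standard output Splunk indexes on the interval configured in inputs.conf. A minimal sketch, where the metric and value are hypothetical stand-ins for your own collection logic:

```python
#!/usr/bin/env python
# Splunk runs a scripted input on the interval set in inputs.conf and
# indexes whatever the script prints to stdout.
import time


def format_event(metric, value):
    # Timestamped key=value pairs are trivial for Splunk to parse.
    ts = time.strftime("%Y-%m-%d %H:%M:%S")
    return "%s metric=%s value=%s" % (ts, metric, value)


if __name__ == "__main__":
    # Hypothetical data source: replace with your own collection logic.
    print(format_event("cpu_load", 0.42))
```

That really is all there is to it, which is why scripted inputs remain such a quick way to get awkward data sources into Splunk.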
Modular inputs
Modular Inputs build upon Scripted Inputs by elevating the creation of custom input add-ons to first-class-citizen status in Splunk. By that I mean that once you install them, it is as if they are natively part of Splunk, just like file or TCP inputs. You are still achieving the same basic premise as with Scripted Inputs, providing some custom, reusable way of getting at data, but a Modular Input is tightly integrated into the Splunk lifecycle (installation, logging, validation, runtime, a setup page in the Splunk Manager UI, manageability via REST, etc.), and the experience for end users configuring access to their data is much simpler.
I would advocate creating a Modular Input if you are using Splunk 5+ and wish to provide the simplest and most seamless experience for your users in setting up access to their data sources. And just as with Scripted Inputs, you can use any language you like, although I’d generally recommend Python, as Splunk comes with its own Python interpreter and this helps keep your Modular Input platform independent.
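A bare-bones sketch of the protocol: Splunk first invokes the script with --scheme to discover its configuration parameters as XML, then runs it again to stream events. The scheme below is heavily abbreviated and the argument name is hypothetical; see the Modular Inputs docs for the full element set:

```python
#!/usr/bin/env python
# Sketch of the modular input protocol: Splunk first invokes the script
# with --scheme to discover its configuration parameters, then runs it
# again (with the user's configuration) to stream events.
import sys

# Abbreviated scheme; the real element set is richer than this.
SCHEME = """<scheme>
  <title>My Example Input</title>
  <description>Streams events from a hypothetical source</description>
  <endpoint>
    <args>
      <arg name="polling_interval">
        <title>Polling interval (seconds)</title>
      </arg>
    </args>
  </endpoint>
</scheme>"""


def handle(argv):
    """Return what the script should write to stdout for these args."""
    if "--scheme" in argv:
        return SCHEME
    # A normal run would poll the source; emit one sample event here.
    return "sample_metric=1"


if __name__ == "__main__":
    print(handle(sys.argv[1:]))
```

The scheme is what drives the auto-generated setup page in Splunk Manager, which is exactly the "first-class citizen" experience described above.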
Custom search commands
Splunk’s search language is incredibly powerful and extensive for deriving a wide range of analytical insights from your indexed data. But there may well be times when a particular search command doesn’t quite do what you want, or you have a need for an entirely new search command. The good news is that you can extend the Splunk search language with your own custom commands. These can also be bundled up as Splunk add-ons and shared on Splunkbase. Custom search commands are written in Python.
Create a custom search command when you need search processing that is not available out of the box, have a need for frequent reuse of the custom search command logic, and want to maintain an integrated approach for your users by keeping the processing logic within single searches/saved searches. By “integrated approach” I mean that the alternative would be to perform part of the search, output the results via REST, and then perform the additional processing in some custom code.
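Conceptually, a custom search command receives the incoming results, transforms them, and emits the transformed results. A sketch of that core loop using plain CSV over stdin/stdout (production commands would normally use Splunk’s Intersplunk helpers for this plumbing, and the "shouted" field is purely illustrative):

```python
#!/usr/bin/env python
# Core loop of a custom search command: results come in as CSV on
# stdin, transformed results go out as CSV on stdout. (Production
# commands usually use Splunk's Intersplunk helpers for this plumbing.)
import csv
import sys


def transform(results):
    """Add an illustrative 'shouted' field: an uppercased copy of _raw."""
    for event in results:
        event["shouted"] = event.get("_raw", "").upper()
    return results


def main(infile=sys.stdin, outfile=sys.stdout):
    rows = transform(list(csv.DictReader(infile)))
    if not rows:
        return
    writer = csv.DictWriter(outfile, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)


if __name__ == "__main__":
    main()
```

Once declared in commands.conf, a command like this slots straight into the search pipeline alongside the built-in commands.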
Data models
Data Models allow Splunk developers to abstract away the underlying search language and work at a knowledge level, without having to have specific knowledge of the semantics and meaning of the raw indexed machine data. A Data Model is represented as a JSON document: a hierarchically structured, search-time mapping of semantic knowledge in the machine data. This leads to portability and reuse, and allows developers to focus on coding rather than the Splunk search language.
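To give a feel for that JSON representation, here is a small hypothetical fragment. The property names are illustrative approximations of the general shape (a named model, hierarchical objects, constraint searches, typed fields) rather than the exact persisted schema:

```python
# Hypothetical sketch of a data model's JSON shape: a named model
# containing hierarchical objects, each mapping a constraint search to
# a set of typed fields. Property names are approximations, not the
# exact persisted schema.
import json

model = json.loads("""
{
  "modelName": "WebIntelligence",
  "objects": [
    {
      "objectName": "HTTP_Request",
      "parentName": "BaseEvent",
      "constraints": [{"search": "sourcetype=access_combined"}],
      "fields": [
        {"fieldName": "status", "type": "number"},
        {"fieldName": "clientip", "type": "ipv4"}
      ]
    }
  ]
}
""")

print(model["objects"][0]["objectName"])  # HTTP_Request
```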
Custom alert scripts
The alerting channels Splunk provides by default are email and RSS. But let’s say, for example, that you wanted to send alerts via SMS, to a messaging queue, as an SNMP trap, or directly to a trouble-ticket system. You can code your own scripted alerts, in any language, to perform this functionality.
You will pretty much always need to go down this path if you have an alerting requirement beyond email/RSS. However, it would also be possible to code some external alerting program that is scheduled, executes searches or looks up saved search results via the REST API, and then sends alerts to some channel when your alerting criteria are met.
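A sketch of the shape of such a script. Splunk passes the firing alert’s details as positional arguments; the positions used below follow the classic convention (event count in argv[1], saved search name in argv[4]) and should be verified against the alerting docs for your version. The delivery step is a hypothetical stub:

```python
#!/usr/bin/env python
# Sketch of a custom alert script. Splunk passes the firing alert's
# details as positional arguments; the positions below follow the
# classic convention (event count in argv[1], saved search name in
# argv[4]) and should be checked against the docs for your version.
import sys


def summarize(argv):
    """Turn Splunk's alert arguments into a one-line SMS-style message."""
    event_count = argv[1] if len(argv) > 1 else "?"
    saved_search = argv[4] if len(argv) > 4 else "unknown"
    return "ALERT %s: %s matching event(s)" % (saved_search, event_count)


if __name__ == "__main__":
    message = summarize(sys.argv)
    # Hypothetical delivery stub: swap in your SMS/queue/ticket call.
    print(message)
```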
Custom REST endpoints
Splunk’s REST API is very thorough, but it can also be extended with your own custom REST endpoints that you can then integrate with programmatically. You would have to use the REST API directly, as custom REST endpoints won’t have interfaces in our language SDKs. Custom REST endpoints can also be accessed via an app’s setup page in Splunk Web.
You will typically write a custom REST endpoint when there is specific server-side functionality you require that is not exposed or available via the standard REST API.
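For orientation, a custom endpoint is declared in your app’s restmap.conf and backed by a Python handler class. A hypothetical stanza might look roughly like this; treat the key names as an approximation and check the restmap.conf spec for your version:

```
# restmap.conf (hypothetical): expose /services/my_app/hello and route
# it to a Python handler class shipped in your app's bin directory.
[script:my_app_hello]
match   = /my_app/hello
handler = my_app_rest.HelloHandler
```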
Custom authentication handlers
Splunk ships with two authentication mechanisms: internal Splunk authentication and LDAP.
But Splunk also has a scripted authentication API for use with an external authentication system, so you can develop your own authentication handlers.
You will need to create a custom authentication handler when you require authentication against external systems other than LDAP.
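A sketch of the scripted authentication shape: Splunk invokes the script per function (userLogin, getUserInfo, getUsers) and reads key=value responses such as --status=success from stdout. The hard-coded directory below is purely a stand-in for your external system, and the exact calling convention should be verified against the scripted auth docs:

```python
#!/usr/bin/env python
# Sketch of a scripted authentication handler. Splunk calls the script
# per function (userLogin, getUserInfo, getUsers) and reads key=value
# responses such as --status=success from stdout. The hard-coded
# directory is purely a stand-in for a real external system.
import sys

FAKE_DIRECTORY = {"alice": "s3cret"}


def user_login(username, password):
    if FAKE_DIRECTORY.get(username) == password:
        return "--status=success"
    return "--status=fail"


if __name__ == "__main__" and "--userLogin" in sys.argv:
    # Splunk supplies --username=... --password=... on stdin.
    args = dict(
        part.lstrip("-").split("=", 1)
        for part in sys.stdin.read().split()
        if "=" in part
    )
    print(user_login(args.get("username"), args.get("password")))
```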
External scripted lookups
Splunk lookups allow you to use information in your events to add fields from external data sources, enriching the semantics of your data. A typical use case might be a static lookup table that maps error codes to human-readable descriptions of those error codes.
But the lookup does not have to be a static table; you can develop your own custom programs to obtain the lookup information from some external source.
You should consider using this option if static lookup files are not dynamic or performant enough or if the lookup data is being hosted externally.
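The protocol is simple: Splunk pipes the lookup rows to your script as CSV (a header line, then one row per event) and reads the enriched CSV back from stdout. A sketch, with a hard-coded status-to-description table standing in for a real external source such as a database or web service:

```python
#!/usr/bin/env python
# Sketch of an external lookup script: Splunk sends the lookup rows as
# CSV on stdin and reads the enriched CSV back from stdout. The
# hard-coded table stands in for a real external source such as a
# database or web service.
import csv
import sys

DESCRIPTIONS = {"404": "Not Found", "500": "Internal Server Error"}


def enrich(rows):
    """Fill in the 'description' output field for each input row."""
    for row in rows:
        row["description"] = DESCRIPTIONS.get(row.get("status", ""), "unknown")
    return rows


def main(infile=sys.stdin, outfile=sys.stdout):
    reader = csv.DictReader(infile)
    if reader.fieldnames is None:  # no input rows
        return
    writer = csv.DictWriter(outfile, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(enrich(list(reader)))


if __name__ == "__main__":
    main()
```

The script is then wired up via a lookup stanza in transforms.conf, after which it can be invoked from any search just like a static lookup.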
The App Framework
The Splunk App Framework allows you to build your own custom MVC apps and components that run inside the Python-based Splunk Web application server.
The App Framework layer API provides the abstraction needed to easily integrate your application into core Splunk software. The API provides access to the same libraries that power Splunk Web, including the REST API, CherryPy application server, and Mako templating facilities.
You should consider going down this path if you need to create custom Advanced XML modules and view templates (for presenting a custom UI experience), custom controllers (for handling interactions from the browser), or custom model components (for interacting with splunkd). The code you develop can also be bundled up as an app/add-on and shared on Splunkbase, e.g., a package of custom D3 visualizations.
The Web Framework (new in Splunk 6)
Choosing this approach really comes down to whether it is going to make it simpler for you to create Splunk web apps and lead to greater productivity. The older App Framework is very powerful but requires a rather proprietary skillset to get productive. The new Web Framework is about allowing you to take advantage of cross-platform skills you may already have.
HUNK (Splunk Analytics for Hadoop)
HUNK is a new product offering from Splunk that allows you to leverage the power of Splunk to unleash insights from data stored in Hadoop: http://www.splunk.com/view/SP-CAAAH2F
This is great news for Hadoop developers, because you can now transparently use all of Splunk’s developer options for your Hadoop development, notably our language SDKs for searching over the data locked away in HDFS.
Consider using HUNK if you’ve started to fall into the “trough of disillusionment” trying to get insights from data locked away in Hadoop HDFS, and want to take advantage of the Splunk platform and developer offerings to more simply and rapidly obtain those insights.
Storm REST API
Splunk Storm is a cloud based Splunk service with much of the same functionality as Splunk Enterprise for getting insights into your machine data. Product offering differences are listed here : http://www.splunk.com/view/product-comparison/SP-CAAAG59
The Storm REST API is actively being enriched. Currently, you can programmatically send data into Storm via a REST endpoint.
Code examples : https://github.com/splunk/storm-examples
You might want to consider this option if you don’t want to install Splunk Enterprise on your own infrastructure and simply want to index some data into the cloud, into a turnkey, fully managed and hosted service for your machine data that can scale with your data requirements.
And now, you can get 20 GB for free!
Mobile Analytics with BugSense
Recently, Splunk acquired the mobile analytics company BugSense. Mobile SDKs are available for Android, iOS, Windows, and HTML5, allowing developers to very simply get insights into how their mobile apps are behaving.
Coding outside the square
You can also create whatever other tools, utilities, and libraries related to Splunk that you want; you don’t have to rely directly on Splunk hooks. And there are already some cool projects on GitHub, created internally and by the developer community. Here’s a sampling of them.
iOS Logging Libraries : https://github.com/carasso/splogger
Splunk Storm Mobile Analytics : https://github.com/nicholaskeytholeong/splunk-storm-mobile-analytics
Android SDK for Splunk : https://github.com/damiendallimore/splunk-sdk-android
SplunkJavaLogging : https://github.com/splunk/splunk-library-javalogging
SplunkJavaAgent : https://github.com/damiendallimore/SplunkJavaAgent
Powershell Resource Kit : https://github.com/splunk/splunk-reskit-powershell
FluentD Connector : https://github.com/parolkar/fluent-plugin-splunk
Flurry Connector : https://github.com/splunk/splunk-flurry
Apache Camel Component for Splunk : https://github.com/pax95/camel-splunk
Spring Integration Adaptors for Splunk : https://github.com/SpringSource/spring-integration-extensions/tree/master/spring-integration-splunk
You really are only limited by your imagination!
A few more morsels to take away
That’s quite a selection of development options, isn’t it? Hopefully this has given you a pretty good overview of the landscape that’s currently out there for Splunk developers.
Here are a few other links/resources to further help you on your Splunk developer journey:
Free Developer License : http://dev.splunk.com/page/developer_license_sign_up
Splunk Dev : http://dev.splunk.com
Splunk Docs : http://docs.splunk.com
Github : https://github.com/splunk
Twitter : @splunkdev , @damiendallimore