… is a scurvy dog. And unfortunately the only constant in life.

If you’re a developer it’s probably a small miracle you’re reading this at all. Between solving complex problems, adhering to best practices and coding conventions, architecting for scalability and maintainability, giving a nod to backwards compatibility, and driving towards code complete under aggressive deadlines, troubleshooting problems in production environments is just another squeaky wheel in the machinery. Learning a tool to work with log files on production systems ranks somewhere near volunteering for a lobotomy.

We understand. We have been in your shoes.

If your access to production systems has been revoked, your options may include (but are not limited to) kicking and screaming. We can relate. Many good citizens have employed this approach. An alternative is to get productive with Splunk. If you want to get some work done, then let’s get started.

When you are accustomed to being a master of grep, awk, sed and even writing your own scripts to process log files, learning Splunk can feel like riding a bike for the first time. Remember that? It wasn’t anything an open mind, some courage and band-aids didn’t get you through. With less bleeding, and in a short hour or so, you can learn the basics of Splunk, pick up the skills to go further, and get to the finish line faster.

Below are some tips to help you feel more comfortable, less exposed. Think of these as a safety net when you’re in a bind and haven’t yet become a Splunk ninja.

Show Source

To retrieve all events from a single log file in Splunk, search on source="/path/to/that/awesome/data.log". By default, the source field is the full path to any log file Splunk monitors. This is the closest equivalent to opening the file in a text editor in a terminal. The big difference is that Splunk will present the events in reverse chronological order, probably something you’re not yet used to. It will also present each line, or block of related lines, as a separate event, also probably something you’re not yet used to. To view the events in a format you are used to, use the little menu just below the timestamp. This will open a new window presenting events in their surrounding context.
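A few example searches along those lines (the paths and the ERROR keyword here are illustrative, not real data; substitute your own):

```
source="/path/to/that/awesome/data.log"

source="/path/to/that/awesome/data.log" ERROR

source="/var/log/app/*.log" "connection timed out"
```

The first returns every event from that one file, and the other two narrow it down with keyword filters, much like piping the file through grep.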



Splunk also allows you to export data for review as raw text, CSV, XML or JSON, completely outside of Splunk. Through the web interface, browsers will accommodate the export of at most 10,000 events without becoming unstable.

Export via CLI

If you’re thinking, “Wait, I want to export more than 10,000 events, YO,” then use Splunk’s Command Line Interface (CLI) to export to your heart’s content. Details are covered in this blog post: Help! I can’t export more than 10,000 events!
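As a rough sketch, a CLI export looks something like the following (the query, output file, and credentials are placeholders; run splunk help search to confirm the flags on your version):

```shell
# Export all events from one log file with no cap on the result count.
# -maxout 0 lifts the event limit; -output selects the format
# (e.g. rawdata, csv, xml, json).
splunk search 'source="/path/to/that/awesome/data.log"' \
    -maxout 0 -output csv -auth admin:changeme > awesome_data.csv
```

Because this runs outside the browser, it sidesteps the 10,000-event web export limit entirely.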

Stretching Your Monitor to Read an Entire Log File

TBD. If you’ve figured this out I’d actually like to know how you did it.

Silly Rabbit, Trix are for Kids

We would be hugely misguided in investing thousands of engineering hours to solve the narrow problem of exporting logs off production servers. Now that you know about the above options, don’t let them become your crutch.

Let the training wheels come off.

Use Splunk the way it was meant to be used – as a powerful navigation/correlation/trending/analytics engine for any type of data. You may never look at log data the same way again. Is that such a bad thing? To gain operational intelligence from analyzing infrastructure conversations or web hits or purchasing transactions or billing details, you’re only limited by creativity and versatility. We know you have these qualities in spades. That’s why you’re a developer.

So go here, embrace that scurvy dog and get back to the work you love:

Posted by Vi Ly