
Splunk Live Washington DC 2009

Obama-nomics is highly visible in our nation’s capital these days. The DC economy is humming as our tax dollars are hard at work fueling all kinds of government spending. With more than 100 attendees at Splunk Live on Thursday, we certainly were not disappointed in our quest to help make all this growth in government more efficient! Managing large networks and security forensics were the hot topics of conversation at Splunk Live Washington, DC, where everyone was treated to a trio of incredible speakers.

Our first speaker was Andy Purdy, Co-Director of the International Cyber Center at George Mason University and former Acting Director of the National Cyber Security Division (NCSD) and US-CERT at the Department of Homeland Security. Andy was a member of the White House staff team that drafted the U.S. National Strategy to Secure Cyberspace (2003) and served on the DHS tiger team that formed the National Cyber Security Division (NCSD). He spent 3 1/2 years at DHS, the last two heading the NCSD and US-CERT as the “Cyber Czar” of the U.S. Andy is also a Special Government Employee on the Defense Science Board Task Force on Mission Impact of Foreign Influence on DoD Software. He is also a partner with the law firm of Allenbaugh Samini Gosheh, LLP.

The Constantly Changing Threat Landscape

Andy talked with us about the changing threat landscape and lessons learned from past approaches to cyber security that can be applied in a forward-looking approach to Risk Management and Compliance.

Since much of his experience has been spent preparing the country for what cyber threats are coming next, Andy thinks of IT security as a war fought in a constantly morphing theater with new technologies and vulnerabilities and new motivations and threats.

A Different Approach Moving Forward

For anyone serious about security this is a sound perspective, whether you are a government agency, a major enterprise or a small business. But the balance between open networks and services and robust security remains one of the major challenges for IT organizations. Andy pointed us to lessons learned from his past, fueling a vibrant conversation during the customer and speaker roundtable. Perhaps the most important thing I heard was that it’s not enough to prepare for the last war, or the last successful attack. While perimeter defense and legacy standards for network security provide some measure of security, those measures are very often insufficient to deal with new threats that seem to be gaining in sophistication at an accelerating pace. Andy encouraged us to focus on adopting new requirements and security infrastructure for situational awareness and control.

Greater sophistication, slower, lower-level attacks, and greater knowledge about the targets (data, activity, vulnerabilities) are all contributing to the need for near-real-time visibility on a large scale. This has become far more important than sub-second correlation of known attack vectors against discrete sets of network devices.

“NIST perspective: Continuing serious cyber attacks on federal information systems, large and small; targeting key federal operations and assets. Attacks are organized, disciplined, aggressive, and well resourced; many are extremely sophisticated. Adversaries are nation states, terrorist groups, criminals, hackers, and individuals or groups with intentions of compromising federal information systems.”

Andy went on to discuss how the effective deployment of malicious software, causing significant exfiltration of sensitive information (including intellectual property) and potential disruption of critical information systems and services, has made detection of information and data leakage a key government and enterprise security requirement.

Bob Flores, former CTO and 31-year veteran of the CIA, was our next speaker. Bob retired from the CIA six months ago and is now President and CEO of Applicology, providing cyber security and IT strategy consulting services. In his 31 years at the CIA, he held various positions in the Directorate of Intelligence, Directorate of Support, and the National Clandestine Service. Most recently he was the CIA’s CTO, where he was responsible for ensuring that the Agency’s technology investments matched the needs of its many missions. Bob holds Bachelor of Science and Master of Science degrees in Statistics from Virginia Tech.

Quis custodiet ipsos custodes?

Brush up on your Latin! “Who will guard the guards themselves?” was the topic of Bob’s talk. Insider threat in an ever-changing threat landscape was, and remains, our number one cyber security risk.

“Defense-in-depth isn’t just about putting adequate technology in place, it’s also about paying attention to your people and implementing policies and procedures to reduce the likelihood of an insider attack.”
– Dawn Cappelli, CERT

The simple but not so obvious model Bob pursued at the CIA was an extension of the OSI stack to include non-technical, motivational layers.


We need to worry about all levels of the stack including layers eight and nine because we all have people messing around at various layers with applications, scripts, communications etc. And their motivation is often very clear.

“Nemo repente fuit turpissimus!” Or, no one ever became thoroughly bad in one step.

The point is people don’t just wake up one day and decide to be bad. They are motivated over time by larger causes and in EVERY CASE leave a trail of clues behind that can’t entirely be covered up.

What to Do?

According to Mr. Flores the focus needs to be on real-time visibility. You need visibility into who (or what) is perturbing your enterprise right now and over time. You can tediously review the logs of each device and user as the CIA used to do or you can take advantage of Splunk.

“Splunk may not be the best thing since sliced bread, but it’s pretty darn close.”
– Bob Flores

Why Splunk?

Why did the CIA choose Splunk over so many other security forensics solutions? It all comes down to how easily and scalably Splunk can eat any logs, events and messages Bob’s organization throws at it. Combine that with real-time search, alerting and reporting, plus over-time statistics and analysis on

  • user behavior,
  • network behavior,
  • system and application activities,
  • configuration changes and
  • user-customizable dashboards that enforce who can see what about whom, with full data segregation and access auditing by user or role,

and you have the answer.

Our last guest speaker was David Duvall, Infrastructure Architect at Discovery Communications. David is a lead technical architect working with teams across four continents to build critical systems and keep them running. Discovery is one of my favorite cable channels. If you haven’t seen it, the series Man vs. Wild is just awesome. I won’t spoil it for you. Check it out. Discovery is the world’s number one non-fiction media company with more than 1.5 billion cumulative subscribers in over 170 countries. They run 100-plus worldwide networks, led by Discovery Channel, TLC, Animal Planet, Science Channel, Planet Green, Investigation Discovery and HD Theater. Yes, all the good stuff that makes having a cable or satellite subscription worthwhile.

We’re Going Public!

And, oh, we have just 16 months to show SOX compliance. Discovery went public in September 2008. The company knew they needed a log consolidation system with retention of at least 13 months’ worth of data and minimal time for rollout. They couldn’t spend a quarter implementing a new solution. The in-scope SOX environment includes

  • 50 domain servers on 4 continents,
  • Unix syslog,
  • WebSphere app server logs,
  • Client desktop logs,
  • Network backup status logs,
  • WMI Windows event logs,
  • Cisco, Juniper and F5 network device logs,
  • NetApp filer logs and
  • Oracle database logs.

Splunk Deployment

Discovery’s Splunk deployment took 1.5 weeks from start to finish. David was responsible for the installation: he personally downloaded and installed Splunk, read the Splunk docs and wikis, and got up and running without weeks of professional services. Most data sources are streamed to Splunk over the network from their native logging facilities.
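Network-streamed inputs like these are typically wired up in Splunk’s inputs.conf. A minimal sketch follows; the stanza types are standard Splunk, but the ports, paths and sourcetypes are hypothetical, not Discovery’s actual configuration:

```ini
# inputs.conf sketch -- hypothetical ports/paths, not Discovery's real config

# Unix syslog streamed over the network
[udp://514]
sourcetype = syslog

# WebSphere app server logs tailed from a local or mounted path
[monitor:///opt/websphere/logs/server1]
sourcetype = websphere
```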

“I knew I could get Splunk up and running quickly to ensure I captured all the data. Then I could take my time to figure out what I wanted to do with the data.”

Approximately 100 Windows servers were outfitted with Splunk lightweight forwarders to bring Windows event logs, native files and registry change information into Splunk. Oracle database logs are stored in SQL tables, and David was able to set up a scripted Splunk data input which acts like any other SQL client to grab the Oracle database logs on a scheduled basis.
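A scripted input of that sort boils down to a script Splunk runs on a schedule, indexing whatever it prints to stdout. Here is a minimal sketch in Python, using sqlite3 as a self-contained stand-in for the Oracle client; the table, column and field names are hypothetical, not Discovery’s schema:

```python
# Sketch of a Splunk scripted data input: query a SQL audit table for
# rows past a checkpoint and print them as events for Splunk to index.
# sqlite3 stands in for an Oracle client here; all names are hypothetical.
import sqlite3


def fetch_new_audit_rows(conn, last_seen_id):
    """Return audit rows newer than the checkpoint id, oldest first."""
    cur = conn.execute(
        "SELECT id, ts, username, action FROM audit_log "
        "WHERE id > ? ORDER BY id",
        (last_seen_id,),
    )
    return cur.fetchall()


def emit(rows):
    """Format each row as a timestamped key=value event line."""
    return [
        f"{ts} id={row_id} user={user} action={action}"
        for row_id, ts, user, action in rows
    ]


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE audit_log (id INTEGER, ts TEXT, username TEXT, action TEXT)"
    )
    conn.executemany(
        "INSERT INTO audit_log VALUES (?, ?, ?, ?)",
        [
            (1, "2009-06-01T10:00:00", "dba1", "GRANT"),
            (2, "2009-06-01T10:05:00", "app2", "SELECT"),
        ],
    )
    # Splunk indexes whatever lands on stdout
    for line in emit(fetch_new_audit_rows(conn, 0)):
        print(line)
```

Carrying the checkpoint forward between runs (e.g. in a state file) is what keeps the script from re-indexing rows it has already emitted.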

Compliance Reporting Made Easy

Once the initial deployment was complete, David turned his attention to working with the company’s SOX auditors and department heads to develop the reports required to demonstrate compliance with all the necessary controls.

“As the auditors’ questions change from week to week, it’s easy to pull new data and generate ad-hoc reports.”
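An ad-hoc report of this kind is just a search that can be saved and scheduled. A hypothetical SPL sketch, summarizing failed Windows logons by account; the sourcetype, field name and event code are assumptions (failed-logon codes differ by Windows version), not reports from David’s deployment:

```
sourcetype="WinEventLog:Security" EventCode=529 earliest=-30d
| stats count by User_Name
| sort - count
```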

Using Splunk’s role-based access controls, David and the auditors then developed and implemented policies to guard the data and reports, including audit reports to prove only the necessary individuals are using the information and to prove the authenticity of the data itself. The auditors really like the secure audit trail and signing of data from source of origin all the way through to the Web-based control reports.

Lessons Learned

Adoption of Splunk proved easier than David and the audit team imagined because many on the IT team at Discovery had already downloaded and used Splunk for other tasks.

“When you explain Splunk as ‘Index and Search’ you’re glossing over a lot of the value. Dashboards that correlate failures from different sources and troubleshoot different environmental items are priceless.”

Posted by Splunk