From South Africa to Oslo, and from shipping to life insurance to sensor data – a SplunkLive EMEA customer round-up.

It has been a busy few weeks for Splunk EMEA with eight SplunkLive events in Cape Town, Johannesburg, Frankfurt, Vienna, Oslo, Copenhagen, Stockholm and Amsterdam. Close to a thousand people have heard some great customer stories about how organisations across a huge range of industries use Splunk to get operational intelligence. I was lucky enough to be at six of the events and thought it would be worth sharing some of the stories from across the region.



It has been a bit of a flying visit to each country, with the usual plane-airport-taxi-hotel-presentation-taxi-airport-plane routine, but the range of use cases for Splunk, the many different industries and the different kinds of value the organisations are getting from their data made for a really busy but rewarding few weeks.


South Africa

We started the customer presentations with Rudi Pretorius, who has been Chief Information Security Officer at a number of leading financial services organisations in South Africa. He explained how to use Splunk to “go beyond a SIEM” and gave an African perspective on securing and operating mission-critical financial services systems. It was interesting to see Hadoop as one of the data sources they managed and monitored with Splunk. It is always fascinating to hear what customers say about the Splunk products and the benefits they get from them – everyone has a different story. To quote Rudi, “the biggest benefit of Splunk is the ease of customization to get the intelligence you really need”. He ended his presentation with a question and a challenge for the audience: “Talk to the business – ask them what they want from their data”.

Next up was Paul Gilowey, a Foundation Technology Specialist from Santam, one of the biggest life insurance providers in South Africa. Paul gave one of the best presentations I’ve seen on logging strategy: how to get it right, how to get developer buy-in and how to find the right champion to turn logging into something with business value. Based on his experience, Paul has written the eight steps to a successful Splunk implementation – it is well worth the read. It was also interesting to hear about the use of Splunk in life insurance together with the popular insurance package Guidewire. You can see Paul’s presentation on Slideshare here.




The Nordics

Next up was a tour of the Nordics. We started our first-ever SplunkLive Copenhagen with a great turnout and some brilliant customer stories. We were lucky enough to have Tobias Gustavsson, Scrum Master in the DevOps team at Sweden’s largest fashion retailer, talking about how they use Splunk for online retail. He spoke about who uses Splunk:

  • Development – incident investigation & product testing
  • Business analysts – visibility into transaction volumes and customer intelligence
  • Customer support – targeting customers who’ve had issues with apology/voucher
  • Operations – web performance monitoring & incident investigation

The business team at the retailer use insights from Splunk to analyse shopping cart abandonment and to make decisions on how to build and improve the site, how to manage stock levels, and more.
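As a purely illustrative sketch (not the retailer’s actual searches – the index, sourcetype and field names here are all hypothetical), a shopping cart abandonment rate could be calculated in Splunk’s search language directly from web access logs:

```
index=web sourcetype=access_combined (uri_path="/cart/add" OR uri_path="/checkout/complete")
| stats values(uri_path) AS steps BY session_id
| eval abandoned=if(isnull(mvfind(steps, "complete")), 1, 0)
| stats sum(abandoned) AS abandoned_carts, count AS carts
| eval abandonment_rate=round(abandoned_carts / carts * 100, 1)
```

The first `stats` groups each session’s page hits, `mvfind` checks whether the session ever reached checkout completion, and the final `eval` turns the counts into a percentage a business dashboard could display.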

Finally, Tobias explained how the operational intelligence they achieved with Splunk gave them an ROI within five minutes.


After Tobias, we had Peter Almqvist, Head of IT Operations, and Per Hamrin, Systems Developer, from Avanza Bank. Avanza’s customers have been voted “most satisfied savers in Sweden” for the last four years, and the bank is also number one in stock market transactions in Sweden. Their online, multi-channel banking platform is core to their business, and they recently moved from a traditional three-tier architecture to an in-memory computing model to deal with their growth, increased customer demand and the need to deliver a leading customer experience. They use Splunk for troubleshooting, business analytics around new product/feature adoption, and availability/performance monitoring. Since adopting Splunk, Avanza have benefitted from improved availability (core to customer satisfaction), the ability to solve problems twice as fast, reduced development time thanks to DevOps insight, and an enhanced customer support function.


Last, but by no means least, we had Maersk, one of the world’s largest shipping and logistics companies, with operations in oil and gas as well. We were privileged to have Carsten Neubert, Head of Maersk’s Information Security Monitoring Center, on stage. Maersk was faced with the challenge of an increasing amount of security data, new kinds of unstructured security information and the need to cover all their individual business units. They take a three-pronged approach to security monitoring – external monitoring, security device monitoring and monitoring of threat feeds. They use Splunk to identify notable security events, but also to assess the impact of those incidents and what the response should be. Through the use of Splunk they can now correlate across many types of security device. To quote Carsten, “before Splunk, any correlation across our infrastructure had to be done manually. It took a long time. That time has now been reduced to nothing.” Maersk’s CISOs have been empowered through Splunk dashboards and have better insight into the business’ security posture through access to their own data via a dedicated console.



Next it was on to Oslo, where last year I confessed my love of the band A-ha (when I was growing up – I’m obviously much cooler now…). First on stage was Ruben Olsen from a major Norwegian financial institution. They have over 2 million private customers and 200,000 corporate customers, and their online banking services have over a million users. When they went live in 2011, they were generating 1,500,000,000 log events per day (that’s a lot of zeros) from over 2,000 log files across their extensive application portfolio. They had a lot of data in a lot of different places. They use Splunk to:

  • Enhance software quality
  • Search and monitor their machine data in real time
  • Investigate incidents
  • Monitor Application Servers
  • Improve QA before deployment to production

There was an interesting footnote to the presentation about how to best work with an outsourcer and make sure you keep the operation of Splunk under your control but also work collaboratively with the service provider to make sure they use Splunk for their operations. You can find Ruben’s presentation on Slideshare here.

Carrying on the theme of insurance and financial services, next up was Rune Hellem, Senior IT Operations Consultant from KLP, Norway’s largest life insurance company. KLP started using Splunk like a lot of customers do – with troubleshooting. They then expanded into more proactive monitoring of their IT estate, and their use has now grown to the point where they are looking at how Splunk could be used for auditing and compliance. KLP take data from WebSphere, FileNet, IBM Process Server, WebLogic, JIRA, .NET servers and their content management system. Both development and IT operations use Splunk, and the power of its alerting capabilities meant they could save money and reduce complexity by retiring other monitoring systems.

Some of the benefits that KLP have got from using Splunk include:

  • Incident troubleshooting time reduced by at least an hour
  • Reduced impact of incidents on users and on the bottom line
  • Alerts that now notify CSC if there’s an issue
  • Visibility that means they are no longer blamed for issues that aren’t their responsibility

One of the top tips they gave the audience was particularly interesting: “always refer to Splunk when sharing information”. We see this a lot with Splunk – it provides a great “system of record” of what is actually happening (or has happened), and it is very easy to share searches and use the Splunk search language to collaborate on any particular issue.
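To illustrate the “refer to Splunk” tip: because a search is just text, a team can paste it into a ticket or chat and everyone investigating sees exactly the same view of the data. A hypothetical example (index and field names made up for illustration) might look like:

```
index=app_logs log_level=ERROR earliest=-24h
| stats count AS errors BY host, component
| sort - errors
```

Rather than attaching screenshots or log excerpts, sharing the search itself means the results stay live and reproducible – the “system of record” quality mentioned above.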





Last in the mini Nordic tour was Stockholm, one of my favourite cities. We had Avanza Bank and Tobias speak for us again and they were joined by Magnus Norling from a leading online gaming company. The big challenge Magnus’ organisation faced was a lack of visibility. They started using Splunk for PCI compliance but quickly grew their adoption of Splunk to deliver analytical dashboards and proactive monitoring as part of their 24/7 NOC. They manage around 1,200,000,000 events every day in Splunk. Magnus described their situation with an entertaining quote from “The Art of War”:

“Superior commanders succeed in situations where ordinary people fail because they obtain more timely information and use it more quickly.”

Magnus then went on to explain how operational intelligence is key to driving real-time insight from machine and business data and how this is critical in making faster and smarter decisions across an organisation.



The Netherlands

This week started with the final stage of the tour in Amsterdam. The first presenter was Bas Zimmermann, a Technical Project Leader with Stedin, who talked about how they use Splunk for renewable energy, smart metering and Internet of Things data from sensors. It was one of the first presentations where I’d seen a customer really tackle the challenge of combining IoT data, business data and security information to give a complete view of operational intelligence around sensor data. Through the use of Splunk and their machine data, Stedin’s IT team were able to collaborate better with the business thanks to the insights, dashboards and data correlation they could get from the different silos of information.



After Bas was Gert Kremer, a Mission Critical Engineer with Schuberg Philis, a leading managed service and cloud provider in the Netherlands. They are using Splunk as part of their multi-tenant managed service offering. They faced a number of challenges supporting their growing customer base: a lack of flexibility and adaptability in home-grown tools, a growing number of sources of information they needed in order to analyse issues, increasing data volumes, the challenge of ensuring security, and the need to centralise monitoring. On top of this, they needed to deliver the service across two active/active data centres – they are using the multi-site clustering features of Splunk 6.1 to good effect. The cloud solution deployment also needed to be automated through configuration management tools such as Chef, and customers needed self-service access through a portal.
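For context, multi-site index clustering in Splunk 6.1 is driven from server.conf. A minimal sketch of a two-site master node configuration might look like the following – the site names and factor values here are illustrative, not Schuberg Philis’s actual settings:

```ini
# server.conf on the cluster master (illustrative values)
[general]
site = site1

[clustering]
mode = master
multisite = true
available_sites = site1,site2
# keep copies of each bucket in both data centres
site_replication_factor = origin:2,total:3
# make data searchable in each site for local search heads
site_search_factor = origin:1,total:2
```

The site replication and search factors are what let an active/active pair of data centres each serve searches locally while surviving the loss of the other site.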

Schuberg Philis have built a highly effective architecture to deliver their managed cloud offerings. Splunk plays a key part in assuring quality of service and making sure that some of the region’s biggest customers get a high-quality managed cloud service.


Many thanks to all the awesome customer speakers. Apologies for the length of the post if you’ve got this far – there are a lot of customer stories to tell!

It is a few days back in the UK and then off to Las Vegas for .conf2014 – keep an eye out next week for some very, very exciting news and even more customer stories. If you’re going, then see you there.

As always – thanks for reading….



Posted by Matt Davies

Matt is Splunk's VP, Customer Marketing (and part-time Chief Colouring-In Officer). He works closely with Splunk customers to help them understand the value that new insights from machine data can deliver to their business. Matt is also one of Splunk's technical evangelists and communicates Splunk's go-to-market strategy in the region. Previously Matt has worked at Cordys, Oracle/BEA, Elata, Broadquay Consulting, iPlanet/Sun, Netscape and IBM. With nearly 25 years in the software industry, Matt has extensive knowledge of enterprise IT systems.
