
Three Ways Data-Driven DevOps Drives Business Success

Fast feedback loops typify high performers in DevOps. But you need the right data to make decisions; otherwise the feedback is all noise and no signal. Here are three key benefits of using data in your DevOps workflows.

1.     Increased Velocity

Data analytics from the delivery pipeline enables DevOps teams to detect and eliminate slowdowns and bottlenecks, so they can deliver applications faster.

To improve velocity in DevOps, it’s important to understand the end-to-end application delivery lifecycle and map the time and effort in each phase—this helps you detect and eliminate gaps and “waste” in the overall system. In DevOps, such insight is all available as data that’s pulled directly and continuously from the tools and systems in the delivery lifecycle.

Planning tools, code repositories, version control tools, build tools, configuration tools, deployment and release tools… the list goes on! Each reports, within its own phase, when releases are created and by whom, how long that phase takes, whether it succeeds, and much more. Yet no single tool shows the whole story.

By analyzing this data in aggregate (in the Splunk platform for example), DevOps teams benefit from an actionable view of the end-to-end application delivery value stream, both real-time and historical. You can use this data to streamline or eliminate the issues that are slowing your process down, and also enable continuous improvement in delivery cycle time.
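As a rough sketch of what that aggregate analysis looks like, the snippet below computes time spent in each delivery phase from pipeline events. The event fields and values are hypothetical illustrations, not a real tool's schema:

```python
from datetime import datetime

# Hypothetical pipeline events pulled from different delivery tools
# (field names and timestamps are illustrative only).
events = [
    {"release": "r42", "phase": "build",  "start": "2024-05-01T10:00", "end": "2024-05-01T10:12"},
    {"release": "r42", "phase": "test",   "start": "2024-05-01T10:12", "end": "2024-05-01T11:40"},
    {"release": "r42", "phase": "deploy", "start": "2024-05-01T11:40", "end": "2024-05-01T11:55"},
]

def phase_durations(events):
    """Return minutes spent in each phase, to help locate bottlenecks."""
    durations = {}
    for e in events:
        start = datetime.fromisoformat(e["start"])
        end = datetime.fromisoformat(e["end"])
        durations[e["phase"]] = (end - start).total_seconds() / 60
    return durations

print(phase_durations(events))
# → {'build': 12.0, 'test': 88.0, 'deploy': 15.0}
```

In this made-up run, the test phase dominates cycle time (88 minutes versus 12 for build and 15 for deploy), which is exactly the kind of bottleneck that aggregate analysis surfaces.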

2.     Better Quality Code

Data analytics from test and QA enables DevOps teams to see all the quality issues in new releases, and remediate before release, not after.

In traditional application delivery models, QA was largely manual, and ensuring high quality was a challenge. In modern DevOps environments, much (or even all) of the QA process is handled by automated testing tools. However, different tools are typically used for “white box” or static analysis (code security, dependencies, comments, policy, quality, and compliance testing); for “black box” or dynamic analysis (functionality, regression, performance, resilience, and penetration testing); and for meta-analysis such as code coverage and test duration.

Again, none of these tools show the whole story, but analyzing the aggregate data enables DevOps teams to make faster and better decisions about overall application quality, even across multiple QA teams and tools. This data can even be fed into further automation to, for example, route failing code back to the developer or squad who contributed it, trigger a code review with a different developer or team, push it forward into staging/pre-prod, all the way into provisioning and release.
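A minimal sketch of that routing automation might look like the function below. The step names, the coverage threshold, and the result shape are all illustrative assumptions, not any particular pipeline's API:

```python
def route_release(test_results, coverage, threshold=0.8):
    """Decide the next pipeline step from aggregated QA data.

    test_results: list of dicts with a "status" key ("pass"/"fail")
    coverage: overall code coverage as a fraction (0.0 to 1.0)
    (All names and thresholds here are hypothetical.)
    """
    if any(r["status"] == "fail" for r in test_results):
        # Failing code goes back to the developer or squad who contributed it
        return "return-to-author"
    if coverage < threshold:
        # Passing but under-tested: trigger a review by a different team
        return "code-review"
    # Clean and well covered: promote toward staging/pre-prod and release
    return "promote-to-staging"
```

For example, `route_release([{"status": "fail"}], 0.9)` would return `"return-to-author"`, keeping failing code out of the release path automatically.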

This ability to “shift left” with QA enables rapid go/no-go decisions based on real-world data, and dramatically improves the quality of the code that does make it into production by ensuring failing or poor-quality code never makes it in front of your customers.

3.     Stronger Business Impact

Data analytics from real-world customer experience enables DevOps teams to reliably connect application delivery with business goals.

It’s also critical to connect IT and application delivery with business data. While IT needs data on speeds and feeds, the business needs data on the impact of new releases on metrics like user signups/cancellations, cart fulfillment/abandonment, and especially revenue. No one source provides a complete view of this data, as it’s isolated across applications, middleware, web servers, mobile devices, wire data, POS systems, external APIs, and more.

But analyzing the aggregate data to generate business-relevant impact metrics enables DevOps teams to try new things, innovate in increments, see the results, compare with business goals, and iterate quickly – pulling back from “bad” releases and doubling down on “good” ones. This is the key behind “fail fast, fail small, fail cheap” – a core tenet behind successful innovation.
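The “pull back from bad releases” decision can be sketched as a simple before/after comparison of business metrics. The metric names, figures, and tolerance below are invented for illustration:

```python
def release_verdict(before, after, tolerance=0.05):
    """Compare business KPIs before and after a release.

    Recommend a rollback if any metric degrades by more than
    `tolerance` (a hypothetical 5% threshold); otherwise keep it.
    """
    for metric, baseline in before.items():
        change = (after[metric] - baseline) / baseline
        if change < -tolerance:
            return f"rollback: {metric} down {abs(change):.0%}"
    return "keep"

# Illustrative numbers: signups rose 4%, but revenue fell 8%
before = {"signups": 1000, "revenue": 50000}
after = {"signups": 1040, "revenue": 46000}
print(release_verdict(before, after))
# → rollback: revenue down 8%
```

Tying delivery decisions to business KPIs like this is what makes “fail fast, fail small, fail cheap” actionable rather than aspirational.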

Individually, many different tools all tell a part of the story. Only by analyzing data from all these tools can you get to a truly end-to-end, data-driven approach to application delivery. This ability to see the impact of your work in context of the larger application delivery system makes an incredible difference to the effectiveness of DevOps teams. Learn more or find me on Twitter.

Posted by Andi Mann

Andi Mann, Chief Technology Advocate at Splunk, is an accomplished digital business executive with extensive global expertise as a strategist, technologist, innovator, marketer, and communicator. For over 30 years across five continents, Andi has built success with startups, enterprises, vendors, governments, and as a leading research analyst. Andi has been named to multiple ‘Top …’ lists and is the co-author of two popular books, ‘Visible Ops – Private Cloud’ and ‘The Innovative CIO’.
