Sparking True Software-Based Business Innovation By @Mik_Kersten | @DevOpsSummit #DevOps

The software industry is starting to go through a transformation in our understanding of the use of metrics

I've long taken inspiration from Peter Drucker's caution that "if you can't measure it, you can't manage it." It's no wonder that leaders of business disciplines such as HR, finance, marketing and sales consistently use metrics, reports and dashboards to assess their business and aid in decision-making.

The software industry is starting to go through a similar profound transformation in our understanding of the use of metrics to better understand the business implications of how software is built. But before this transformation can succeed, we have to work through some pretty big shortcomings in how we determine the types of insights we wish to gain, how we select the analytics necessary to deliver those insights, and how to capture and manipulate the data to present those analytics.

Exactly 40 years have passed since Frederick Brooks cautioned that measuring software in terms of man-months was a bad idea. Building software is a complex undertaking with intricate interactions and nuances. This complexity makes the notion of measuring its progress simply in terms of hours of human effort plainly ridiculous. Nearly everyone in the software industry who has read that book agrees with its premise. Yet so many in the software industry are still measuring software delivery in terms of man-months, FTEs and equivalent cost metrics - and they are as misleading as Brooks predicted.

But if we in the software industry know these metrics are a poor substitute for the "correct" set of management data, why do we continue to use them? There are several factors at play, many of which are underscored by an all-too-common analytical approach: when starting any kind of data reporting project, we often begin by looking at the available data and figuring out what to do with it. We examine this data to see if we can manipulate it in some way (calculating averages, plotting trends, etc.) to derive interesting information.

Yet this is precisely the wrong way to go about setting up a metrics program. The more effective way to find actual insights is to first understand what is important to the business and only then determine how to create the analytics required to illuminate those insights. The final step is to figure out how to get your hands on the data.

For example, if all we have in front of us is data about our defects, it should be fairly straightforward to determine things like: How quickly are they being fixed when categorized by severity? Has the velocity of our bug fixing changed over time?
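
As a quick illustration - a minimal sketch, not from the article, using hypothetical field names - those two defect measures could be computed from a plain list of defect records exported from an issue tracker:

from collections import defaultdict
from datetime import datetime

# Hypothetical defect records as they might be exported from an issue tracker.
defects = [
    {"severity": "critical", "opened": "2015-01-05", "fixed": "2015-01-07"},
    {"severity": "major",    "opened": "2015-01-10", "fixed": "2015-01-20"},
    {"severity": "critical", "opened": "2015-02-01", "fixed": "2015-02-02"},
]

def days_to_fix(defect):
    # Elapsed days between a defect being opened and being fixed.
    opened = datetime.strptime(defect["opened"], "%Y-%m-%d")
    fixed = datetime.strptime(defect["fixed"], "%Y-%m-%d")
    return (fixed - opened).days

# Average time-to-fix, categorized by severity.
by_severity = defaultdict(list)
for d in defects:
    by_severity[d["severity"]].append(days_to_fix(d))
for severity, durations in by_severity.items():
    print(severity, sum(durations) / len(durations), "days on average")

# Bug-fixing velocity over time: defects fixed per month ("YYYY-MM").
fixed_per_month = defaultdict(int)
for d in defects:
    fixed_per_month[d["fixed"][:7]] += 1
print(dict(fixed_per_month))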

This information is certainly worthwhile, but is it "insightful" information?

Some of the most interesting insights transcend the individual disciplines within software development. For example: What types of features result in the greatest number of customer complaints? What parts of the codebase result in the greatest number of defects? What enhancements provided the greatest increase in revenue?
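
Here is a minimal sketch of the kind of cross-discipline correlation those questions imply, assuming complaints, defects and revenue events can each be tagged with the feature, module or enhancement they relate to (all records and field names below are hypothetical):

from collections import Counter

# Hypothetical records drawn from three different systems.
complaints = [{"feature": "checkout"}, {"feature": "checkout"}, {"feature": "search"}]
defects = [{"module": "payments"}, {"module": "payments"}, {"module": "ui"}]
revenue = [{"enhancement": "one-click-buy", "delta": 120000},
           {"enhancement": "dark-mode", "delta": 4000}]

# Which features generate the most customer complaints?
print(Counter(c["feature"] for c in complaints).most_common())

# Which parts of the codebase produce the most defects?
print(Counter(d["module"] for d in defects).most_common())

# Which enhancements produced the greatest increase in revenue?
print(sorted(revenue, key=lambda r: r["delta"], reverse=True))

None of this is hard once the three data sets live in one place; the hard part, as the rest of this article argues, is getting them there.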

To create these analytics, an organization has to create a business intelligence (BI) infrastructure for software development. This infrastructure pulls data from various tools used by the extended software development and delivery team, manipulates that data, and renders it in a report or data visualization that allows managers to easily understand what they are seeing.

Over the past year I've had the benefit of meeting face-to-face with IT leaders in over 100 different large organizations and having them take me through how they're approaching the problem of getting their hands on the relevant information. The consistent theme is that to establish meaningful measurement for software delivery, we need the following layers of tools, where each layer is supported by the layer below it:

1) Presentation layer

  • Report generation
  • Interactive visualization
  • Dashboards and wallboards
  • Predictive analytics

2) Metrics layer

  • Business value of software delivery
  • Efficiency of software delivery

3) Data storage layer

  • Historical cross-tool data store
  • Data warehouse or data mart

4) Integration infrastructure layer

  • Normalized stream of data and schema updates from each tool

The presentation layer (1) is not the problem. This is a mature space filled with innovative new offerings, as well as a myriad of hardened enterprise BI tools. What these generic tools lack is any domain understanding of software delivery. This is where the need for innovation in the metrics layer (2) comes in.

Efforts to establish software delivery metrics have been around as long as software itself, but given the vendor activity around them and the advances being made in lifecycle automation and DevOps, I predict that we are about to go through an important round of innovation on this front. Combining software delivery metrics with business value metrics is an even bigger opportunity, and one where the industry has barely scratched the surface.

Some forward-thinking IT leaders are already correlating business metrics, such as sales and marketing data, with some basic software delivery measures. While a lot of innovation is left on this front, it's clear that the way that the data is manifested in the storage layer (3) must support both the business and the software delivery metrics.

Thanks to the huge investment that vendors and VCs are making in big data technology, the data storage layer (3) has a breadth of great commercial and open source options to choose from. Of course, the one that's most appropriate depends on the existing data warehouse/mart investment that's in place, as well as the kind of metrics the organization is after. For example, efficiency trend metrics may lend themselves best to a time-series-oriented store such as MongoDB, while a traditional relational database can suffice for compliance reports.
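
As a rough sketch of the time-series option - assuming the pymongo driver and a hypothetical cycle_time collection, not any particular product's schema - an efficiency trend metric could be stored and summarized along these lines:

from datetime import datetime, timezone
from pymongo import MongoClient  # assumes the pymongo driver is installed

client = MongoClient("mongodb://localhost:27017")    # hypothetical connection string
metrics = client["delivery_metrics"]["cycle_time"]   # hypothetical database/collection

# One document per measurement: a time series of delivery efficiency.
metrics.insert_one({
    "team": "payments",
    "metric": "story_cycle_time_days",
    "value": 6.5,
    "recorded_at": datetime.now(timezone.utc),
})

# Monthly trend of average cycle time per team.
pipeline = [
    {"$group": {
        "_id": {"team": "$team",
                "month": {"$dateToString": {"format": "%Y-%m", "date": "$recorded_at"}}},
        "avg_cycle_time": {"$avg": "$value"},
    }},
    {"$sort": {"_id.month": 1}},
]
for row in metrics.aggregate(pipeline):
    print(row)

The equivalent compliance report could just as easily be a SQL query over a relational warehouse; the point is to match the store to the metric.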

Many of the leaders I've spoken with have already attempted to create end-to-end software lifecycle analytics, and they found that one of the biggest impediments to creating meaningful lifecycle metrics is the ability to "simply" get the data out of the various tools used across the entire software development and delivery lifecycle and put it in the data storage layer (3) for manipulation by the layers above.

This is the work of the integration infrastructure layer (4). In the past, this may have been achieved by ETL processes, with data extracted directly from the database schemas of the individual tools. But that approach has not only proven brittle, it also delivers data only at batch processing intervals, not in real time. A more robust approach is to access the data through each tool's API. Unfortunately, each vendor has its own massive API set and highly customizable schemas, and common process models and standards efforts, while important, are years away from sufficiently broad adoption.
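
A sketch of the API-based approach, using an entirely hypothetical issue-tracker endpoint and token (real endpoints, authentication schemes and payloads vary by vendor):

import requests  # assumes the requests library is available

# Hypothetical endpoint and token; every vendor exposes its own API and schema.
BASE_URL = "https://tracker.example.com/api/issues"
TOKEN = "replace-with-a-real-token"

def fetch_changed_issues(since_iso_timestamp):
    # Ask the tool's API for artifacts changed since the last sync,
    # rather than reading its database schema directly.
    response = requests.get(
        BASE_URL,
        params={"updatedSince": since_iso_timestamp},
        headers={"Authorization": "Bearer " + TOKEN},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

for issue in fetch_changed_issues("2015-01-01T00:00:00Z"):
    print(issue)

Polling a call like this on a short interval, or reacting to the tool's webhooks where they exist, is what keeps the data storage layer close to real time.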

This integration infrastructure layer has to provide the ability to connect to the various tools' data repositories, detect when changes are being made (to provide real-time data) and update the data storage layer. The data storage layer must reflect a unified view of all the artifacts across the SDLC, independent of the tools that originally created the data.
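
One way to picture that unified, tool-independent view is a normalized artifact record that every connector maps into before updating the data storage layer. The following is a sketch under assumed field names, not any vendor's actual schema:

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Artifact:
    # A normalized SDLC artifact, independent of the tool that created it.
    artifact_id: str     # stable ID in the unified store
    source_tool: str     # e.g. "issue-tracker", "scm", "service-desk" (examples only)
    source_id: str       # the artifact's ID inside the originating tool
    artifact_type: str   # "defect", "story", "commit", "incident", ...
    status: str
    created_at: datetime
    updated_at: datetime

def on_change(raw_record, mapper):
    # Called whenever a connector detects a change in a tool; `mapper` is that
    # connector's translation from the tool-specific record into an Artifact.
    artifact = mapper(raw_record)
    # Upsert into the historical cross-tool data store (layer 3) here.
    return artifact

The layers above then only ever see normalized artifact records, so a report never needs to know which tool a given defect or story originally came from.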

With this infrastructure in place, IT leaders will have access to a new set of data. They will have moved from a very narrow set of data that is captured, manipulated and reported on within the confines of individual tools, to data that is not only unified across the entire SDLC, but that can also span business units outside of software development.

When supplied with this kind of rich data set, the industry can bring together business leaders, software delivery thought-leaders and data scientists to finally go beyond the tactical software development reports of yesterday and move toward the cross-discipline business insights that are necessary to spark true software-based business innovation.

More Stories By Mik Kersten

Dr. Kersten is the CEO of Tasktop Technologies, creator and leader of the Eclipse Mylyn open source project, and inventor of the task-focused interface. His goal is to create the collaborative infrastructure to connect knowledge workers in the new world of software delivery. At Tasktop, Mik drives the company's strategic direction, key partnerships, and culture of customer-focused innovation. Prior to Tasktop, Mik launched a series of open source tools that changed the way software developers collaborate. As a research scientist at Xerox PARC, he created the first aspect-oriented development tools for AspectJ. He then created the task-focused interface during his PhD research and validated it with the release of Mylyn, now downloaded 2 million times per month. Building on the success of Mylyn, he created the Tasktop Dev and Sync product lines.

Mik's ideas on Application Lifecycle Management (ALM) and focus on individual knowledge worker needs make him a popular keynote speaker; he has been recognized with awards such as the JavaOne Rock Star and the IBM developerWorks Java top 10 writers of the decade. Mik's entrepreneurial contributions have been acknowledged by the 2012 Business in Vancouver 40 under 40, and as a World Technology Awards finalist in the IT Software category. Building on his contributions as one of the most prolific committers to Eclipse, he serves on the Eclipse Foundation's Board of Directors and web service standards bodies.
