How Does DevOps Fit with ITIL?
By Neil McEvoy

DevOps represents the fusion of the traditionally distinct departments of software engineering and IT operations. The goal is a faster and safer rate of software innovation.

A simple objective, but a troublesome one in practice. As many experts explain, the core issue is that the two departments are directly opposed: one is tasked with creating as much software change as possible, the other with minimizing change as much as possible. Even a single misplaced full stop in a URL in a line of code can break an entire system, and deploying complex enterprise applications multiplies this risk by many orders of magnitude.

As DevOps guru Gene Kim describes in this article on the integration of DevOps with ITIL:

DevOps aims to address a core, chronic conflict that exists in almost every IT organization.  It is so powerful that it practically pre-ordains horrible outcomes, if not abject failure. The problem? The VP of Development is typically measured by feature time to market, which motivates as many changes, as quickly as possible.  On the other hand, the VP of IT Operations is typically measured by uptime and availability.

The discussion of the fit with ITIL is a perfect starting point for building an Enterprise DevOps practice, as ITIL's processes are well established and well understood across many large organizations.

Gene makes the critical point that, rather than being in competition, these are two halves of the same jigsaw. He characterizes DevOps as an agile philosophy plus practical tools such as Puppet, without a formal documentation base, whereas ITIL is exactly that formal base. The common complaints about ITIL are that it is too 'heavy' and too bureaucratic, so the DevOps movement can be seen as an extension of ITIL practice: achieving the processes ITIL describes more effectively, through automation tools built for that purpose.
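To make this concrete, here is a minimal sketch (in Python, with hypothetical names; not from the original post) of what "automating an ITIL process" might look like: a deployment step that writes an ITIL-style change record before applying a change, so every change is logged and auditable in the way ITIL change management prescribes, but executed automatically the way DevOps tooling executes configuration changes.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: an automated pipeline step that logs an
# ITIL-style change record around every deployment, so the change
# management process is enforced by automation rather than paperwork.

@dataclass
class ChangeRecord:
    summary: str
    requested_by: str
    status: str = "pending"
    logged_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def deploy_with_change_record(summary: str, requested_by: str,
                              apply_change, audit_log: list) -> ChangeRecord:
    """Log the change, apply it, and record success or failure."""
    record = ChangeRecord(summary, requested_by)
    audit_log.append(record)          # the record exists before the change runs
    try:
        apply_change()                # stand-in for the real deployment step,
                                      # e.g. triggering a Puppet run
        record.status = "implemented"
    except Exception:
        record.status = "failed"      # failed changes stay in the audit trail
        raise
    return record

audit_log = []
record = deploy_with_change_record(
    "Deploy web app v2.1", "dev-team",
    apply_change=lambda: None,        # placeholder for an actual change
    audit_log=audit_log,
)
print(record.status)  # "implemented"
```

The design point is that the process step (record the change) is inseparable from the technical step (apply the change), which is how automation reconciles the developers' appetite for change with operations' need for control.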

The post How does DevOps fit with ITIL? appeared first on Cloud Best Practices.


About the Cloud Best Practices Network

The Cloud Best Practices Network is an expert community of leading Cloud pioneers. Follow our best practice blogs at http://CloudBestPractices.net
