Automated versus Controlled By @DMacVittie | @DevOpsSummit #DevOps

Traditionally, the "review and approve" step is part and parcel of automation

In the currently high-flying world of DevOps, automation is king. We are "automating all of the things" at a pretty astounding rate, and it is increasing productivity by reducing the man-hours invested in repetitive work. All very cool, but there is a dark side to automation, and that's control. While automating the entire datacenter is appealing, immature products sometimes lack the ability to let operators step in and approve changes before rolling them out.

Traditionally, the "review and approve" step is part and parcel of automation. New "just do it" tools are normally equipped with a review step to increase their appeal to datacenter teams during the verification and acceptance phase of rollout.

But DevOps got ahead of itself, and some vendors, wanting to move faster than competitors, started building their tools to take action by default and not provide for that moment of human oversight. This probably works well for the majority of changes, but lacking that "brake", the train can get derailed. Outages at several large web providers that were cutting-edge DevOps shops certainly showed how that can happen. In one case I'm familiar with, the original mistake was a human error, but it propagated so fast that before it could be fixed, thousands of customers were left with service interruptions, and cleanup was slowed, and even muddied, by the automated systems re-propagating the misconfigurations.

So do we stop automating, and roll back DevOps? Absolutely not. But we should proceed with a bit of caution, and we should keep an eye on the ability of our tools to stop and let a human check things out. And of course, we need to make certain we have those check-points designed in, and that there are responsible humans doing the checking.

The idea is simple - let the system do the work in an automated fashion, but keep operators in a position to give go/no-go decisions on the changes being made. That can range from "the system only does what it's told" - like the Stacki project, which I'm familiar with, which will install hundreds of machines at once but install them the way humans (via spreadsheet normally, but also command line or automated tool) tell it to - to a "stop and review" step presented by a project like Terraform by HashiCorp, which includes a "plan" step that allows operators to see exactly what steps will be taken to bring infrastructure to the desired state.
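To make that go/no-go pattern concrete, here is a minimal sketch in Python wrapping Terraform's standard plan/show/apply cycle with a human approval prompt in the middle. It assumes the terraform CLI is on the PATH and the working directory is an already-initialized configuration; the wrapper itself is purely illustrative, not part of Terraform or any particular tool.

```python
# Sketch of a go/no-go gate around Terraform's plan/apply cycle.
# Assumes `terraform` is on PATH and "terraform init" has already been run
# in the working directory.
import subprocess
import sys


def plan_review_apply(workdir: str = ".") -> None:
    """Produce a plan, let a human review it, and apply only on explicit approval."""
    # Write the proposed changes to a plan file instead of applying immediately.
    subprocess.run(["terraform", "plan", "-out=tfplan"], cwd=workdir, check=True)

    # Render the saved plan so the operator can see exactly what would change.
    subprocess.run(["terraform", "show", "tfplan"], cwd=workdir, check=True)

    # The human check-point: nothing touches the infrastructure without a "yes".
    answer = input("Apply these changes? [yes/no]: ").strip().lower()
    if answer != "yes":
        print("Aborted by operator; no changes applied.")
        sys.exit(1)

    # Apply exactly the plan that was reviewed, not a freshly computed one.
    subprocess.run(["terraform", "apply", "tfplan"], cwd=workdir, check=True)


if __name__ == "__main__":
    plan_review_apply()
```

The point is less the wrapper than the discipline it enforces: the plan that was reviewed is the only thing that gets applied.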

In the case of Stacki with spreadsheets, the desired configuration for hundreds or thousands of machines can be placed into the spreadsheet and reviewed by any number of people before it is applied. This offers teams the ability to make certain that major infrastructure build-outs or changes are solid and won't cause massive problems, while allowing automation to do all of the repetitive heavy lifting of configuring RAID and installing the operating systems. In the case of Terraform, the "plan" step allows an experienced operator to quickly review what would be done and compare it to what they feel needs to be done before the changes are applied.
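On the spreadsheet side, a review step can be as simple as a script that summarizes what the spreadsheet asks for before anyone hands it to the installer. The sketch below is illustrative only - the CSV columns (hostname, appliance, rack, ip) are hypothetical, not Stacki's actual file format - but it shows the kind of sanity check a reviewer can run in seconds.

```python
# Illustrative spreadsheet review in the spirit of the workflow described above.
# The column names used here are hypothetical examples.
import csv
from collections import Counter


def summarize_hostfile(path: str) -> None:
    """Print a human-reviewable summary of a desired-state host spreadsheet."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))

    print(f"{len(rows)} machines defined in {path}")

    # Group by appliance type so a reviewer can spot an obviously wrong count
    # (e.g., 500 'backend' nodes where 50 were intended) before anything installs.
    by_appliance = Counter(row["appliance"] for row in rows)
    for appliance, count in sorted(by_appliance.items()):
        print(f"  {appliance:<12} {count}")

    # Flag duplicate IPs - a common spreadsheet mistake that automation would
    # otherwise happily propagate to hundreds of machines.
    ips = Counter(row["ip"] for row in rows)
    dupes = [ip for ip, n in ips.items() if n > 1]
    if dupes:
        print("WARNING: duplicate IP addresses:", ", ".join(dupes))


if __name__ == "__main__":
    summarize_hostfile("hosts.csv")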

In both cases, automation is saving man-hours, but control remains in the hands of operations teams that need to ensure the smooth execution of changes across the datacenter.

There are other tools (indeed, most tools have some such capability, whether on by default or not) that allow for these and other types of checking, but the check-points need to be taken advantage of. Make certain staff is reviewing the critical sections of automation, so that you are not merely increasing the velocity of destruction.

As most DevOps pros will tell you, planning ahead and including check-points in the process can help a lot, both in protecting critical infrastructure and in easing the transition to an automated environment. Some projects will have corner cases and bugs that must be found, and check-points keep those problems from reaching production before they are fixed. In longer-term projects, reviewing only when the system has significant changes - or is introducing significant changes to the overall datacenter - is sufficient.

Automation and DevOps make systems more stable overall while also increasing agility (the two are not mutually exclusive; a good automation system just requires good inputs to increase agility while maintaining uptime), so it takes only a little caution to make certain you're getting the most from the process. Some foresight in tool selection, and some more in process design, and you're set on a path that will save tons of man-hours without increasing overall risk to the organization. That's a good thing. Knowing what you want to check - just like knowing what you want to monitor - is the hard part, but some thought will bring you back with a list of high-risk activities that need to be checked before they're pushed out. Then add highly customized activities, like white-listing if you're including security in your DevOps environment, or configuring RAID controllers, and you've got your basic list of check-points.
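As a rough illustration of what such a check-point list might look like in practice, here is a minimal sketch - the activity names and the tagging scheme are made up for the example, not taken from any particular tool - in which any change touching a high-risk activity is held for operator approval while everything else proceeds automatically.

```python
# Minimal sketch of a check-point policy: automated changes are tagged with
# the activities they touch, and anything on the high-risk list requires an
# explicit human approval before it is pushed out. Activity names are examples.
HIGH_RISK_ACTIVITIES = {
    "security-whitelist",   # white-list changes in a security-aware DevOps flow
    "raid-config",          # RAID controller configuration
    "dns-change",
    "load-balancer-pool",
}


def requires_review(change_tags: set) -> bool:
    """Return True if any tagged activity is on the high-risk list."""
    return bool(change_tags & HIGH_RISK_ACTIVITIES)


if __name__ == "__main__":
    change = {"package-update", "raid-config"}
    if requires_review(change):
        held = sorted(change & HIGH_RISK_ACTIVITIES)
        print("Hold for operator approval:", ", ".join(held))
    else:
        print("Low-risk change; proceed automatically.")
```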

Then go do cool stuff with all the free time it generates (after the initial investment, you don't get free time for free)... And keep kicking it.


More Stories By Don MacVittie

Don MacVittie is founder of Ingrained Technology, a technical advocacy and software development consultancy. He has experience in application development, architecture, infrastructure, technical writing, DevOps, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University and an M.S. in Computer Science from Nova Southeastern University.
