What Is Containerization and Will It Spell the End for Virtualization?
By Ron Gidron

Containerization is disrupting the cloud, but what will be the implications for virtual machines?

Containerization is popularly viewed as the 'virtualization of virtualization' or 'next-generation virtualization.' However, container-style isolation existed long before modern virtualization or the advent of today's container technology such as Docker and Linux Containers; similar capabilities were built into the mainframe systems that dominated the IT landscape for decades.

The implication, as the name suggests, is that modern software containerization will have the same seismic impact on the IT industry that shipping containers had on maritime freight. Indeed, many major online companies now run their entire infrastructure on containers.

The reason for the analogy, which Docker even alludes to in its logo, is that just as shipping containers allow different goods to be kept together in transit, software containers allow all the different elements of an application to be bundled together and moved from one machine to another with comparative ease. In essence, applications become extremely lightweight and portable.

Containerization Fundamentals
Containerization enables you to run an application in an isolated environment by packaging all of its files, libraries and other dependencies together as one unit - a container. The container runs directly on the host operating system kernel, so it does not require you to create a new virtual machine every time you want a new instance of the application, or to run any other application that uses the same O/S. Keeping the entire application together means different services can efficiently share the operating system kernel.
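As a minimal sketch of this idea, the snippet below uses the Docker SDK for Python (the docker package) to start a container from a published image and run a one-off command inside it. It assumes a local Docker Engine is running; the image name and command are purely illustrative.

```python
import docker

# Connect to the local Docker Engine (assumes the daemon is running).
client = docker.from_env()

# Run a throwaway container: the image bundles the application's runtime,
# libraries and files, while the kernel is shared with the host.
output = client.containers.run(
    "python:3.12-slim",                      # illustrative base image
    ["python", "-c", "print('hello from inside a container')"],
    remove=True,                             # delete the container once it exits
)
print(output.decode())
```

Because no guest operating system has to boot, the command above returns in seconds; the container is just an isolated process tree with its own filesystem view.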

The rise to prominence of containerization is largely attributable to the development of the open source software Docker. While other container technologies were available previously, Docker brought container workflows to Linux, Unix and Windows. The Docker Engine, for example, enables an application to run on any machine where the engine is installed. With the application bundled in isolation, it can easily be moved to a different machine or operating system as required.
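To illustrate that portability, here is a hedged sketch using the same Python SDK: it builds an image from an in-memory Dockerfile and exports it as a tarball that could be copied to and loaded on another Docker host. The image tag and file names are hypothetical.

```python
import io
import docker

client = docker.from_env()

# An illustrative Dockerfile: a base image plus the command the app should run.
dockerfile = io.BytesIO(b"""
FROM python:3.12-slim
CMD ["python", "-c", "print('portable app')"]
""")

# Build once on this machine; the result is a self-contained image.
image, _ = client.images.build(fileobj=dockerfile, tag="demo-app:1.0")

# Export the image so it can be moved to, and loaded on, any other Docker host.
with open("demo-app.tar", "wb") as tar:
    for chunk in image.save():
        tar.write(chunk)
```

The resulting tarball carries everything the application needs except the kernel, which is why the same image runs unchanged on a laptop, a data center server or a cloud instance.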

How Is It Different from Virtual Machines?
In contrast to containers, a virtual machine requires you to run both a hypervisor and a guest operating system, so every time you want to fire up a new instance of your application you need to provision another full operating system. This creates a number of challenges in terms of:

  • Portability - it becomes difficult to move the application to another virtual machine
  • Speed - provisioning, boot and setup times can be significant
  • Resources - virtual machines take up significantly more space than containers

Evidently, it is possible to support far more containers than virtual machines on the same infrastructure. By wrapping each application in its own operating system, a virtual machine carries far more overhead.
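A quick way to see the shared kernel in practice is to ask containers built from different distributions which kernel they are running. This is a sketch assuming a Linux host with Docker and the Python SDK installed; the image tags are illustrative.

```python
import platform
import docker

client = docker.from_env()

# Containers built from different distributions...
for image in ("alpine:3.19", "debian:bookworm-slim"):
    release = client.containers.run(image, ["uname", "-r"], remove=True)
    print(f"{image:22s} -> {release.decode().strip()}")

# ...all report the host's kernel release, because they share it rather than
# booting a guest OS of their own (a VM would report its own kernel here).
print(f"{'host':22s} -> {platform.uname().release}")
```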

Tech sprawl also becomes an issue for virtual machines: if the O/S is modified or updated in one place, the same change has to be made manually everywhere else. Such a problem does not exist with containerization, which again saves time and money.

Is This the End of Virtualization?
No.
Virtual machines are heavily integrated into the landscape of many major enterprises and the idea of just dumping existing applications into a container is impractical. The architecture needs to be redesigned or containerization simply won't work.

Moreover, virtual machines have several advantages that go beyond the necessary support of legacy applications. Large-scale organizations are extremely heterogeneous, with technology sprawled across many different operating systems and customizations. Furthermore, virtual machines still have a role in enabling large-scale data center infrastructure, since they encapsulate bare-metal servers.

Virtualization, and specifically the hypervisor, provides effective partitioning of different operating systems on the same server. With containerization, by contrast, every container on a server must share the same O/S kernel; newer companies were able to design around this constraint early on, but larger established enterprises do not have that luxury.

Ultimately, containerization is very much here to stay and offers a range of benefits to adopters. The gains in speed, portability and flexibility it brings will reduce the prominence of virtual machines. Virtual machines, however, will still have a role in the future of IT, particularly within large or technically diverse organizations.

