
Containerization is disrupting the cloud, but what will be the implications for virtual machines?

What Is Containerization and Will It Spell the End for Virtualization?
By Ron Gidron

Containerization is popularly viewed as the "virtualization of virtualization" or "next-generation virtualization." However, container-like technology existed long before modern tools such as Docker and Linux Containers (LXC), and even before virtualization itself: comparable isolation mechanisms were built into the mainframe systems that dominated the IT landscape for decades.

The name itself carries an implication: that software containerization will have the same seismic impact on the IT industry as shipping containers had on maritime freight transport. Indeed, it is now quite common for major online companies to run their entire infrastructure on containers.

The reason behind the analogy, which Docker even alludes to in its logo, is that just as shipping containers allowed different products to be kept together while being transported, software containers bundle all the different elements of an application together so they can be moved from one machine to another with comparative ease. In essence, applications become extremely lightweight and portable.

Containerization Fundamentals
Containerization enables you to run an application in an isolated environment by storing all of its files, libraries and other dependencies together as one package - a container. The container runs directly on the host operating system kernel, so it does not require you to create a new virtual machine every time you want a new instance of the application, or to run any other application that uses the same O/S. Keeping the entire application together means different services can efficiently share a single operating system kernel.
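As a concrete sketch of that packaging step, a hypothetical Dockerfile for a small Python service bundles the code and its dependencies into one image. The file names, image name and port below are illustrative assumptions, not details from the article:

```shell
# Write a minimal Dockerfile for a hypothetical Python app (all names illustrative).
cat > Dockerfile <<'EOF'
FROM python:3.12-slim                 # base image provides userland libraries, not a full OS
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # dependencies are baked into the image
COPY . .
CMD ["python", "app.py"]              # the single process this container runs
EOF

docker build -t myapp:1.0 .             # package app + libraries as one image
docker run -d -p 8080:8080 myapp:1.0    # start an isolated instance; no guest OS to boot
```

Note there is no operating system installation step: the container plugs into the host kernel that is already running.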

The rise to prominence of containerization is largely attributable to the development of the open source Docker project. While other container technologies were available previously, Docker introduced a simple, consistent workflow for building, shipping and running containers. The Docker engine, for example, enables a containerized application to run on any machine where the engine is installed. With the application bundled in isolation, it can easily be moved to a different machine as required.
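Under the same illustrative assumptions as above, moving an image between machines is a matter of exporting and importing it:

```shell
# On the original host: export the previously built image (myapp:1.0 is hypothetical).
docker save myapp:1.0 | gzip > myapp-1.0.tar.gz

# Copy the archive to another machine running a Docker engine, then:
gunzip -c myapp-1.0.tar.gz | docker load
docker run -d myapp:1.0    # identical behavior, with no reinstall of the application stack
```

In day-to-day use a registry (`docker push` / `docker pull`) plays this role; `save`/`load` just makes the "portable bundle" idea explicit.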

How Is It Different from Virtual Machines?
In contrast to containerization, a virtual machine requires you to run both a hypervisor and a guest operating system. So every new virtual machine instance of your application carries its own operating system installation. This can create a number of challenges in terms of:

  • Portability - it becomes difficult to move the application to another virtual machine
  • Speed - accessibility and setup times can be significant
  • Resources - virtual machines take up significantly more space than containers

Evidently it is possible to support far more containers than virtual machines on the same infrastructure. By wrapping the entire application in its own operating system, a virtual machine carries far more overhead.

Tech sprawl also becomes an issue for virtual machines: if the O/S is modified or updated on one VM, the change must be applied manually everywhere else. This problem does not exist with containerization, which again saves time and money.

Is This the End of Virtualization?
No.
Virtual machines are heavily embedded in the landscape of many major enterprises, and the idea of simply dumping existing applications into a container is impractical: the architecture needs to be redesigned or containerization simply won't work.

However, virtual machines retain several advantages, and these go beyond the necessary support of legacy applications. Large-scale organizations are extremely heterogeneous, with technology sprawled across a number of different operating systems and configurations. Furthermore, virtual machines still have a role in enabling large-scale data center infrastructure, as they partition bare metal servers into isolated environments.

Virtualization, and specifically the hypervisor, provides effective partitioning between different operating systems on the same server. With containerization, by contrast, every container on a host must share that host's operating system kernel. Newer companies were able to design around this constraint early on; for larger, established enterprises that privilege does not exist.
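The shared-kernel constraint is easy to observe directly. Assuming a Linux host with a Docker engine installed, the kernel version reported inside any container matches the host's, because the container has no kernel of its own:

```shell
uname -r                          # kernel version on the host
docker run --rm alpine uname -r   # same version reported inside the container
```

A virtual machine, by contrast, boots its own kernel and would report whatever guest O/S it was installed with.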

Ultimately, containerization is very much here to stay and offers a range of benefits to adopters. The gains in speed, portability and flexibility it offers will reduce the prominence of virtual machines. However, virtual machines will still have a role in the future of IT, particularly within large or technically diverse organizations.

More Stories By Automic Blog

Automic, a leader in business automation, helps enterprises drive competitive advantage by automating their IT factory - from on-premise to the Cloud, Big Data and the Internet of Things.

With offices across North America, Europe and Asia-Pacific, Automic powers over 2,600 customers including Bosch, PSA, BT, Carphone Warehouse, Deutsche Post, Societe Generale, TUI and Swisscom. The company is privately held by EQT. More information can be found at www.automic.com.
