Interview: Docker and Containerization | @DevOpsSummit [#DevOps]

Company CEO Ben Golub on This, That, and The Matrix from Hell

Who doesn't like a clever turn of phrase? Docker CEO Ben Golub provided several at the recent Red Hat Summit in answering our questions about virtualization, Big Data, and DevOps.

With Cloud Expo now approaching, I see a lot of the big issues that will be raised there being raised here. Ben, take it away...

Roger: Could you explain for our audience the significance of containerization, and how it differs from virtualization?

Ben: Traditional virtualization was created over a decade ago, when applications were long-lived, monolithic, and deployed to a single server. In that world, when the problem to be solved was the proliferation of single-purpose physical servers -- e.g., one server for Microsoft Exchange, one server for Mac print services, one server for a custom Unix inventory system -- it made sense to turn all of those single-purpose physical servers into single-purpose virtual servers.

The result was the VM, which takes an application measured in megabytes, combines it with a guest operating system measured in gigabytes, emulates disk and other hardware, and creates a heavyweight, relatively static unit that runs on top of a hypervisor.

Today, applications are short-lived and modified constantly. They are assembled from multiple loosely coupled components built on a multitude of stacks, and they are deployed to large numbers of different servers.

Dockerization/containerization provides a much better alternative to virtualization for this kind of environment. Docker provides isolation, but in a lightweight format that runs directly on the host's operating system, and that can be easily modified or updated.

Docker enables containerized apps or components to work consistently together, work seamlessly across multiple different hosts, and do so often with 10x greater density than VMs. The same Docker container can be deployed, without modification, in milliseconds to a VM, to a bare metal server running RHEL, to a bare metal server running Ubuntu, to Amazon, to Rackspace, to an OpenStack cluster, etc.
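
To make this concrete, here is a minimal sketch using the Docker CLI; the image name "myapp" and the Dockerfile are illustrative assumptions, not something from the interview. The point is that one image, built once, runs unchanged on any of the hosts Ben lists:

    # Build an image once from a Dockerfile in the current directory
    # ("myapp" is a hypothetical image name)
    docker build -t myapp:1.0 .

    # Run the identical image, unmodified, on any Docker host:
    # a VM, bare metal RHEL or Ubuntu, Amazon, Rackspace, OpenStack, etc.
    docker run -d --name myapp myapp:1.0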

So in my opinion, containerization and Docker have the potential to revolutionize how applications are built, managed, and deployed.

If you use an Android phone, you are already using containers. Every application on an Android device is containerized. Docker takes this concept to the far more complicated and sophisticated world of back-end data center applications.

Roger: How critical is the real-time aspect of modern IT? How quickly is it growing?

Ben: It's absolutely critical. Seconds matter in terms of time to deploy and update. Milliseconds matter in terms of scaling.

Docker enables customers to take a development > test > stage > deploy cycle that used to take weeks and shrink it to seconds or minutes. And we make it possible to scale applications and burst them across clouds in fractions of a second.
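
As a rough sketch of that cycle (the image name and test command are hypothetical), each step below is a single CLI invocation that completes in seconds once image layers are cached:

    # Build: package the application into an image
    docker build -t myapp:candidate .

    # Test: run the test suite inside a disposable container
    # ("make test" stands in for whatever the project's test command is)
    docker run --rm myapp:candidate make test

    # Deploy: start the very same image that was just tested
    docker run -d --name myapp-prod myapp:candidate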

Roger: How key is the role of Big Data in developing your solutions? How important is the term Big Data to you?

Ben: Although we aren't a Big Data solution per se, we are a great solution for many Big Data problems. In a Big Data application, the same application is rapidly scaled across hundreds or thousands of machines and often scaled down just as quickly.

A Docker container, which is lightweight, easy to modify, and easy to migrate to any server, is far more appropriate for a scale-out, performance-sensitive Big Data application than a VM.
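
A hedged illustration of that scale-out pattern using nothing but the CLI (the names are hypothetical, and a real deployment would spread this across many hosts with an orchestration layer):

    # Scale out: launch many identical workers from one image,
    # each starting in milliseconds rather than minutes
    for i in $(seq 1 100); do
        docker run -d --name worker-$i myapp:1.0
    done

    # Scale back down just as quickly
    for i in $(seq 1 100); do
        docker rm -f worker-$i
    done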

Roger: How do the issues outlined above affect DevOps? What skills must organizations have to develop successfully in this intense environment today?

Ben: Today's organizations need to take complex and constantly changing applications and deploy them across complex and constantly changing production environments. Think of multiple applications, multiple versions, multiple components, and multiple languages being made to work across VMs, public clouds, private clouds, OpenStack clusters, and customer environments, and you quickly get what we like to call the matrix from hell.

This is compounded by the fact that you need a meeting of the minds between developers, who like to try new things and make changes, and operations, who like things consistent, repeatable, secure, and scalable.

Docker solves this problem by providing a clean separation of concerns. Developers can change things "inside" the container. But, the "outside" of Docker containers remains the same; all Docker containers stop, start, migrate, log, etc. the same way. This turns a DevOps nightmare into a DevOps dream come true.
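
In CLI terms, that uniform "outside" means operations can manage every container with the same few verbs, regardless of what developers put inside (the container names here are hypothetical):

    # The same operational commands work for any container,
    # whether it holds a Java service, a Python app, or a database
    docker stop    billing-service
    docker start   billing-service
    docker logs    analytics-worker
    docker inspect inventory-db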


More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo & @ThingsExpo, and Editor of SYS-CON Media's Cloud Computing, Big Data & IoT Journals. He holds a BA from Knox College & conducted MBA studies at CSU-East Bay.
