
Is Docker the Right Abstraction for PaaS?

Docker has been adopted quickly by nearly everyone, from cloud technologies to continuous integration and build systems to solo developers working exclusively on their laptops. Heck, even Microsoft is getting in on this! Docker was born in PaaS (at dotCloud), and PaaS is the place where it makes a lot of sense: ephemeral, fast-starting, single-process containers that can be distributed across a large cluster are exactly where Docker shines.

Docker has been Stackato's container implementation for a year now, responsible for provisioning and managing the life-cycle of who knows how many Linux containers. The next question is how we start exposing Docker features to end users, rather than keeping them as an unexposed implementation detail. These features bring portability: a simple packaging mechanism for building and distributing an application in a consistent way, not only across a specific PaaS but anywhere that Docker runs.
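
To make that life-cycle concrete, here is a minimal sketch using the Docker SDK for Python (the `docker` package). This is not Stackato's implementation - just an illustration of the ephemeral, run-and-discard pattern described above:

```python
import docker

# Connect to the local Docker daemon using environment settings.
client = docker.from_env()

# Provision an ephemeral, single-process container, wait for it to
# finish, read its output, then throw it away.
container = client.containers.run(
    "python:3", ["python", "-c", "print('hello from a container')"],
    detach=True)
container.wait()                  # block until the process exits
print(container.logs().decode())  # -> hello from a container
container.remove()                # ephemeral: nothing left behind
```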

Docker seems like the obvious choice for PaaS. Engineers building PaaS solutions are excited by it and many developers are banging down the door demanding it. There is no doubt every PaaS worth its salt will, at some point in the near future, implement mechanisms for developers to drop in their pre-built Docker images - or at least a Dockerfile.
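
To picture what that could look like from the platform side, here is a rough sketch, again using the Docker SDK for Python, of staging a developer-supplied Dockerfile into a running container. The directory and tag names are invented for illustration:

```python
import docker

client = docker.from_env()

# Build a developer-supplied Dockerfile into a runnable image.
# "./my-app" and "my-app:v1" are placeholder names.
image, build_log = client.images.build(path="./my-app", tag="my-app:v1")
for line in build_log:            # surface the build output
    print(line.get("stream", ""), end="")

# The platform then runs the image like any other container.
container = client.containers.run("my-app:v1", detach=True)
print("started", container.short_id)
```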

But let's take a step back from this euphoria for a minute and look at the bigger picture. Is this really the right abstraction for developers? Docker brings a lot to the table, but as with everything there are pros and cons. So what do we lose here?

One of the promises of PaaS has always been to support developers in doing that thing that they do well. They write application code. An experienced JavaScript, Python, Go, Ruby or Java programmer will be able to tell you all the quirks and pitfalls of that language. They will be able to build almost anything you can imagine, given the right unreasonable deadline. A great software developer will be able to choose the best programming language for the job. What developers are not necessarily good at, and shouldn't have to be, is taking a bare-metal or virtual machine and building up the software stack that supports their code. PaaS has solved this problem: through standardization on Buildpacks, we have an abstraction that can be replaced or changed whenever needed.

There are open-source Buildpacks for most programming languages (even COBOL!). These have been built and evolved by experts in those languages. The Buildpacks deal with the low-level system dependencies and everything about configuring the runtime on which these applications run. Low-level system dependencies should never be the concern of the software developer. Cloud Foundry-based PaaSes, like Stackato, also remove the need for developers to know or care what a Buildpack actually is. The administrator of the cluster can install all the commonly used language and framework stacks via Buildpacks, and the PaaS will select the best Buildpack for the job - this is a mechanism of the Buildpacks themselves, not the PaaS.
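
The selection mechanism is pleasantly simple: each Buildpack ships a bin/detect script that exits zero (and prints a framework name) when it recognizes an application. A minimal sketch of that loop in Python, assuming Buildpacks laid out in the standard Heroku/Cloud Foundry structure:

```python
import subprocess
from pathlib import Path

def select_buildpack(app_dir, buildpacks):
    """Return the first Buildpack whose bin/detect accepts the app.

    Mirrors the Heroku/Cloud Foundry convention: bin/detect exits 0
    and prints the detected framework name on stdout.
    """
    for bp in buildpacks:
        detect = Path(bp) / "bin" / "detect"
        result = subprocess.run([str(detect), app_dir],
                                capture_output=True, text=True)
        if result.returncode == 0:
            print("Detected %s via %s" % (result.stdout.strip(), bp))
            return bp
    return None  # no Buildpack recognized this application

# e.g. select_buildpack("/apps/my-app", ["/bp/ruby", "/bp/python"])
```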

With Docker, in regard to PaaS support, we are expecting developers to bring their own pre-built Docker images. Unfortunately, this means we are going backwards: we are now telling developers that they must create their stack themselves, outside of the platform. The low-level system dependencies within the container are once again the domain of the developer. Time spent figuring this out is redundant, is not the best use of an engineer's expertise, and is prone to error.

There are mechanisms and evolving best practices for building Docker images that provide certain software stacks, removing the onus on developers to understand the low-level details. Some of these solutions are close to, or leverage, Buildpacks themselves. This is the right direction, but it still requires systems that sit outside the consistent, managed platform of PaaS, and that leads to potential fragmentation across a large organization.

The second issue with Docker images being built outside of the PaaS should concern Operations teams the most. Vulnerabilities are a growing fact of life in managing deployed software; we have seen many this year. When a vulnerability is announced, Operations needs to know two things: 1) are we affected? and 2) if so, how do we patch our systems quickly?

One direction Buildpack development is heading is metadata accountability. Currently, a Buildpack will look at the application code, decide which dependencies need to be installed and install them. After that, knowledge of what is installed in that container is lost. Retaining this information will be very powerful moving forward. For instance, as a PaaS administrator I should be able to query the system to find out exactly which applications are running a specific version of the Ruby or Java runtime binaries. Having this information at your fingertips the second a vulnerability is announced will be incredibly powerful.
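
To illustrate, here is the shape of the query that retained staging metadata would enable. The records and helper below are entirely hypothetical (as noted above, Buildpacks discard this knowledge today), but they show the question an administrator wants answered:

```python
# Hypothetical staging records a Buildpack could emit and the PaaS retain.
STAGING_RECORDS = [
    {"app": "billing-api",  "runtime": "ruby", "version": "2.1.2"},
    {"app": "auth-service", "runtime": "ruby", "version": "1.9.3"},
    {"app": "web-frontend", "runtime": "node", "version": "0.10.29"},
]

def apps_running(runtime, version):
    """Which deployed applications run this exact runtime version?"""
    return [r["app"] for r in STAGING_RECORDS
            if r["runtime"] == runtime and r["version"] == version]

# A CVE lands against Ruby 1.9.3; Operations gets an instant answer:
print(apps_running("ruby", "1.9.3"))  # -> ['auth-service']
```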

When we package our Docker images outside of the system and pass them into the PaaS, we are essentially giving the Operations team a black box, and there is no easy way for them to determine everything that may be installed in that black box. This is a big problem for a large-scale organization when a serious vulnerability is announced. These are concerns that ActiveState is thinking about, and we currently use Buildpacks to solve this issue. Unfortunately, we see few solutions for Docker images here yet.
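
The closest an operator can get with a pre-built image is its layer history, which shows the commands that built each layer but not the dependencies those commands actually resolved. A short sketch with the Docker SDK for Python (the image name is a placeholder):

```python
import docker

client = docker.from_env()
image = client.images.get("example/black-box:latest")  # placeholder name

# Layer history is all we get: an "apt-get install ..." line tells us a
# package manager ran, not which library versions ended up inside.
for layer in image.history():
    print(layer.get("CreatedBy", "<unknown>"))
```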

I see most PaaSes supporting both Docker images and Buildpacks for the near future. The demand for exposed Docker integration, and the flexibility Docker provides, makes this a no-brainer. But Buildpacks still offer a great deal of value, and enterprise-grade features such as accountability are being built into them. It will be up to each enterprise to decide which solution works better for them.



Phil Whelan has been a software developer at ActiveState since early 2012 and has been involved in many layers of the Stackato product, from the JavaScript-based web console right through to the Cloud Controller API. He has been the lead developer on kato, the command-line tool for administering Stackato. His current role at ActiveState is Technology Evangelist.
