Docker + Stackato: The Perfect Workload Portability Solution

Looking to ease application development and deployment while also retaining maximum flexibility in deployment location?

If you work in technology, you'd have to have been living under a rock not to have heard about Docker. In a nutshell, Docker provides a lightweight container for code that can be installed on a Linux system, supplying both an execution environment for applications and partitioning that securely segregates sets of application code from one another. While this high-level description doesn't sound all that exciting, Docker addresses three key issues confronting application developers:

  • Efficient resource use: One of the problems confronting IT organizations is how to get the most benefit from computing resources -- in other words, how to raise server utilization so that each machine's cost and power go toward useful computing rather than toward keeping an idle server running. The previous solution to this issue was virtualization, which enabled a single server to support multiple virtual machines, each containing an operating system and a software payload. While virtualization helps address utilization, running multiple virtual machines, each with its own operating system, means that a large share of the server's resources is tied up running operating systems rather than application code, which is where all the value resides. Said another way, the operating system is a necessary evil, but it's not where business value resides. A solution that reduces the proportion of a server's processing capacity devoted to running operating systems would be extremely valuable. Docker is that solution -- it requires only one operating system per server and uses containers to provide the segregated execution environments that individual virtual machines previously provided. My colleague Phil Whelan used the analogy of a server as a jar, and of choosing sand rather than marbles to fill the jar most efficiently; in the same way, containers are more efficient at optimizing overall server use and waste less computing capacity (i.e., leave less "wasted space in the jar") than virtualization.
  • Workload encapsulation: A container offers exactly what it sounds like -- an environment to hold something. In the case of Docker, it holds a set of executable code that runs inside the container. Because the container encapsulates the execution code, it can be transferred from one location to another. This simplifies the application lifecycle: containers can be passed from one group to another, with no need for each group to recreate the same application in its own environment through recompiling and repeated configuration.
  • Workload portability: It's a fact of life that businesses use a variety of application deployment environments -- a single company may deploy applications into an on-premises VMware vSphere environment, a virtual private cloud run by an OpenStack-based provider, and Amazon Web Services. Each uses a different hypervisor and a different set of operational controls, which presents a challenge to organizations that want greater flexibility and choice in where workloads run. The previous vendor solution to this issue was OVF -- the Open Virtualization Format -- which promised workload portability but in practice ended up being a mechanism for transporting proprietary virtual machine images along with operational metadata. That reduced the vision of true workload portability to vendor-constrained islands of technology homogeneity, which didn't really address end-user objectives at all. By contrast, Docker containers are easily transported and run on any hypervisor environment that supports Linux -- which is all of them (see the sketch just after this list). Docker is therefore a much better solution to workload portability and addresses a key user desire. You'll hear much more about how Docker enables workload portability over the coming months and years.
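
To make the container mechanics described above concrete, here is a minimal sketch (not from the original post) using the Docker Engine SDK for Python; it assumes the "docker" Python package is installed and a Docker daemon is reachable through the standard environment settings. The image it runs is an arbitrary example, and the same image could be moved unchanged to a vSphere VM, an OpenStack cloud, or AWS.

```python
# Minimal sketch: run a containerized workload via the Docker Engine SDK for
# Python. Assumes "pip install docker" and a running Docker daemon.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run a throwaway container from a stock image. The same image, unchanged,
# runs on any Linux host with a Docker daemon -- the portability point above.
output = client.containers.run(
    "alpine",                           # example image; any Linux image works
    ["echo", "hello from a container"],
    remove=True,                        # clean up the container after it exits
)
print(output.decode().strip())

# All containers on this host share one kernel, so many isolated workloads
# run without a full guest operating system per workload.
for c in client.containers.list():
    print(c.short_id, c.image.tags, c.status)
```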

Given the advantages Docker offers, it's easy to understand why it has been so avidly embraced by the vendor and user communities. It makes more efficient use of server resources, encapsulates workloads for easy handoff, and provides better workload portability.

On the other hand, Docker does not solve all application problems. In fact, its benefits expose a significant issue: if it's easier to run and distribute workloads, then efficient creation and management of application workloads is all the more important. And Docker does nothing to ease application creation and management -- it merely does a fantastic job of deploying workloads once they are created.

And application creation and management is where Stackato shines. Its Cloud Foundry-based framework accelerates application development and management by providing easy-to-use code deployment inside Docker containers, as well as predefined, managed application data storage (i.e., databases). Moreover, Stackato makes it easy to grow and shrink the pool of Docker containers within which an application operates.
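
Because Stackato is built on Cloud Foundry, applications follow a couple of platform conventions: the platform supplies the HTTP port through the PORT environment variable and describes bound data services (such as a managed database) through VCAP_SERVICES. The snippet below is an illustrative sketch of such an app, not an excerpt from Stackato's documentation.

```python
# Illustrative sketch of an app written for a Cloud Foundry-style PaaS such
# as Stackato: the platform injects the listening port (PORT) and any bound
# data services (VCAP_SERVICES). Not taken from Stackato's documentation.
import json
import os
from wsgiref.simple_server import make_server

def app(environ, start_response):
    # Bound services (e.g., a managed database) provisioned by the platform.
    services = json.loads(os.environ.get("VCAP_SERVICES", "{}"))
    names = ", ".join(services.keys()) or "none"
    body = ("Bound services: %s\n" % names).encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]

if __name__ == "__main__":
    # The platform tells the app which port to listen on; 8080 is a local default.
    port = int(os.environ.get("PORT", "8080"))
    make_server("0.0.0.0", port, app).serve_forever()
```

Written this way, the app doesn't care where it runs; scaling out then amounts to growing the pool of containers running the same code, as described above.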

For organizations looking to ease application development and deployment while retaining maximum flexibility in deployment location, combining Docker and Stackato is the perfect solution. In fact, ActiveState agrees with this so strongly that it integrates Docker into its Stackato product.

So if you're a company or IT organization looking to address the issue of workload portability, the combination of Docker and Stackato is a good place to start your search.

Source: ActiveState; originally published here.

More Stories By Bernard Golden

Bernard Golden has vast experience working with CIOs to incorporate new IT technologies and meet their business goals. Prior to joining ActiveState, he was Senior Director, Cloud Computing Enterprise Solutions, at Dell Enstratius. Before that, Bernard was CEO of HyperStratus, a Silicon Valley cloud computing consultancy focused on application security, system architecture and design, TCO analysis, and project implementation. He is also the Cloud Computing Advisor for CIO Magazine, and his blog was named a "Top 50 Cloud Computing Blog" by Sys-Con Media. Bernard's writings on cloud computing have been published by The New York Times and the Harvard Business Review, and he is the author of Virtualization for Dummies and Amazon Web Services for Dummies and co-author of Creating the Infrastructure for Cloud Computing. Bernard holds an MBA in Business and Finance from the University of California, Berkeley.
