Is PaaS Dying?

The ‘platform’ tier in the middle of cloud computing’s architecture is being squeezed

The ‘platform’ tier in the middle of cloud computing’s architecture is being squeezed, folded and reshaped beyond recognition. Even with continued investment, can it survive the transformative pressures forcing down upon it from the software/application layer above, or the apparently inexorable upward movement from the infrastructure layer upon which it rests?

To look at recent investments and enthusiastic headlines, it would be easy to assume that Platform as a Service (or PaaS) is on the up. RedHat recently trumpeted the launch of OpenShift Enterprise — a ‘private PaaS,’ whatever that might be. Eagerly tracked super-startup Pivotal pushed PivotalOne out to the world, strengthening the position of the Cloud Foundry PaaS offering upon which it sits. Apprenda, a PaaS that almost predates wider recognition of the term, secured an additional $16 million to continue expanding. And, more tightly integrated into Salesforce’s latest vision for world domination, Heroku continues to attract enthusiasts.

And yet, the role of rich PaaS ‘solutions’ is under increasing pressure. More lightweight approaches such as Docker are attracting attention and, perhaps more importantly, the other layers of the cloud architecture are adding capabilities that look increasingly PaaS-like. The orchestration capabilities of Amazon’s Elastic Beanstalk, for example, mean that many (but by no means all) AWS users no longer need the PaaS tools they might previously have integrated into their toolkit. We’ll keep needing PaaS functionality, but it may not be long before the idea of a separate PaaS solution no longer makes good sense.
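To make the Elastic Beanstalk point concrete, here is a minimal, hypothetical sketch of the PaaS-like workflow an infrastructure provider now exposes directly: register an application, point it at a code bundle, and let the service provision and orchestrate the environment. The application name, S3 bucket, version label and solution stack below are illustrative assumptions rather than details from any real deployment, and the boto3 calls are simply one way of driving the service.

    # Illustrative sketch only: PaaS-style deployment using AWS Elastic Beanstalk
    # through the infrastructure provider's own tooling (boto3).
    # All names below are hypothetical placeholders.
    import boto3

    eb = boto3.client("elasticbeanstalk", region_name="us-east-1")

    # 1. Register the application with the platform layer AWS itself provides.
    eb.create_application(ApplicationName="demo-app")

    # 2. Point Elastic Beanstalk at a code bundle already uploaded to S3.
    eb.create_application_version(
        ApplicationName="demo-app",
        VersionLabel="v1",
        SourceBundle={"S3Bucket": "demo-app-bundles", "S3Key": "demo-app-v1.zip"},
    )

    # 3. Ask the service to provision servers, load balancing and scaling:
    #    the orchestration work a separate PaaS tool used to own.
    eb.create_environment(
        ApplicationName="demo-app",
        EnvironmentName="demo-app-prod",
        VersionLabel="v1",
        SolutionStackName="64bit Amazon Linux 2 v3.3.13 running Python 3.8",  # illustrative
    )

None of that requires a standalone PaaS product; it is the infrastructure layer reaching upwards.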

For many years, some of the most basic explanations of cloud have broken it into three layers:

  • at the top, Applications, Services and Software. The things most people see and interact with. The GMails and Salesforces and Boxes of the world;
  • at the bottom, Infrastructure. The nuts and bolts. The engine room. The servers and routers and networks. To paraphrase former colleague Ray Lester, the stuff;
  • and, in the middle, the Platform. The piece that assembles bits of network and bits of infrastructure and bits of code, and simplifies the process of knitting them all together in order to deliver one of those apps or services. The glue, if you will.
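For a sense of what that middle layer is supposed to spare you, here is a rough, hypothetical sketch of the ‘knitting’ done by hand against the infrastructure layer alone: boot a server, then script the runtime, the code checkout and the process start yourself. The AMI ID, repository URL and bootstrap commands are placeholder assumptions, not a recommended recipe; the glue layer exists precisely so that application teams don’t have to write this themselves.

    # Illustrative sketch only: hand-rolled "glue" against raw infrastructure.
    # The AMI ID and repository URL are hypothetical placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Everything a platform would normally handle, crammed into a boot script:
    # install a runtime, fetch the code, bind it to a port, keep it running.
    bootstrap = """#!/bin/bash
    yum install -y python3 git
    git clone https://example.com/demo-app.git /opt/demo-app
    pip3 install -r /opt/demo-app/requirements.txt
    nohup python3 /opt/demo-app/app.py --port 80 &
    """

    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder AMI
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
        UserData=bootstrap,
    )

Load balancing, scaling, monitoring and redeployment would all still be missing; that remaining surface area is exactly what the platform tier, wherever it ends up living, is meant to own.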

The role of the platform is clear, compelling, and powerful. It should be the fundamental piece, far more important and interesting than a bunch of cheap virtual machines running on commodity hardware. It should be the driving force behind cloud; the reason cloud can continue to transform businesses and business models around the world. It should be all that and more, but PaaS as a category falls far short of this promise.

In early planning for VentureBeat’s second CloudBeat conference, in 2012, Ben Kepes and I argued for PaaS, PaaS vendors and PaaS customers to be given real prominence. We knew that the story of the glue was where this whole industry needed to shift. That’s still true today. The glue remains important, but maybe it’s less clear that we need — or that the market can sustain — glue companies. Instead, those capabilities are increasingly to be found across a range of loosely coupled components, or in the offerings of Applications and Infrastructure providers both above and below the PaaS layer. CenturyLink’s recent acquisition of Tier3 is a clear attempt to address exactly this, moving up from the Infrastructure layer.

I’m far from alone in asking questions about PaaS in its current form. My friend René Büst, for example, argued this week that PaaS is typically used for prototyping work but that it doesn’t permit sufficiently granular control for the most efficient delivery of enterprise-grade applications. Possibly an over-simplification, but it’s still a sentiment that is increasingly repeated. Over at Gigaom, Barb Darrow has been asking the question too, most recently with So… do you really need a PaaS? For now, Barb appears unsure about how to answer her own question… but the comments on her post are pretty conclusively in the affirmative. Matt Asay offers his own take on the Twitter conversation which inspired Barb, writing a more up-beat piece for ReadWrite;

The ‘platform as a service’ market—or PaaS, in which cloud companies provide developers with hardware, OS and software tools and libraries—is starting to heat up. IDC predicts it will hit $14 billion by 2014, and competitors are angling for enterprise wallets.

Matt closes by stressing the importance of solid, sustainable customer adoption; a very different thing from the froth, page views, and jockeying for popularity that seem to underpin much of the conversation today.

Another friend, Ben Kepes, has been tracking the PaaS space closely for several years, and recently commented;

It’s a strange time for PaaS in general. Pivotal One’s flavor of Cloud Foundry seems to be sucking up the vast majority of the mindshare leaving other Cloud Foundry vendors scratching their heads over how to differentiate. At the same time RedHat is trying to achieve some kind of breakout velocity for its own version of PaaS, OpenShift. Stalwarts Heroku (now owned by Salesforce.com) and EngineYard keep turning the PaaS wheel also. Add to that the fact that some of the OpenStack players have decided to create their own PaaS initiative, Solum, and you have a confused and confusing market. Throw the monsters from Seattle, AWS and Microsoft, on top of that and seemingly there is one vendor for every one of the half dozen companies in the world that have actually made a decision to buy PaaS.

From here in sunny (and, for once, it actually is) East Yorkshire, the various PaaS vendors appear hard-pressed to tell a truly compelling story right now. Bits of their product offering resonate with customers, but only really around those functions that are increasingly aped by other providers from beyond the PaaS world.

The broader story, of deep integration and easy orchestration, raises as many red flags as it does smiles of welcome. Is it about simplicity or loss of control, integration or lock-in? At a time when public, private and hybrid cloud implementations are becoming more mainstream, more mission-critical, and more capable, I hear far more concern expressed about relying upon PaaS than I do about relying upon a cloud infrastructure provider or a SaaS product vendor. Which isn’t to say that those cloud builders don’t need PaaS-like capabilities. They do. They’re just (increasingly) looking elsewhere to find them.

And proponents of PaaS are evolving, too, perhaps faster than the companies with which they were once associated. One of my meetings during a trip to San Francisco earlier this month was with Derek Collison. Formerly CTO Cloud Platforms at VMware (and intimately involved in the incubation of Cloud Foundry), Collison is now CEO of Apcera. Barb Darrow commented as Apcera emerged from stealth last month,

The company describes Continuum as an IT platform that ‘blends the delivery models of IaaS, PaaS, and SaaS’ but overlays (underlays?) them all with technology that handles policy. PaaS is great for developers, according to the blog post, but it’s not enough to deliver applications for grown-up companies that must deal not just with technology but with compliance and regulatory rules and regs.

(my emphasis)

Collison talks compellingly about the need to move beyond separate consideration of infrastructure, integration and deployment capabilities, and the application. Instead, he sees a continuum of capabilities with different levels of abstraction suited to meeting the real (and complex) requirements of an enterprise and its hybrid IT estate. Policy, Collison argues, “must be a core part of the DNA” in order to truly meet the business needs of an organisation. It’s early days for Apcera, and it remains to be seen whether this is a truly new take on the space or simply a more enterprise-friendly reframing of the problem.

So, is there a future for today’s PaaS companies? It sometimes seems unlikely that they can keep doing what they’re doing and make enough money to grow sustainably. Will they be rendered irrelevant by the increasing capability of offerings above and below them in the stack? Will new and more integrated offerings such as Apcera’s eat their lunch? Or can they rise, Phoenix-like, from the ashes of current business models to meet a broader set of business requirements? If they do, how recognisable will their new incarnations be?

Image of a Phoenix from the 15th Century Nuremberg Chronicle. Public Domain image shared with Wikimedia Commons.

More Stories By Paul Miller

Paul Miller works at the interface between the worlds of Cloud Computing and the Semantic Web, providing the insights that enable you to exploit the next wave as we approach the World Wide Database.

He blogs at www.cloudofdata.com.
