Is PaaS Dying?

The ‘platform’ tier in the middle of cloud computing’s architecture is being squeezed

The ‘platform’ tier in the middle of cloud computing’s architecture is being squeezed, folded and reshaped beyond recognition. Even with continued investment, can it survive the transformative pressures bearing down upon it from the software/application layer above, or the apparently inexorable upward movement of the infrastructure layer upon which it rests?

To look at recent investments and enthusiastic headlines, it would be easy to assume that Platform as a Service (or PaaS) is on the up. Red Hat recently trumpeted the launch of OpenShift Enterprise — a ‘private PaaS,’ whatever that might be. Eagerly tracked super-startup Pivotal pushed PivotalOne out to the world, strengthening the position of the Cloud Foundry PaaS offering upon which it sits. Apprenda, a PaaS that almost predates wider recognition of the term, secured an additional $16 million to continue expanding. And, now more tightly integrated into Salesforce’s latest vision for world domination, Heroku continues to attract enthusiasts.

And yet, the role of rich PaaS ‘solutions’ is under increasing pressure. More lightweight approaches such as Docker are attracting attention and, perhaps more importantly, the other layers of the cloud architecture are adding capabilities that look increasingly PaaS-like. The orchestration capabilities of Amazon’s Elastic Beanstalk, for example, mean that many (but by no means all) AWS users no longer need the PaaS tools they might previously have integrated into their toolkit. We’ll keep needing PaaS functionality, but it may not be long before the idea of a separate PaaS solution no longer makes good sense.
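To make the ‘lightweight’ point concrete: much of what a developer once leaned on a PaaS for — a managed runtime, dependency installation, a standard way to start the app — can now be expressed in a few lines of container configuration. The sketch below is a minimal, hypothetical Dockerfile for a Python web app; the file names, port, and `app:application` entry point are illustrative assumptions, not anything from a specific vendor’s product.

```dockerfile
# Minimal sketch: package a web app as a portable container image,
# capturing runtime, dependencies, and start command without a PaaS.
# App module, port, and requirements file are hypothetical examples.
FROM python:3.11-slim
WORKDIR /app

# Install declared dependencies first, so this layer caches well
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself
COPY . .

# The container serves HTTP on this (arbitrary) port
EXPOSE 8000

# Start the app with a production WSGI server
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "app:application"]
```

The same image can then be run on a laptop, a bare IaaS virtual machine, or an orchestration service, which is precisely why a separate ‘platform’ product starts to feel optional.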

For many years, some of the most basic explanations of cloud have broken it into three layers:

  • at the top, Applications, Services and Software. The things most people see and interact with. The Gmails and Salesforces and Boxes of the world;
  • at the bottom, Infrastructure. The nuts and bolts. The engine room. The servers and routers and networks. To paraphrase former colleague Ray Lester, the stuff;
  • and, in the middle, the Platform. The piece that assembles bits of network and bits of infrastructure and bits of code, and simplifies the process of knitting them all together in order to deliver one of those apps or services. The glue, if you will.

The role of the platform is clear, compelling, and powerful. It should be the fundamental piece, far more important and interesting than a bunch of cheap virtual machines running on commodity hardware. It should be the driving force behind cloud; the reason cloud can continue to transform businesses and business models around the world. It should be all that and more, but PaaS as a category falls far short of this promise.

In early planning for VentureBeat’s second CloudBeat conference, in 2012, Ben Kepes and I argued for PaaS, PaaS vendors and PaaS customers to be given real prominence. We knew that the story of the glue was where this whole industry needed to shift. That’s still true today. The glue remains important, but maybe it’s less clear that we need — or that the market can sustain — glue companies. Instead, those capabilities are increasingly to be found across a range of loosely coupled components, or in the offerings of Applications and Infrastructure providers both above and below the PaaS layer. CenturyLink’s recent acquisition of Tier 3 is a clear attempt to address exactly this, moving up from the Infrastructure layer.

I’m far from alone in asking questions about PaaS in its current form. My friend René Büst, for example, argued this week that PaaS is typically used for prototyping work but that it doesn’t permit sufficiently granular control for the most efficient delivery of enterprise-grade applications. Possibly an over-simplification, but it’s a sentiment that is increasingly repeated. Over at Gigaom, Barb Darrow has been asking the question too, most recently with So… do you really need a PaaS? For now, Barb appears unsure about how to answer her own question… but the comments on her post are pretty conclusively in the affirmative. Matt Asay offers his own take on the Twitter conversation which inspired Barb, writing a more upbeat piece for ReadWrite:

The ‘platform as a service’ market—or PaaS, in which cloud companies provide developers with hardware, OS and software tools and libraries—is starting to heat up. IDC predicts it will reach $14 billion by 2014, and competitors are angling for enterprise wallets.

Matt closes by stressing the importance of solid, sustainable customer adoption; a very different thing from the froth, page views, and jockeying for popularity that seem to underpin much of the conversation today.

Another friend, Ben Kepes, has been tracking the PaaS space closely for several years, and recently commented:

It’s a strange time for PaaS in general. Pivotal One’s flavor of Cloud Foundry seems to be sucking up the vast majority of the mindshare, leaving other Cloud Foundry vendors scratching their heads over how to differentiate. At the same time RedHat is trying to achieve some kind of breakout velocity for its own version of PaaS, OpenShift. Stalwarts Heroku (now owned by Salesforce.com) and EngineYard keep turning the PaaS wheel also. Add to that the fact that some of the OpenStack players have decided to create their own PaaS initiative, Solum, and you have a confused and confusing market. Throw the monsters from Seattle, AWS and Microsoft, on top of that and seemingly there is one vendor for every one of the half dozen companies in the world that have actually made a decision to buy PaaS.

From here in sunny (and, for once, it actually is) East Yorkshire, the various PaaS vendors appear hard-pressed to tell a truly compelling story right now. Bits of their product offering resonate with customers, but only really around those functions that are increasingly aped by other providers from beyond the PaaS world.

The broader story, of deep integration and easy orchestration, raises as many red flags as it does smiles of welcome. Is it about simplicity or loss of control, integration or lock-in? At a time when public, private and hybrid cloud implementations are becoming more mainstream, more mission-critical, and more capable, I hear far more concern expressed about relying upon PaaS than I do about relying upon a cloud infrastructure provider or a SaaS product vendor. Which isn’t to say that those cloud builders don’t need PaaS-like capabilities. They do. They’re just (increasingly) looking elsewhere to find them.

And proponents of PaaS are evolving, too, perhaps faster than the companies with which they were once associated. One of my meetings during a trip to San Francisco earlier this month was with Derek Collison. Formerly CTO of Cloud Platforms at VMware (and intimately involved in the incubation of Cloud Foundry), Collison is now CEO of Apcera. Barb Darrow commented as Apcera emerged from stealth last month:

The company describes Continuum as an IT platform that ‘blends the delivery models of IaaS, PaaS, and SaaS’ but overlays (underlays?) them all with technology that handles policy. PaaS is great for developers, according to the blog post, but it’s not enough to deliver applications for grown-up companies that must deal not just with technology but with compliance and regulatory rules and regs.

(my emphasis)

Collison talks compellingly about the need to move beyond separate consideration of infrastructure, integration and deployment capabilities, and the application. Instead, he sees a continuum of capabilities with different levels of abstraction suited to meeting the real (and complex) requirements of an enterprise and its hybrid IT estate. Policy, Collison argues, “must be a core part of the DNA” in order to truly meet the business needs of an organisation. It’s early days for Apcera, and it remains to be seen whether this is a truly new take on the space or simply a more enterprise-friendly reframing of the problem.

So, is there a future for today’s PaaS companies? It sometimes seems unlikely that they can keep doing what they’re doing and make enough money to grow sustainably. Will they be rendered irrelevant by the increasing capability of offerings above and below them in the stack? Will new and more integrated offerings such as Apcera’s eat their lunch? Or can they rise, Phoenix-like, from the ashes of current business models to meet a broader set of business requirements? If they do, how recognisable will their new incarnations be?

Image of a Phoenix from the 15th Century Nuremberg Chronicle. Public Domain image shared with Wikimedia Commons.

More Stories By Paul Miller

Paul Miller works at the interface between the worlds of Cloud Computing and the Semantic Web, providing the insights that enable you to exploit the next wave as we approach the World Wide Database.

He blogs at www.cloudofdata.com.
