A Brief History of #Serverless Evolution | @DevOpsSummit #CloudNative #DevOps #Microservices

Monolith to Microservices to Serverless

The need for greater agility and scalability has driven digital transformation along a familiar path: from monolithic to microservices to serverless architecture (FaaS). To keep up with cut-throat competition, organisations need to update their technology stacks and make software development their differentiating factor.

Microservices architecture thus emerged as a way to give development teams greater flexibility, along with other advantages such as the ability to deliver applications at warp speed using infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) environments.

Microservices emerged to break monolithic applications into smaller services, each with its own business logic. With a monolithic architecture, a single faulty service can bring down the entire app server and every service running on it.

Microservices are different: each service runs in its own container, so application architects can develop, manage and scale these services independently.

Microservices can be scaled and deployed separately and written in different programming languages. A key decision many organisations face when deploying a microservices architecture, however, is choosing between IaaS and PaaS environments.
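To make the independence concrete, here is a minimal sketch of a single-purpose microservice using only the Python standard library. The service name, route and data are illustrative assumptions, not anything from a real system:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Each microservice owns its business logic and its own data store.
# This in-memory dict stands in for that store.
ORDERS = {"42": {"id": "42", "status": "shipped"}}

class OrderHandler(BaseHTTPRequestHandler):
    """Handles one concern only: GET /orders/<id>."""

    def do_GET(self):
        order_id = self.path.rstrip("/").split("/")[-1]
        order = ORDERS.get(order_id)
        if order is None:
            self.send_response(404)
            self.end_headers()
            return
        body = json.dumps(order).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run the service standalone, in its own process or container:
#   HTTPServer(("0.0.0.0", 8080), OrderHandler).serve_forever()
```

Because the service exposes only its own endpoint and owns its own data, it can be deployed, replaced or scaled without touching any of its neighbours.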

Microservices involve source code management, a build server, a code repository, an image repository, a cluster manager, a container scheduler, dynamic service discovery, a software load balancer and a cloud load balancer. Beyond that, they also need a mature agile and DevOps team to support continuous delivery.

Enter Serverless

Serverless architecture takes this a step further by making an application granular down to the level of functions and events. The unit of work is clearly getting smaller and smaller: we've gone from monoliths to microservices to functions. FaaS also addresses the shortcomings of the PaaS model, namely scaling and the friction between development and operations.
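A function as the unit of work can be sketched in a few lines. The example below uses the AWS Lambda-style handler signature (`event`, `context`); the event shape itself is an illustrative assumption, since each real trigger (HTTP gateway, queue, timer) defines its own:

```python
import json

def handler(event, context):
    # The unit of deployment is this single function: the platform
    # invokes it per event, scales it per request and bills per call.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

There is no server, container or process for the developer to manage; the platform creates an execution environment on demand and tears it down when the function is idle.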

It is quite challenging to scale microservices hosted on PaaS. The architecture may have elements written in different programming languages, deployed across multiple clouds and on-premise locations, and running in multiple containers.

When demand for the app increases, all the underlying components have to be coordinated to scale, or you have to be able to identify which individual elements need to scale to absorb the surge. Even if you set up your PaaS applications to auto-scale, you won't be doing so at the level of individual requests unless you know the traffic pattern. This is what makes a FaaS application far more cost-efficient.
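A back-of-the-envelope calculation shows why per-request billing can win for spiky traffic. All prices and workload numbers below are illustrative assumptions, not real cloud rates:

```python
# Assumed prices (illustrative only, not actual cloud pricing):
ALWAYS_ON_HOURLY = 0.05                # one always-on instance, per hour
FAAS_PRICE_PER_INVOCATION = 0.0000002  # flat per-call charge
FAAS_PRICE_PER_GB_SECOND = 0.0000166   # compute charge per GB-second

def monthly_cost_always_on(instances: int) -> float:
    """Instances billed for every hour of a 30-day month, idle or not."""
    return instances * ALWAYS_ON_HOURLY * 24 * 30

def monthly_cost_faas(invocations: int, mem_gb: float, secs: float) -> float:
    """Functions billed only for the requests actually served."""
    return (invocations * FAAS_PRICE_PER_INVOCATION
            + invocations * mem_gb * secs * FAAS_PRICE_PER_GB_SECOND)

# Example workload: 1M requests/month, 128 MB functions running 200 ms
# each, versus two small instances kept running to absorb the peaks.
faas = monthly_cost_faas(1_000_000, 0.125, 0.2)
fixed = monthly_cost_always_on(2)
```

Under these assumptions the per-invocation bill is well under a dollar, while the always-on pair costs the same whether the traffic arrives or not; the gap widens the spikier the load.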

However, there is room for FaaS and microservices to co-exist, as there are certain things you simply cannot do with functions. For example, an API microservice will always be able to respond faster, since it can keep connections to databases and other resources open and ready.

One more thing worth considering: by grouping a bundle of functions together behind an API gateway, you've created a microservice. The high-level flow remains the same as in the traditional approach. The key difference is that, in the case of a function, the container is created and destroyed by the FaaS platform's algorithms, and the operations team has no control over it.
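The idea of functions-behind-a-gateway can be sketched as a routing table. The in-process "gateway" below is a hypothetical stand-in for real gateway configuration, and the route names and handlers are illustrative:

```python
# Two independent functions, each a deployable unit on a FaaS platform.
def list_users(event):
    return {"status": 200, "body": ["alice", "bob"]}

def get_health(event):
    return {"status": 200, "body": "ok"}

# The gateway's only job is mapping (method, path) to a function. On a
# real platform this mapping lives in gateway configuration, and each
# function's container is created and destroyed by the platform itself.
ROUTES = {
    ("GET", "/users"): list_users,
    ("GET", "/health"): get_health,
}

def gateway(method: str, path: str, event=None):
    fn = ROUTES.get((method, path))
    if fn is None:
        return {"status": 404, "body": "not found"}
    return fn(event or {})
```

Taken together, the routes form one logical microservice, even though each function is invoked, scaled and billed separately.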

Conclusion

Beyond the added agility and scalability, serverless technologies have much more to offer. The best one can do is keep evaluating the available options and make good use of these new approaches.

More Stories By Jignesh Solanki

I lead Simform's Product Engineering team through DevOps institutionalization, end-to-end product development and consulting-led transformation programs. I'd love to talk about cloud computing, mobility, security, Swift and anything in between.
