Virtualization in Definition-Driven API Development | #CloudExpo #API #Cloud #Virtualization

By Ryan Pinkham

Agile development is highly iterative and relies on the rapid development of software and early feedback. These practices are well understood and widely practiced in mature application development teams.

The shift to an agile software development process has helped teams accelerate time-to-market, improve quality, and reduce costs. With the new technologies and processes now available, these same benefits can be delivered to API development teams.

API definition on Agile teams
API definition formats, like the OpenAPI Specification (formerly the Swagger Specification), give API developers the ability to write a language-agnostic definition for their REST APIs. An API definition can be thought of as a "contract" between the person or organization developing the API and the consumers that integrate with it. A properly defined API helps eliminate the guesswork that consumers often deal with when calling an API.
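For illustration, here is a minimal sketch of what such a definition might look like in OpenAPI 3.0. The /users/{id} endpoint and its schema are invented for this example, not taken from any particular API:

```yaml
openapi: 3.0.0
info:
  title: Users API (example)
  version: 1.0.0
paths:
  /users/{id}:
    get:
      summary: Fetch a single user
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: integer
      responses:
        '200':
          description: The requested user
          content:
            application/json:
              schema:
                type: object
                properties:
                  id:
                    type: integer
                  name:
                    type: string
        '404':
          description: User not found
```

A consumer reading this contract knows exactly which parameter to send and which fields to expect back, before any server code exists.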

Defining your API with a formal API specification supports a contract-first approach to API development, which focuses on designing the interface of your API first and letting tooling generate documentation, code, and SDKs from the definition. In this approach, virtualization is key.

The importance of virtualization
Defining an API should be an iterative process. A developer will make tweaks based on how the API will actually behave when a client interacts with it. As a developer, the best way to truly understand how your API will behave, from an end-user perspective, is to create a fake version (or mock) of your API.
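As a rough sketch of the idea, the hypothetical /users/{id} contract above could be mocked with a few lines of Python. Flask is used here as a generic stand-in for a mock server, not as any specific virtualization product:

```python
# A minimal mock ("virtualized" version) of the example API above.
# The route and payloads mirror the hypothetical /users/{id} definition.
from flask import Flask, jsonify

app = Flask(__name__)

# Canned response data standing in for a real backend.
FAKE_USERS = {
    1: {"id": 1, "name": "Ada Lovelace"},
    2: {"id": 2, "name": "Grace Hopper"},
}

@app.route("/users/<int:user_id>")
def get_user(user_id):
    user = FAKE_USERS.get(user_id)
    if user is None:
        # Mirror the 404 response described in the definition.
        return jsonify({"message": "User not found"}), 404
    return jsonify(user)

if __name__ == "__main__":
    app.run(port=8080)
```

A client can now call the mock exactly as it would call the real API, which makes it possible to gather feedback on the design while the implementation is still a blank page.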

Integrating a virtualization solution into your API development workflow enables you to effectively preview how your API will behave in a given situation, solicit rapid feedback, and validate design decisions.

What does this look like in action?
Combining API definition with API virtualization allows development teams to rapidly specify, prototype, and even test their projects, all before writing any implementation code, as the sketch below shows. Putting this strategy into action requires having the right tools in place.
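For example, with the Flask mock above running locally, a consumer-style test can validate the contract before a single line of backend code is written. The URL and field names here follow the hypothetical definition from earlier:

```python
# Quick contract checks against the running mock (assumes the mock
# above is listening on localhost:8080). Run with pytest.
import requests

def test_get_user_matches_contract():
    resp = requests.get("http://localhost:8080/users/1")
    assert resp.status_code == 200
    body = resp.json()
    # Fields and types promised by the API definition.
    assert isinstance(body["id"], int)
    assert isinstance(body["name"], str)

def test_missing_user_returns_404():
    resp = requests.get("http://localhost:8080/users/999")
    assert resp.status_code == 404
```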


More Stories By SmartBear Blog

As the leader in software quality tools for the connected world, SmartBear supports more than two million software professionals and over 25,000 organizations in 90 countries that use its products to build and deliver the world’s greatest applications. With today’s applications deploying on mobile, Web, desktop, Internet of Things (IoT) or even embedded computing platforms, the connected nature of these applications through public and private APIs presents a unique set of challenges for developers, testers and operations teams. SmartBear's software quality tools assist with code review, functional and load testing, API readiness as well as performance monitoring of these modern applications.
