

Virtualization in Definition-Driven API Development
By Ryan Pinkham

Agile development is highly iterative, and relies upon the rapid development of software and early feedback. These practices are well understood and practiced in mature application development teams.

The shift to an agile software development process has helped teams accelerate time to market, improve quality and reduce costs. With new technologies and processes that have hit the market, these same benefits can now be delivered to API development teams.

API definition on Agile teams
API definition formats like the OpenAPI Specification (formerly the Swagger Specification) give API developers the ability to write a language-agnostic definition for their REST APIs. An API definition can be thought of as a "contract" between the person or organization developing the API and the consumers that integrate with it. A properly defined API helps eliminate the guesswork that consumers often deal with when calling an API.
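To make the "contract" idea concrete, here is a minimal sketch of what such a definition looks like. The pet-store endpoint, field names, and versions are hypothetical illustrations, not from the original post; in practice the definition would live in a YAML or JSON file, but it is expressed here as a Python dict so the structure is easy to inspect.

```python
# A minimal, hypothetical OpenAPI 3.0-style definition for a single endpoint.
# Consumers can read this "contract" before any server code exists.
pet_api_spec = {
    "openapi": "3.0.0",
    "info": {"title": "Pet Store (example)", "version": "1.0.0"},
    "paths": {
        "/pets": {
            "get": {
                "summary": "List all pets",
                "responses": {
                    "200": {
                        "description": "An array of pets",
                        "content": {
                            "application/json": {
                                "schema": {
                                    "type": "array",
                                    "items": {
                                        "type": "object",
                                        "properties": {
                                            "id": {"type": "integer"},
                                            "name": {"type": "string"},
                                        },
                                        "required": ["id", "name"],
                                    },
                                }
                            }
                        },
                    }
                },
            }
        }
    },
}

# Walk the contract: which operations exist, and what can they return?
for path, operations in pet_api_spec["paths"].items():
    for verb, operation in operations.items():
        print(verb.upper(), path, "->", list(operation["responses"]))
```

Because the definition is machine-readable, the same structure that documents the API for humans can drive code generators, documentation renderers, and mock servers.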

Defining your API with a formal API specification supports a contract-first approach to API development, which focuses on designing the interface of your API and letting tooling generate documentation, code, and SDKs. In this approach, virtualization is key.

The importance of virtualization
Defining an API should be an iterative process. A developer will make tweaks based on how the API will actually behave when a client interacts with it. As a developer, the best way to truly understand how your API will behave, from an end-user perspective, is to create a fake version (or mock) of your API.

Integrating a virtualization solution into your API development workflow enables you to effectively preview how your API will behave in a given situation, solicit rapid feedback, and validate design decisions.
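A sketch of the idea, using only the Python standard library: the mock below serves canned example data for a hypothetical GET /pets endpoint, so consumers can exercise the contract before any real implementation exists. Dedicated virtualization tools add features like stateful behavior and latency simulation, but the core mechanism is this simple.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Canned response data standing in for the real backend (hypothetical example).
CANNED_PETS = [{"id": 1, "name": "Rex"}, {"id": 2, "name": "Mittens"}]

class MockPetAPI(BaseHTTPRequestHandler):
    """A fake version (mock) of the API: fixed responses, no business logic."""

    def do_GET(self):
        if self.path == "/pets":
            body = json.dumps(CANNED_PETS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Bind to port 0 so the OS picks a free port, then serve in the background.
server = HTTPServer(("127.0.0.1", 0), MockPetAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A consumer "previews" the API exactly as it would call the real thing.
url = f"http://127.0.0.1:{server.server_port}/pets"
with urllib.request.urlopen(url) as resp:
    pets = json.loads(resp.read())
server.shutdown()

print(pets)  # -> [{'id': 1, 'name': 'Rex'}, {'id': 2, 'name': 'Mittens'}]
```

If a consumer finds the response shape awkward at this stage, the definition can be revised cheaply, long before any server code has to change.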

What does this look like in action?
Combining API definition with API virtualization allows development teams to rapidly specify, prototype, and even test their APIs, all before writing any code. Putting this strategy into action requires having the right tools in place.


More Stories By SmartBear Blog

As the leader in software quality tools for the connected world, SmartBear supports more than two million software professionals and over 25,000 organizations in 90 countries that use its products to build and deliver the world’s greatest applications. With today’s applications deploying on mobile, Web, desktop, Internet of Things (IoT) or even embedded computing platforms, the connected nature of these applications through public and private APIs presents a unique set of challenges for developers, testers and operations teams. SmartBear's software quality tools assist with code review, functional and load testing, API readiness as well as performance monitoring of these modern applications.
