Continuous Testing, Service Virtualization... and Beer Tasting | @DevOpsSummit #DevOps

A "DevHops" Podcast

Parasoft's business partner Skytap recently invited Wayne Ariola (Parasoft Chief Strategy Officer) to be a guest on their DevHops podcast. With Skytap's Noel Wurst moderating, Wayne and Skytap's Jason English chatted about continuous testing, service virtualization, and how SDLC acceleration is impacting quality, all while sipping and reviewing three beers of their choice.

Listen to the 30-minute DevHops podcast to hear about:

  • How quality and speed are no longer in a "host/parasite" relationship
  • What's being overlooked in the user-story focused testing common with Agile
  • What the business really gets out of continuous testing
  • The myth that continuous testing = more testing or more automation
  • How the demand for SDLC acceleration is impacting quality
  • How to convince teams to take the "leap of faith" needed to trust simulations
  • The beer reviews: Abita's Wrought Iron IPA, Beck's Beer, and Georgetown Brewing's Manny's Pale Ale

Be sure to visit Skytap's blog if you'd like a complete transcript of this week's show, or if you'd like to check out previous DevHops episodes, such as "Will Virtualization Beat Physical Reality," "Tales from the Journey to DevOps," or "How to Test for Enterprise Mobility."

Continuous Testing Book
Want to learn how to establish a continuous testing process that helps you accelerate delivery while minimizing business risk? Read Parasoft's 44-page Continuous Testing eBook today to learn how to get started. Print copies are available at Amazon.

From Alan Zeichick, SD Times
"Ariola and Dunlop nail the target: It's all about risk. That's what insurance is all about, that's what attorneys are all about, that's the sort of decision that every business and technology manager makes all day, every day. We have to live with risk and make tradeoffs. More testing? At some point, indeed, we have to cut it off.

It's difficult, if not impossible, to assess the business risk of software quality. Yes, software quality is expensive. The higher the quality, the more time it takes to deliver software, and the greater the resources you must spend on software quality. And yes, it is expensive to have software failures: you might lose money, lose customers, suffer lawsuits, damage your brand, end up on the front page of The Wall Street Journal. Not good...

Ariola and Dunlop make a good point in their short book: We mustn't accept that the trend toward accelerating the development process will magically improve software quality; indeed, we should expect the opposite. And if we are going to mitigate risk in today's environment, we need to reengineer the software development process in a way that considers business risk to be one of the metrics, along with the other traditional results of our automated testing and Continuous Integration systems."

More Stories By Cynthia Dunlop

Cynthia Dunlop, Lead Content Strategist/Writer at Tricentis, writes about software testing and the SDLC, specializing in continuous testing, functional/API testing, DevOps, Agile, and service virtualization. She has written articles for publications including SD Times, Stickyminds, InfoQ, ComputerWorld, IEEE Computer, and Dr. Dobb's Journal. She has also co-authored and ghostwritten several books on software development and testing for Wiley and Wiley-IEEE Press. Dunlop holds a BA from UCLA and an MA from Washington State University.

@DevOpsSummit Stories
Here to help unpack insights into the new era of using containers to gain ease with multi-cloud deployments are our panelists: Matt Baldwin, Founder and CEO at StackPointCloud, based in Seattle; Nic Jackson, Developer Advocate at HashiCorp, based in San Francisco, and Reynold Harbin, Director of Product Marketing at DigitalOcean, based in New York. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.
Atmosera delivers modern cloud services that maximize the advantages of cloud-based infrastructures. Offering private, hybrid, and public cloud solutions, Atmosera works closely with customers to engineer, deploy, and operate cloud architectures with advanced services that deliver strategic business outcomes. Atmosera's expertise simplifies the process of cloud transformation and our 20+ years of experience managing complex IT environments provides our customers with the confidence and trust that they are being taken care of.
Today most companies are adopting or evaluating container technology - Docker in particular - to speed up application deployment, drive down cost, ease management, and make application delivery more flexible overall. As with most new architectures, this dream takes significant work to become a reality. Even when you do get your application componentized enough and packaged properly, there are still challenges for DevOps teams in making the shift to continuous delivery and achieving that reduction in cost and increase in speed. Sometimes, in order to reduce complexity, teams compromise features or change requirements.
GCP Marketplace is based on a multi-cloud and hybrid-first philosophy, focused on giving Google Cloud partners and enterprise customers flexibility without lock-in. It also helps customers innovate by easily adopting new technologies from ISV partners, such as commercial Kubernetes applications, and allows companies to oversee the full lifecycle of a solution, from discovery through management.
Skeuomorphism usually means retaining existing design cues in something new that doesn’t actually need them. However, the concept of skeuomorphism can be thought of as relating more broadly to applying existing patterns to new technologies that, in fact, cry out for new approaches. In his session at DevOps Summit, Gordon Haff, Senior Cloud Strategy Marketing and Evangelism Manager at Red Hat, discussed why containers should be paired with new architectural practices such as microservices rather than mimicking legacy server virtualization workflows and architectures.