Four Ways to Get End Users Involved in Performance Testing | @DevOpsSummit #DevOps

In user acceptance testing, software is tested in the real world by actual human beings

"You've got to start with the customer experience and work backwards to the technology." -Steve Jobs

"Websites that are hard to use frustrate customers, forfeit revenue and erode brands." -Forrester Research

"The only way to find out if a site works [with users] is to test it." -Steve Krug

We make web and mobile apps so that we can interact with our customers and users. So it's no wonder that one of the most important things you can do when you are building your site is to actually test it out with those users and make sure it works well.

We call this User Acceptance Testing (UAT) or Beta Testing, and it's long been a key stage in the waterfall software development process, typically happening as a final check right before you release the product. The agile methodology also teaches us to include users in the development process, although it brings them to the table at nearly every stage rather than only at the end. You could say that, in an agile process, users are considered partners in the creation of the app.

But UAT is often focused on functional usability: Does this button make sense here? Do you know where to click next? Is the workflow clear or frustrating? That kind of thing. So here's our question:

Have you ever considered including users in performance testing?

It may not be the most obvious thing to think about, but the benefits are really interesting. There is nothing quite like the feedback that a real user provides, even for performance. Here's how to do it.

User Acceptance Testing: Purpose and Objective
There are three main goals of user acceptance testing:

  1. Ensure the application meets the needs of users, thereby reducing development and support costs after the launch.
  2. Spot problems missed by automated testing tools.
  3. Make sure programs support day-to-day usage.

Each of these goals matters just as much to the performance of the app as to the functional usability of the app.

In user acceptance testing, software is tested in the real world by actual human beings as opposed to tools that make simulated users. This type of testing can be done by a dedicated UAT team, internal team members in other departments, or by the public. It's often wise to include all three groups in UAT, perhaps expanding your circle of testers as you get more experience with the app.

Applying the same approach to your performance test plan isn't necessarily very different from what you're already doing with user testing, but you do want to think about what you're asking of those users from a performance perspective, and how best to integrate them into the testing process. Here are some ways to do that.

Method 1: Have a Public Beta
Any web user is familiar with the concept of a public beta. Basically, you release software with a caveat: it may not work well, and it may be buggy. There will be general support for users, but the implicit understanding is that you use it at your own risk.

Companies like Apple and Google are no strangers to applying UAT to performance testing plans. They have dedicated UAT teams and release beta versions of their software to the public. They provide resources to make it easy for users to report problems via help tickets, community forums, or even phone calls and live chat. Then they incorporate standard performance testing and monitoring processes into the operation of that public beta.

Depending on your software and your users, people might be concerned about running beta-quality software, so you may need to offer incentives to entice participation. For example, Microsoft has allowed beta testers to purchase the completed version of its operating system at a significantly reduced cost.

Public betas are obviously a big deal and involve the whole department or company. But if you are doing it anyway, it can serve as a perfect platform to deploy all your load testing and performance monitoring infrastructure in the context of live users.

Method 2: Hold a Panel or Private Beta
This method is much more manageable and can be executed within the context of a performance team without even involving the entire development group. Select a small number of people. Next, ask them to be involved in a private beta or a product panel. Stand up your pre-release software and periodically give them tasks to do that lead them through a directed experience.

With real users accessing the app, you can easily ask for feedback. You may choose to gather feedback in a general way (How did the experience feel to you?) or you can make questions specific (How long did it take you to complete the checkout process?). Keep people involved at several points in the process, and give them small thank you gifts or discounts on the product in return for their help.

A private beta can be conducted at much less expense, in a shorter timeframe, and with far fewer interdependencies than a public beta. It's a great option when you know exactly what feedback you are looking for and when you have access to a set of customers who are excited to help you out.

Method 3: Pop up a Survey
If you have lots of users coming through an existing app already, you can run a performance test of new software on your public servers by directing a small number of those users to a sandboxed version of your pre-release software. When they first enter the site, inform them that they are being directed to a newer version of the software, and you'd like feedback from them about their experience with it. Then, at some point during their visit, pop up a quick survey and ask them to rate how the visit is going.
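Directing a small, consistent slice of traffic to the sandboxed version is easy to get wrong if users bounce between versions on each request. One common approach (a minimal sketch, not something prescribed by the article) is deterministic bucketing: hash each user's ID so the same user always lands on the same version of the site. The 5% threshold and the user-ID format below are illustrative assumptions.

```python
import hashlib

BETA_PERCENT = 5  # illustrative: send roughly 5% of users to the sandboxed beta

def bucket(user_id: str) -> int:
    """Deterministically map a user ID to a bucket in [0, 100)."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % 100

def routes_to_beta(user_id: str) -> bool:
    """Same user, same answer every time, so their experience stays consistent."""
    return bucket(user_id) < BETA_PERCENT

# Rough check over a synthetic population of users
count = sum(routes_to_beta(f"user-{n}") for n in range(10_000))
print(f"{count} of 10000 users routed to beta")
```

Because the assignment is a pure function of the user ID, no session state or cookie is strictly required to keep a returning user on the same version.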

Don't overload them with questions - in fact, sometimes a single question asking them to rate their experience on a scale of 0-5 is enough. You may also want to ask people on the regular, production version of the site to answer the same question, so you can compare results.

This is a great method for gathering quantitative user feedback, so you can evaluate users' perceptions of the performance of the site using measurable data.
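As a sketch of what that comparison might look like, here's one way to summarize the 0-5 ratings from the beta cohort against the production cohort. The ratings below are hypothetical sample data, not real results.

```python
from statistics import mean, stdev

# Hypothetical 0-5 survey ratings collected from each version of the site
production_ratings = [4, 5, 3, 4, 4, 5, 4, 3, 5, 4]
beta_ratings       = [3, 4, 4, 2, 3, 4, 3, 3, 4, 3]

def summarize(label: str, ratings: list[int]) -> None:
    """Print sample size, mean rating, and spread for one cohort."""
    print(f"{label}: n={len(ratings)}, "
          f"mean={mean(ratings):.2f}, stdev={stdev(ratings):.2f}")

summarize("production", production_ratings)
summarize("beta", beta_ratings)
```

With real traffic you'd want far larger samples before reading much into a gap between the two means, but even this simple summary turns "how does the new version feel?" into a number you can track from build to build.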

Method 4: Monitor Behavior and Metrics
Ready to secretly involve your users? Try this method. Deploy a version of your product and direct some portion of users to it. Next, compare various performance-related attributes of this population with the baseline. No surveys, no pop-ups, no user interaction whatsoever. Just see if behaviors or results improve with your performance enhancements.

Of course, this technique works best when you can focus on directly comparable tasks. For example, you could focus on the checkout process to discover if the new version results in a better outcome than the old version. The data gathered tells you if your changes were an improvement or not.
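To illustrate, a comparison of the checkout process across the two versions might boil down to a couple of numbers per cohort: how many users who started checkout completed it, and how long the completed checkouts took. All figures below are hypothetical.

```python
from statistics import median

# Hypothetical checkout durations (seconds) for completed sessions, per cohort
baseline_times = [42.0, 38.5, 51.2, 47.0, 44.8, 39.9]
beta_times     = [31.4, 29.8, 35.0, 33.1, 30.2, 36.7]

# Hypothetical funnel counts: (started checkout, completed checkout)
baseline_funnel = (1000, 620)
beta_funnel     = (1000, 684)

def completion_rate(funnel: tuple[int, int]) -> float:
    """Fraction of users who started checkout and finished it."""
    started, completed = funnel
    return completed / started

print(f"baseline: {completion_rate(baseline_funnel):.1%} completed, "
      f"median checkout {median(baseline_times):.1f}s")
print(f"beta:     {completion_rate(beta_funnel):.1%} completed, "
      f"median checkout {median(beta_times):.1f}s")
```

If the beta cohort completes checkout more often and faster than the baseline cohort, that's a strong signal your performance work paid off, with no surveys required.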

Put UAT First, or Come in Last
While automated tools definitely have their benefits and contribute much to the performance testing process, there's nothing more valuable than the experience and feedback of real users. You'll be able to deal with problems early on and save time; if you've ever tried to fix problems after a program is completely developed, you know what a nightmare it can be. And even though user acceptance testing involves only a sample of your users, its findings can easily be extrapolated to improve the experience for all of them. Remember, the greats recognize UAT as a priority. Why not join them?

Photo: Pixabay

More Stories By Tim Hinds

Tim Hinds is the Product Marketing Manager for NeoLoad at Neotys. He has a background in Agile software development, Scrum, Kanban, Continuous Integration, Continuous Delivery, and Continuous Testing practices.

Previously, Tim was Product Marketing Manager at AccuRev, a company acquired by Micro Focus, where he worked with software configuration management, issue tracking, Agile project management, continuous integration, workflow automation, and distributed version control systems.
