Four Ways to Get End Users Involved in Performance Testing | @DevOpsSummit #DevOps

In user acceptance testing, software is tested in the real world by actual human beings

"You've got to start with the customer experience and work backwards to the technology." -Steve Jobs

"Websites that are hard to use frustrate customers, forfeit revenue and erode brands." -Forrester Research

"The only way to find out if a site works [with users] is to test it." -Steve Krug

We build web and mobile apps so that we can interact with our customers and users. It's no wonder, then, that one of the most important things you can do when building your site is to actually test it with those users and make sure it works well.

We call this User Acceptance Testing (UAT) or Beta Testing, and it has long been a key stage in the waterfall software development process, typically happening as a final check right before you release the product. The agile methodology also teaches us to include users, though it goes further, bringing them to the table at nearly every stage of development. You could say that, in an agile process, users are partners in the creation of the app.

But UAT is often focused on functional usability: Does this button make sense here? Do you know where to click next? Is the workflow clear or frustrating? That kind of thing. So here's our question:

Have you ever considered including users in performance testing?

It may not be the most obvious thing to think about, but the benefits are really interesting. There is nothing quite like the feedback that a real user provides, even for performance. Here's how to do it.

User Acceptance Testing: Purpose and Objective
There are three main goals to user acceptance testing:

  1. Ensure the application meets the needs of users, thereby reducing development and support costs after the launch.
  2. Spot problems missed by automated testing tools.
  3. Make sure the application supports day-to-day usage.

Each of these goals matters just as much to the performance of the app as to the functional usability of the app.

In user acceptance testing, software is tested in the real world by actual human beings, as opposed to tools that simulate users. This type of testing can be done by a dedicated UAT team, by internal team members in other departments, or by the public. It's often wise to include all three groups in UAT, perhaps expanding your circle of testers as you gain more experience with the app.

Applying the same idea to your performance test plan isn't necessarily very different from what you're already doing with user testing, but you do want to think about what you're asking of those users from a performance perspective, and how best to integrate them into the testing process. Here are some ways to do that.

Method 1: Have a Public Beta
Any web user is familiar with the concept of a public beta. Basically, you release software with a caveat: it may not work well, and it may be buggy. There is general support for users, but the implicit understanding is that they use it at their own risk.

Companies like Apple and Google are no strangers to applying UAT to their performance testing plans. They have dedicated UAT teams and release beta versions of their software to the public. They provide resources that make it easy for users to report problems via help tickets, community forums, or even phone calls and live chat. Then they incorporate standard performance testing and monitoring processes into the operation of that public beta.

Depending on your software and your users, people might be concerned about running beta-quality software, so you may need to offer incentives to entice participation. For example, Microsoft has allowed its beta testers to purchase the completed version of its operating system at a significantly reduced cost as a perk for participating.

Public betas are obviously a big undertaking and involve the whole department or company. But if you're running one anyway, it can serve as the perfect platform for deploying all of your load testing and performance monitoring infrastructure in the context of live users.
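
If it helps to picture that, here is a minimal sketch in TypeScript of a lightweight response-time probe you could point at a public beta while real users are on it. It assumes Node.js 18 or later (for the built-in fetch), and the URL and thresholds are purely hypothetical; in practice you would run something like this alongside, not instead of, a proper load testing and monitoring setup.

  // Minimal response-time probe for a public beta environment (TypeScript, Node 18+).
  // The URL and thresholds below are illustrative assumptions, not real endpoints.
  const BETA_URL = "https://beta.example.com/";
  const INTERVAL_MS = 60_000;          // probe once per minute
  const SLOW_THRESHOLD_MS = 2_000;     // flag responses slower than two seconds

  async function probe(): Promise<void> {
    const start = Date.now();
    try {
      const res = await fetch(BETA_URL, { redirect: "follow" });
      const elapsed = Date.now() - start;
      console.log(`${new Date().toISOString()} status=${res.status} time=${elapsed}ms`);
      if (elapsed > SLOW_THRESHOLD_MS) {
        console.warn(`Beta response exceeded ${SLOW_THRESHOLD_MS}ms (${elapsed}ms)`);
      }
    } catch (err) {
      console.error(`${new Date().toISOString()} probe failed:`, err);
    }
  }

  void probe();                        // fire the first probe immediately
  setInterval(probe, INTERVAL_MS);     // then keep probing on an interval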

Method 2: Hold a Panel or Private Beta
This method is much more manageable and can be executed by a performance team without involving the entire development group. Select a small group of people and ask them to take part in a private beta or a product panel. Stand up your pre-release software and periodically give them tasks that lead them through a directed experience.

With real users accessing the app, you can easily ask for feedback. You may choose to gather feedback in a general way (How did the experience feel to you?) or make your questions specific (How long did it take you to complete the checkout process?). Keep people involved at several points in the process, and offer small thank-you gifts or discounts on the product in return for their help.

A private beta can be conducted at much less expense, in a shorter timeframe, and with far fewer interdependencies than a public beta. It's a great option when you know exactly what feedback you are looking for and when you have access to a set of customers who are excited to help you out.

Method 3: Pop up a Survey
If you already have lots of users coming through an existing app, you can run a performance test of new software on your public servers by directing a small number of those users to a sandboxed version of your pre-release software. When they first enter the site, inform them that they are being directed to a newer version of the software and that you'd like feedback about their experience with it. Then, at some point during their visit, pop up a quick survey and ask them to rate how the visit is going.

Don't overload them with questions. In fact, sometimes a single question asking them to rate their experience on a scale of 0 to 5 is enough. You may also want to ask people on the regular, production version of the site to answer the same question, so you can compare results.

This is a great method for gathering quantitative user feedback, so you can evaluate users' perceptions of the performance of the site using measurable data.
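
As a rough illustration, here is what the client-side plumbing for this method might look like in TypeScript. Everything specific in it is an assumption for the sake of the example: the cohort cookie, the beta hostname, and the /api/survey endpoint are hypothetical, and a real implementation would more likely split traffic at the load balancer or CDN.

  // Illustrative sketch: route a small share of visitors to the sandboxed beta
  // and ask a single 0-5 rating question partway through the visit.
  // Cookie name, beta host, and /api/survey endpoint are hypothetical.
  const BETA_SHARE = 0.05; // roughly 5% of visitors see the pre-release version

  function inBetaCohort(): boolean {
    // Persist the assignment so a visitor stays in the same cohort across pages.
    const existing = document.cookie.match(/(?:^|; )betaCohort=(\w+)/);
    if (existing) return existing[1] === "beta";
    const cohort = Math.random() < BETA_SHARE ? "beta" : "prod";
    document.cookie = `betaCohort=${cohort}; path=/; max-age=${60 * 60 * 24 * 30}`;
    return cohort === "beta";
  }

  if (inBetaCohort() && !location.hostname.startsWith("beta.")) {
    // Tell the visitor up front that they are being sent to the newer version.
    alert("We're routing you to a preview of our new site and would love your feedback.");
    location.href = `https://beta.example.com${location.pathname}`;
  }

  function askForRating(): void {
    const answer = window.prompt("How would you rate your experience so far? (0-5)");
    if (answer === null || answer.trim() === "") return;
    const rating = Number(answer);
    if (Number.isInteger(rating) && rating >= 0 && rating <= 5) {
      void fetch("/api/survey", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        // Production visitors answer the same question, so results can be compared.
        body: JSON.stringify({ rating, cohort: inBetaCohort() ? "beta" : "prod" }),
      });
    }
  }

  // Ask partway through the visit rather than the moment the page loads.
  setTimeout(askForRating, 90_000);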

Method 4: Monitor Behavior and Metrics
Ready to secretly involve your users? Try this method. Deploy a version of your product and direct some portion of users to it. Next, compare various performance-related attributes of this population with the baseline. No surveys, no pop-ups, no user interaction whatsoever. Just see if behaviors or results improve with your performance enhancements.

Of course, this technique works best when you can focus on directly comparable tasks. For example, you could focus on the checkout process to discover whether the new version produces a better outcome than the old one. The data you gather tells you whether your changes were an improvement.
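
A small TypeScript sketch of the instrumentation side, again with hypothetical names: the checkout hooks, the cohort cookie, and the /api/metrics endpoint are assumptions, and most teams would lean on their existing analytics or monitoring product rather than hand-rolling this.

  // Illustrative sketch: time the checkout flow per cohort so the new version
  // can be compared against the baseline. Names and endpoints are hypothetical.
  let checkoutStart: number | null = null;

  export function onCheckoutStarted(): void {
    checkoutStart = performance.now();
  }

  export function onCheckoutCompleted(): void {
    if (checkoutStart === null) return;
    const durationMs = Math.round(performance.now() - checkoutStart);
    const cohort = document.cookie.includes("betaCohort=beta") ? "beta" : "prod";
    // sendBeacon survives the navigation that usually follows a completed checkout.
    navigator.sendBeacon(
      "/api/metrics",
      JSON.stringify({ event: "checkout", cohort, durationMs })
    );
    checkoutStart = null;
  }

On the collection side, comparing completion rates and median durations per cohort tells you whether the new version actually performs better for real users.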

Put UAT First, or Come in Last
While automated tools definitely have their benefits and contribute much to the performance testing process, there's nothing more valuable than the experience and feedback of real users. You'll be able to deal with problems early on and save time. If you've ever tried to fix problems after a program is completely developed, you know what a nightmare it can be. User acceptance testing, even though it may involve only a sample of your users, can easily be extrapolated to improve the experience for all of them. Remember, the greats recognize UAT as a priority. Why not join them?


More Stories By Tim Hinds

Tim Hinds is the Product Marketing Manager for NeoLoad at Neotys. He has a background in Agile software development, Scrum, Kanban, Continuous Integration, Continuous Delivery, and Continuous Testing practices.

Previously, Tim was Product Marketing Manager at AccuRev, a company acquired by Micro Focus, where he worked with software configuration management, issue tracking, Agile project management, continuous integration, workflow automation, and distributed version control systems.
