When you’re commissioning a new Umbraco site, it pays to start planning early for the task of testing what your developers deliver. A planned and thoughtful approach to User Acceptance Testing can reduce the time taken, ensure better coverage of the system, and ultimately deliver a better end result.
During the early product planning stages, it’s worth discussing acceptance testing to understand the agency’s standard approach. Will they develop the system in an iterative way and expect you to contribute to acceptance testing during the project, or will they instead wait until the system is complete and expect Acceptance Testing of the whole system? It’s also useful to understand what testing environment they’ll provide, and what data will be loaded into the system. Another useful discussion point is whether the testing environment is for your exclusive use or if it’s used for other testing or development activities.
You should also agree what your strategy for Acceptance Testing will be: which areas of the application you’ll be testing, and how tolerant of errors you can be. For example, you may allow the site to go live with a small number of failed tests, but some failures could be much more critical - potential showstoppers.
Once the strategy is agreed, it’s worth planning the actual tests that will be executed. The more planning that can be done in advance, the simpler life becomes when the rubber hits the road during testing. We use our Test Planning template to support clients working through Acceptance Testing. The approach we recommend is to methodically work through the Prioritised Requirements List (PRL) and Traceability Matrix to identify the key requirements of the system, and then identify tests that evaluate whether each feature has been implemented as expected.
Start with positive tests: ones that check that when the user follows the expected path, the system responds in the way defined in the PRL.
You should end up with at least one test for each requirement in the PRL - more if your requirements defined multiple options or journeys. You may also have multiple tests for the same requirement if a feature is available to different classes of user, or if there are different types of data involved. For example, when testing our PulseMove fitness system for Pulse Fitness recently, we had a requirement for fitness equipment users to record an individual exercise. This evolved into two different tests - one for Cardio equipment and one for Strength equipment - as they involve quite different types of data.
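To make this concrete, here’s a minimal sketch of how tests might be traced back to PRL requirements - the requirement ID and scenario wording are hypothetical, and we’ve used Python purely for illustration (in practice this is just rows in a spreadsheet):

```python
from dataclasses import dataclass

@dataclass
class PlannedTest:
    requirement_id: str  # the PRL requirement this test traces back to
    scenario: str        # short description of what is being tested

# One PRL requirement can expand into several tests when the data differs,
# as with the PulseMove "record an individual exercise" example above.
plan = [
    PlannedTest("PRL-042", "Record a Cardio exercise (time and distance data)"),
    PlannedTest("PRL-042", "Record a Strength exercise (sets, reps and weight)"),
]

for test in plan:
    print(f"{test.requirement_id}: {test.scenario}")
```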
Once all the positive tests have been planned, you can think about negative testing: testing to make sure that the system can deal with users misbehaving. You will more than likely end up with many more negative tests than positive ones - this makes sense, as there are usually many more ways for users to get things wrong than to follow the right path.
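Sticking with the same hypothetical exercise-recording requirement, the negative tests might look something like this (all scenarios are invented for illustration):

```python
# Hypothetical negative tests for the same exercise-recording requirement.
# There are usually more ways to misuse a feature than to use it correctly.
negative_tests = [
    "Submit an exercise with no equipment type selected",
    "Enter a negative duration for a Cardio exercise",
    "Enter a non-numeric weight for a Strength exercise",
    "Record an exercise against a user whose session has expired",
]
print(f"{len(negative_tests)} negative tests planned")
```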
When planning out your tests, aim to describe the scenario you are testing - e.g. ‘add new cardio exercise to workout’ - then describe in as much detail as you can how you’ll carry it out.
If you’ve been especially diligent and planned your acceptance testing right after writing your PRL, in the first flush of a project, you’re unlikely to be able to describe the actual test steps in detail. That detail can be added later, but don’t skip this step: if you describe the test steps in sufficient detail, you should be able to delegate your testing.
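Here’s a sketch of what one fully written-up test might look like once the steps are filled in - detailed enough that someone unfamiliar with the project could execute it. The steps and labels are invented, not taken from a real test plan:

```python
# One fully written-up test: a scenario plus step-by-step instructions
# detailed enough to hand to someone else. All details are illustrative.
test = {
    "scenario": "Add new cardio exercise to workout",
    "steps": [
        "Log in to the test site as a standard gym member",
        "Open today's workout from the dashboard",
        "Choose 'Add exercise' and select equipment type 'Static Bike'",
        "Enter a duration of 20 minutes and a distance of 8 km, then save",
    ],
}

for number, step in enumerate(test["steps"], start=1):
    print(f"{number}. {step}")
```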
It’s also important when writing the tests to plan all the data you’re going to need. If you’re upgrading an existing site full of user data, you might want access to that data for testing - remembering that you’ll probably need to arrange for your developers to cleanse it of any personally identifiable information, unless system testing is allowed under your GDPR privacy notice. You should also plan, on a test-by-test basis, which individual data items you’ll use during testing, e.g. test adding exercises for User 1234 on 12-Aug-2018 using exercise type Static Bike.
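Continuing the sketch, each test might carry its own data block, echoing the example above (the user ID, date and exercise type are placeholders, not real data):

```python
# Per-test data plan. All values are made-up placeholders.
test_data = {
    "user": "User 1234",            # a cleansed test account, no real PII
    "date": "12-Aug-2018",
    "exercise_type": "Static Bike",
}
print(test_data)
```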
Lastly for the planning phase, make sure you describe in detail the expected behaviour for each test. For positive tests, you’ll be describing the planned outcome of your scenario, e.g. new exercise appears on the user’s timeline in the website and mobile app, plus summary activity info for the user and their gym is updated. For negative tests, the expected behaviour will be how you expect the site to deal with the misuse - e.g. if the user doesn’t have a medical certificate on file in the web app, an error will be generated preventing further use of the equipment.
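Putting it together, the expected behaviour can be recorded alongside each test. The wording below mirrors the examples in the text; the structure is just one possible layout:

```python
# Expected behaviour recorded for a positive and a negative test.
expected_behaviour = {
    "positive": (
        "New exercise appears on the user's timeline in the website and "
        "mobile app, and summary activity info for the user and their gym "
        "is updated"
    ),
    "negative": (
        "If the user doesn't have a medical certificate on file, an error "
        "is generated preventing further use of the equipment"
    ),
}
```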
We find it helpful to update the Traceability Matrix with the user acceptance tests, so that we know the system has been comprehensively tested by both us and the client.
Finally, after all this planning, you can get on with actually executing the tests. Use the test plan to track the progress of testing. If the system exactly matches the expected behaviour in the test plan, the test is a pass; otherwise, it’s a fail. All fails should be tracked on the plan, along with the issue or bug reference your developers give you.
Of course, sometimes you’ll execute a test and, even though the result matches the expected behaviour, you may realise that the test itself was wrong. Here, it’s worth having a conversation with your team, updating the expected behaviour, and executing the test again.
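As a final sketch, tracking execution results against the plan might look something like this - the status values and issue-reference field are assumptions about how you’d record fails:

```python
# Tracking execution results against the plan. The status values and
# issue-reference field are assumed conventions, not a fixed format.
results = [
    {"scenario": "Record a Cardio exercise",   "status": "pass", "issue": None},
    {"scenario": "Record a Strength exercise", "status": "fail", "issue": "BUG-117"},
]

failed = [r for r in results if r["status"] == "fail"]
print(f"{len(results) - len(failed)} passed, {len(failed)} failed")
for r in failed:
    print(f"  {r['scenario']} -> {r['issue']}")
```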
We provide our clients with a copy of our test plan template to enable them to plan and track their user acceptance testing. We like to use Google Sheets but are open to other options.
Next time you commission a new website, make sure you contact us to find out more about how we capture and control the testing of Umbraco projects. And be sure to take a look at our ‘What happens when…my Umbraco agency doesn’t have good testing processes?’ blog post for more on this.
Lastly, if you’re in the throes of dealing with bad buying decisions from the last time you bought an Umbraco site - perhaps because the testing wasn’t well controlled - you might want to consider using our Health Check service to get things back under control.