More Than Just Checklists
It's the part of the campaign execution process that always takes the longest, yet it is also the task most frequently rushed or skipped. QA isn't glamorous, but it is critical to a successful launch. Surveys show that testing is the aspect of the production cycle marketers struggle with most. Finding the right balance is hard: too much time spent testing reduces agility and increases costs, while too little lets errors routinely slip into live campaigns.
Your CEO is more likely to remember that one mistake than the hundreds of successful campaigns. Failing to spot a critical error can have financial consequences for a business, yet the time needed to hunt those errors down is a cost few feel they can afford to invest. More common are the reputational costs of a mistake, which can range from unnoticeable to headline news, depending on the audience and the severity.
Mind the Gaps
Ultimately, time and money can only do so much. Testing a website or email nurture is a complicated business, with a thousand different items to check. Having a checklist of the important things is essential, but only if it is actually used properly. No one can remember everything that needs to be reviewed, so listing the items out and checking them off one by one goes a long way towards making sure everything is covered. Too many testers treat checklists as a compliance exercise and then wonder why mistakes aren't caught. It is very easy to become complacent if you complete a checklist regularly, so each round of testing should be treated as if it were the first you had ever conducted. A checklist is only useful if each item on it is genuinely tested as it is checked off, every time, whether it is your first run through the list or your thousandth. Completing a checklist from memory leads testers to skip or forget specific items, and inevitably these become the errors that land you in trouble with your boss further down the line.
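One way to enforce that discipline is to treat the checklist as structured data rather than a sheet of ticks, so items can only be signed off individually and anything outstanding is surfaced before launch. The sketch below is purely illustrative (the class names and example items are my own, not a real tool):

```python
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    description: str   # e.g. "All links resolve to the correct landing pages"
    checked: bool = False
    notes: str = ""

@dataclass
class Checklist:
    campaign: str
    items: list[ChecklistItem] = field(default_factory=list)

    def check_off(self, description: str, notes: str = "") -> None:
        # Items can only be ticked one at a time, never in bulk,
        # nudging the tester to actually test each one as it is checked.
        for item in self.items:
            if item.description == description:
                item.checked = True
                item.notes = notes
                return
        raise KeyError(f"No such checklist item: {description}")

    def outstanding(self) -> list[str]:
        # Anything still unchecked blocks sign-off for launch.
        return [i.description for i in self.items if not i.checked]
```

A launch gate can then be as simple as refusing to send while `outstanding()` is non-empty.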
At the same time, it is important not to focus too much on QA checklists. Testing only against the checklist is the other classic error that leads to trouble in the long term. No checklist is ever complete, because testing a marketing campaign is too complicated to fit on a short document. A checklist can confirm that the links work or that the legally required small print is present. It can't check whether the copy reads well or whether the layout leaves enough white space for the focus and content to be clear. These things are matters of personal taste or corporate standards rather than objective truth. Checklists deal in the black and white of right or wrong, whereas design, copywriting and positioning belong to the shades of grey. There is no hard pass or hard fail when reviewing copy or checking design, because there is always scope to improve these things.
Multi-Stage Testing
Testing is a multi-stage process. It generally requires a creative review to confirm that the campaign is likely to encourage responses, a content review to ensure the right message is being used, and a technical QA to confirm that the campaign has been configured correctly in the platforms that will execute it. These are separate processes requiring different types of analysis, often by different people. Checklists should only be used for the technical QA phase, because the design and content checks are too subjective to fit a standardised list. It's important to make sure these checks are done, and the checklist can confirm that, but a checklist is no substitute for actually reviewing the final collateral from a design, product or user perspective to ensure it meets the requirements of the campaign.
The situational nature of content checks means that adding design or copy considerations to a checklist can result in a multi-page list of items to review. Human nature dictates that if your checklist can't fit on one sheet of paper, it's probably too long. Overly long checklists are the number one reason people avoid filling them in: they simply slow the QA process down too much. Checklists shouldn't cover every little detail. They are a guide listing the components that need to be reviewed; the tester should have sufficient knowledge of what they're testing to understand the nuances of each component, and sufficient knowledge of the campaign to know what the correct setup for those components should be. For an email campaign checklist, it can be tempting to list the sender name and from address separately, or the individual items that need to go into the footer, but this just adds overhead when the tester will be reviewing the footer or the sender details as a whole anyway. Do add a specific checklist entry for items such as the unsubscribe link, though. Testers frequently skip over the unsubscribe because it appears in every email, so it is assumed to be correct. Yet I have seen plenty of emails where it has been left off by the designer or corrupted whilst coding the email. The only way to catch these occurrences is to test the unsubscribe process on every email.
End to End
This illustrates the importance of doing an end-to-end test of every aspect of a campaign prior to launch. If you're relying on a critical marketing automation workflow or an existing web page, make sure they still work before going live, and check that the end reports show the data they're supposed to. There's nothing worse than running an expensive campaign, only to discover that the lead capture forms you're relying on aren't working properly, or are missing a critical piece of information required to link the generated leads back to your campaign. In doing so, you will also confirm that the user experience is clear and easy to use, and that is ultimately the most critical test of all. If the expected next step in a campaign journey is ever unclear, your audience will lose interest and move on to something else. This is perhaps the number one thing to do when testing a campaign: put yourself in the shoes of your audience, and experience the campaign as if you had never seen the content before and weren't familiar with its context. Retrace all the likely user journeys, and make sure they work correctly and that the end result meets the campaign objectives you've been set. If this works, then the campaign is probably ready for launch, assuming it also passes the checklist, that is.
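The "link the leads back to the campaign" part of an end-to-end check can also be scripted: walk every URL in the planned journey and confirm each one carries the tracking parameter your reporting relies on. A minimal sketch, assuming a `utm_campaign` query parameter as the tracking mechanism (substitute whatever your analytics platform actually expects):

```python
from urllib.parse import urlparse, parse_qs

def journey_issues(journey: list[str], campaign_id: str) -> list[str]:
    """Return a description of each URL in the journey that is missing
    the campaign tracking parameter, so generated leads would not be
    attributable to the campaign. An empty list means tracking is intact.
    """
    issues = []
    for url in journey:
        params = parse_qs(urlparse(url).query)
        if params.get("utm_campaign") != [campaign_id]:
            issues.append(f"missing or wrong utm_campaign: {url}")
    return issues
```

This doesn't replace retracing the journey by hand, but it catches the quiet attribution breakages that a purely visual walkthrough tends to miss.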