In this final post of our quality assurance series, we'll discuss how we test the actual user experience: if you press a button, does it do what you expect? What happens if you press button A, then button B, then button A again? As you can imagine, there are many different combinations of scenarios that need to be tested.
We have been developing DVR Examiner 3.0 with a suite of automated user interface tests planned from the start. These tests step through predefined workflows and verify that the buttons and other components of the user interface do what they're supposed to do. Because the tests are automated, we can run through hundreds of different scenarios every time we make a change or update something.
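To give a rough idea of what a workflow-style UI test can look like, here is a minimal, self-contained sketch in Python using pytest. Everything in it is hypothetical: the FakeExamApp class, the button names, and the screens are stand-ins for illustration only, not DVR Examiner's actual interface or test harness. A real automated UI test would drive the application through a UI automation framework rather than an in-memory fake.

```python
# Minimal sketch of a workflow-style UI test using pytest.
# FakeExamApp is a hypothetical stand-in for a UI automation driver;
# it simply records which screen the "user" is on.
import pytest


class FakeExamApp:
    """Tiny in-memory stand-in for the application under test."""

    def __init__(self):
        self.screen = "home"

    def click(self, button):
        # Map (current screen, button) pairs to the screen that should open.
        transitions = {
            ("home", "Scan Drive"): "scan",
            ("scan", "Start"): "results",
            ("results", "Back"): "home",
        }
        key = (self.screen, button)
        if key not in transitions:
            raise AssertionError(
                f"'{button}' is not available on the '{self.screen}' screen"
            )
        self.screen = transitions[key]


# Each predefined workflow is a list of (button, expected screen) steps.
WORKFLOWS = [
    [("Scan Drive", "scan"), ("Start", "results")],
    [("Scan Drive", "scan"), ("Start", "results"), ("Back", "home")],
]


@pytest.mark.parametrize("workflow", WORKFLOWS)
def test_predefined_workflow(workflow):
    app = FakeExamApp()
    for button, expected_screen in workflow:
        app.click(button)
        assert app.screen == expected_screen
```

Because each workflow is just data, adding another scenario is a one-line change, which is what makes it practical to re-run hundreds of them on every build.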
With more automated user interface testing in place, our team is freed up to do more meaningful manual testing. Instead of merely checking whether button A works, which can be handled automatically, they can test how entire processes feel to the user. They can focus on questions like "is it tedious to use a particular feature?" and "are there buttons that feel out of place?", questions an automated user interface test would never care about.
It is sometimes easy to become hyper-focused on testing each individual component and lose sight of the goal of producing a quality product. In fact, a common joke in the field goes something like this:
A QA engineer walks into a bar. Orders a beer. Orders 0 beers. Orders 99999999999 beers. Orders a lizard. Orders -1 beers. Orders a ";kljkljsakljfdsakljfds".
The first customer walks into the bar and asks where the bathroom is. The bar bursts into flames.
Our goal as a company is to produce a quality product and avoid the bar fires whenever possible. This takes a great deal of effort from our development and quality assurance teams, but it also relies on the rest of our team documenting and communicating the issues users experience. We're also very lucky to have users who have been a tremendous asset when it comes to reporting issues, providing data, and testing resolutions. When an issue occurs in a single video frame on an entire 2 TB drive (true story), it isn't always possible to predict it in advance, but we are able to resolve issues like this in large part thanks to the support of our users – so thank you!
We'll be moving on from quality assurance in our next post, but stay tuned for more behind-the-scenes info in the Dev@DME series soon. Until then, stay safe and healthy!