Quick and Dirty Usability Tests
I am a fan of quick and dirty usability tests. They are more accessible (read: budget- and time-friendly) than full, formal ones, and can be done even if you are a solo founder.
Sometimes, people mistake “quick and dirty” for “free flow”, i.e. just show users the product and collect some feedback. That’s not true! In fact, running a quick and dirty usability test requires an ultra-lean, focused design to make the most of limited resources. Data quality should never be compromised.
In this post, I will describe how to design and conduct a well-structured quick and dirty, no/low-cost usability test.
Step 1: Define the goals
Start by deciding what you are evaluating - the whole product or a specific part. For instance, if analytics show that users are dropping off during onboarding, zooming in on the onboarding flow would be a good scope.
Step 2: Recruit participants
In formal usability tests, much attention is paid to recruiting the right participants, i.e. those who are representative of the target audience or are even actual customers.
In quick and dirty testing, you will likely recruit through informal channels (friends and family) and may not have the luxury of a truly diverse group of fully representative participants.
How much would that impact your data quality? It depends.
The impact is minimal if you are evaluating universal features that anyone would use in the same way. For instance, one’s age or professional background has no bearing on how well one can read light grey text on a white background (poor readability).
The impact is significant if specific user characteristics or domain knowledge are required. For instance, testing the tax function of a tool for professional accountants will require your participants to be actual accountants.
Step 3: Map out the ideal user paths
Mapping out the ideal user flows allows you to do two things:
clarify the expected user paths as designed
create test scenarios accordingly
This may already be documented (e.g. in a product spec) or may still be “in your head”, waiting to be spelled out.
Step 4: Create realistic test scenarios
Based on the user flow, create test scenarios (tasks) in real-life language, such as:
“You just downloaded the app. Set up an account.”
“Add a picture to your profile.”
“You received a notification that you have a new message. Read and reply to the message.”
The goal is to break down a real-life user experience into testable chunks.
During the test, participants will carry out each task while you assess:
if they can complete the task
what they think or feel as they are carrying out the task
Be creative - the goal is to put the participant in the right frame of mind, as if they were interacting with your product on their own. One thing I like to do is provide an imaginary background story, such as “Finally you are taking a vacation! You have decided to have a nice relaxing dinner, then sit down to spend the evening planning your itinerary.”
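If it helps to keep the pieces organized, below is a minimal, purely illustrative sketch (in Python - a plain spreadsheet works just as well) of how the flow from Step 3 and the scenarios from Step 4 could be written down as a simple checklist. The task wording and success criteria here are hypothetical examples, not prescriptions.

```python
# Purely illustrative: a lightweight way to write down test scenarios (Steps 3-4).
# Task wording and success criteria are hypothetical examples.
from dataclasses import dataclass, field


@dataclass
class TaskScenario:
    prompt: str               # what you read out to the participant
    success_criterion: str    # how you will judge the task as completed
    observations: list[str] = field(default_factory=list)  # filled in during the session


scenarios = [
    TaskScenario(
        prompt="You just downloaded the app. Set up an account.",
        success_criterion="Reaches the home screen with a new account",
    ),
    TaskScenario(
        prompt="Add a picture to your profile.",
        success_criterion="Profile shows the uploaded picture",
    ),
    TaskScenario(
        prompt="You received a notification that you have a new message. Read and reply to the message.",
        success_criterion="Reply is sent from the message thread",
    ),
]

# Print a checklist you can keep beside you during the session.
for number, task in enumerate(scenarios, start=1):
    print(f"Task {number}: {task.prompt}")
    print(f"  Done when: {task.success_criterion}")
```

The point is not the tooling but the discipline: every task has a clearly stated completion criterion before the session starts.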
Step 5: Plan your introduction script
Below is a sample introduction script:
Thank you for taking the time to help us today.
You are helping us evaluate <product name>. You will be given a list of things to do with it. Please think out loud along the way so I know what you are doing and thinking.
I am here to take notes for the team.
There are absolutely no right or wrong answers - we are here to learn how users like yourself interact with <product> so the team can learn and improve.
Note: even for casual usability tests, it is advisable to follow the legal protocol of seeking participants’ consent. This is especially important if you are recording the session or will be using AI to capture notes.
Equally important is to include a warm-up period, easing your participant into the procedure.
Step 6: Set-up and note-taking
I won’t go into detail here because each test setup is unique. Goals to bear in mind:
provide an environment as close to real life as possible (though sometimes that is not feasible)
make it easy and comfortable for the participant
adopt a note-making method that best suits the setup and your personal preference (a minimal sketch follows this list)
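As one example of a note-making method, here is a small, purely illustrative Python sketch that generates a blank per-participant note sheet as a CSV file, one row per task. The task names, column headings, and file name are all hypothetical and should be adapted to your own test.

```python
# Purely illustrative: generate a blank note sheet (CSV) per participant,
# one row per task, so observations land in a consistent structure across sessions.
# Task names, column headings, and the file name are hypothetical examples.
import csv

TASKS = [
    "Set up an account",
    "Add a profile picture",
    "Read and reply to a new message",
]


def write_note_sheet(participant_id: str, path: str) -> None:
    """Write an empty note-taking sheet for one participant."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["participant", "task", "completed (y/n)", "observations", "quotes"])
        for task in TASKS:
            writer.writerow([participant_id, task, "", "", ""])


write_note_sheet("P01", "notes_P01.csv")
```

Whether you fill this in on a laptop or print it out and write by hand is entirely a matter of preference - consistency across participants is what makes the notes easy to compare afterwards.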
Step 7: Conduct the test
Your job is to observe and take notes. Unlike focus groups, where facilitators steer the discussion, a usability test facilitator should be as invisible as possible. The participant takes the lead and ideally should act as if they were on their own.
If a participant faces difficulty (e.g. getting lost in navigation), do not jump in to “help” immediately. This is valuable data and a great time to ask an open question such as “You look puzzled - could you tell me what’s on your mind?”
You may discover that users do not follow the designed user flows - they may even get distracted and wander off task. Again, this is valuable data, as it represents how a user may experience your product in real life. If and when appropriate, gently prompt them back to the task list, but do mark down the digression as an observation.
Take-aways
You don’t need a lab or a budget to learn from your users. With thoughtful design and a little creativity, even a scrappy usability test can reveal where your product shines - and where it falls short of expectations. So start small, stay curious, and keep testing.