In sprints three and four we will be testing our prototypes with users.
Here's how to do it.
What we mean by prototype testing
- It's the process of testing a prototype with users!
- It's not technical testing or what's sometimes called 'technical proof of concept' testing. That involves checking that it's possible to get data from a to b securely or to connect to an API or testing some other technical assumption.
- It's not usability testing. Well, it's partly usability testing, because we do want to know whether the user can use this kind of solution. But it's higher level than proper usability testing: our goal is to get feedback on the concept more than on the detail of the design. We are looking for answers to questions like "Do they understand what's on offer here?" or "Is this style of navigation or content design appropriate to their needs?". We can look at questions like "Should we use a dropdown or a search box?" or "Can they browse this list of 20 items?" later.
- It's not user testing. Well, some people call it user testing. But we're not testing users. We're testing our prototyped solutions. So we prefer to call it prototype testing.
Why we do it
- We use prototype testing to understand:
- How well our solution idea meets the needs of users.
- Whether they understand our solution.
- Whether they can use it (i.e. if the flow, navigation, content formats, calls to action etc make sense to people and enable them to achieve their goal).
- How people respond to certain language, imagery or other aspects of our proposed solution.
When we do it
- Yesterday! Or as soon as we have a prototype that will enable us to answer the above questions. If we can answer the questions with less effort, let's do that. Faffing around with fonts or image choices is a waste of effort if nobody is interested in your thing!
- Ideally we test little and often. We design a bit, we get some feedback, we repeat. However this is not always logistically easy, so we may decide to pick off a lot of questions in one or two rounds of research. That's what we're doing in this programme. It's a balance between the cost of setting up multiple sessions and the risk of spending more time designing without feedback from users.
Planning and running a testing session
Moderated vs unmoderated
- There are two types of test: moderated (you set your participant a task and watch live what they do, asking the occasional question) and unmoderated (you set up a task on a website that participants complete in their own time, and you watch a video of the session afterwards).
- When testing early prototypes of interactive experiences we usually recommend a moderated test, because it enables you to probe on interesting themes as the session happens.
- Unmoderated tests are better later, when you just want to see if a specific piece of functionality works for people, as they allow you to do many more tests with the same resources. There are many platforms offering this service, including What Users Do.
- We usually allow an hour for a moderated test. 45mins is ok. 30mins is possible but pushing it.
What happens in a moderated session
- The moderator will introduce the session and ask a few open questions to understand the participant's background and experience in relation to the area of focus. Here's a link to a great little video describing each section.
- They will introduce the participant to a scenario and set them a task: "Imagine you had just won the lottery and you are looking for ways to hide your money from the tax man. You've landed on this website. What would you do next?"
- As the user clicks through the prototype we ask them to 'think aloud': "I am wondering where the link to the contact page is; usually it's in the top right somewhere". We might probe for more info too: "I noticed you scrolled down to the bottom and back up to the top again, what was on your mind at that moment?".
- Ideally the session is recorded and someone should be taking notes.
- You may have time to look at more than one scenario in a session.
How to prep for a moderated session
- Create a discussion guide - this is an outline of the session including an introduction, opening questions, the choice of task(s) and a set of wrap-up questions.
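As a rough illustration (the timings and wording below are our own suggestions, not a fixed template), a discussion guide for a one-hour moderated session might look like:

```text
Discussion guide: [project name], 60-minute moderated session

1. Introduction (5 mins)
   - Thank the participant and explain the purpose of the session
   - Remind them we're testing the prototype, not them
   - Confirm consent and that recording is OK
2. Opening questions (10 mins)
   - "Tell me a bit about your role and how you currently do X"
   - "When did you last need to do X? What happened?"
3. Task 1 (15 mins)
   - Scenario and task wording, read out consistently each session
   - Prompts to probe with, e.g. "What would you expect to happen next?"
4. Task 2 (15 mins, if time allows)
5. Wrap-up (10 mins)
   - "What stood out to you?" / "Would you use something like this?"
   - Thank the participant and explain next steps
```

Keeping the scenario and task wording written down word-for-word helps you run each session consistently across participants.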
- Get consent - we need to make sure the participant knows what they are doing and how the content created from the session (their image, voice, the details of what they say, their personal data like email and name) will be stored and managed. So before we start the session we provide a consent form and a moment for them to ask questions or share any concerns.
- Invite people (see below)
Who should be in the session
- The participant!
- A moderator / interviewer - this is the member of your team who'll be guiding the user through the process.
- A primary note-taker - this person captures quotes, detailed notes and observations.
- Observers - these may be other interested members of your team. It's good to get these folks making notes and observations too. More on that below.
What if it's just me?
- If you can record the session and rewatch the video then you can do it all yourself 🙂 (and watching the recording is a good idea anyway as you always pick up more interesting stuff).
Technology for remote testing
- Must have
- Watch what the user does with your prototype
- Hear what they say about it
- Ideally see their face as they say things (facial expressions give a lot away)
- Be able to give instruction
- Nice to have
- Record the session on video
- Have notes added to the video timeline
- Hidden observers
- There are various tools out there for remote testing.
- lookback.io is good. It allows for hidden observers and enables collaborative note-taking on a timeline. But it costs $99 per month for the simplest plan.
- Otherwise you can use MS Teams, Zoom or another video conferencing platform that enables the user to share their screen (with your prototype on it) and keep their video stream going.
What about testing a mobile experience?
- Lookback has an app that the user installs prior to the call which works well.
- Zoom's app can't share a mobile screen and the user's video at the same time, but you still get the audio, so that's not bad.
- Whereby can't share screen from mobile, so no good.
- Not sure about Teams on a mobile!
- The simplest solution is to have the user view the mobile experience on the desktop/laptop as that enables you to use whatever videoconferencing platform you prefer. Prototyping apps like Figma provide a mobile view that the user can interact with (see image below). This is fine for testing early concepts and content, but no good for testing the usability of a phone-based interaction, like creating a calendar event whilst on a bus.
Example Figma prototype viewed on a desktop computer.
Who you should test with
- Aim for 5 users in each target user-group. A user-group is a group of users who you feel have similar goals and needs.