It is always interesting to create something new. However, it’s even more exciting to observe how people are using your product. Recently, a group of enthusiasts – Galyna Kostetska (PM) and Ivan Klymenko (designer) – decided to conduct usability testing of our mobile app.
Today, Galyna shares her experiences, tips, and lessons learned from this interesting process:
Every time we design and develop a mobile app, we talk about conducting usability testing in the field to find out how good our initial ideas and interfaces really are. As it turns out, making this a reality doesn't take much. It's enough for one person to get inspired by the idea and say: "Let's do it!" This is what happened on one of our projects, where we were working on an app with a catalog and advanced search/filter options.
How it all started
Before going into wireframing, we studied the business domain. Then we wrote a detailed description of the typical user for our case and created a persona, describing the user's needs, behaviors, and goals. This was followed by a brainstorming session, after which we created wireframes. The initial version was sketched on a whiteboard; the detailed wireframes were built in Axure.
To reach the final version of the wireframes, we went through a couple of iterations of internal testing and feedback, involving our customer as an expert in the business domain.
I must say that a designer working alone will rarely come up with a concept and wireframes that all parties accept. Involve the whole team!
After a couple of iterations, you will be able to finalize the concept and put it to the test with real users.
Why usability testing?
Was our persona accurate? Is our interface really simple and intuitive? These questions can only be answered with a little research; it was the only way to create a truly high-quality product. For a catalog app, the speed with which a user can find relevant information is critical. Our client was very cooperative and gave us the opportunity to meet with end users of the product, spending about 40 minutes with each volunteer.
Before testing starts, you need a clear plan for how you will collect and process the data.
To collect data, we created two blocks of specific scenarios for the users. Each block contained the same number of tasks of comparable complexity. Here are a couple of examples:
• What is the difference between product N and M?
• Which family does product N with a frequency of 1.2 GHz belong to?
• Find the product with 8MB of cache and Turbo Boost.
First step: we handed each participant a device with our app and watched how quickly they completed the tasks, taking notes on every question and difficulty. The most difficult questions were related to the product-comparison feature, although it had seemed so obvious to us before! :)

Less than half of the users were accustomed to Android smartphones, which affected the time needed to complete the first block of tasks. After each task, together with the participant, we rated the complexity of the feature and its design on a scale from 1 to 10. Once the first block was finished, we answered all the users' questions and gave a short training session, covering the features they had not discovered and how to use them.
Then we started the second round. As we had hoped, the results were much better, because the users had already been trained to work with the application. They spent less than half the time completing the tasks compared to the first round.
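A round-over-round timing comparison like this is easy to compute once the per-task times are recorded. Here is a minimal sketch of that aggregation; the task names, participant counts, and timings below are hypothetical, invented purely for illustration:

```python
# Hypothetical sketch of comparing task completion times across two
# usability-testing rounds. All task names and timings are made up.
from statistics import mean

# Completion times in seconds, one entry per participant, per task.
round_1 = {"compare products": [210, 185, 240], "filter by cache": [150, 170, 130]}
round_2 = {"compare products": [95, 80, 100], "filter by cache": [60, 75, 55]}

def summarize(before, after):
    """For each task, return round-2 mean time as a fraction of round-1 mean time."""
    report = {}
    for task in before:
        t1, t2 = mean(before[task]), mean(after[task])
        report[task] = round(t2 / t1, 2)  # e.g. 0.43 means 57% faster
    return report

print(summarize(round_1, round_2))
# With the sample data above, every task comes out below 0.5,
# i.e. less than half the original completion time.
```

Even a simple fraction-of-original-time metric like this makes it easy to see which tasks improved most after training.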
Scenarios and timing are very important!
We had 6 users participate in the testing, which was enough to collect basic information about the strong and weak points of the app's UI.
Frankly, we hadn't anticipated that users would make so many useful comments, or that the interface that seemed so clear and "intuitive" to us would turn out to be not so simple for regular users.
The result was a list of new features and changes. We also learned what data was missing from the catalog and what should be prioritized.
Rest assured, it is very useful to run testing as soon as you have the first cut-down version. Ideally, involve users as early as the design phase and the initial brainstorming.