Helping new users start their cold water challenge on the Wim Hof app.
The Wim Hof app lets users take on challenges based on the Wim Hof Method, such as breathing exercises and cold water therapy. This case study focuses on making it easier for new users to complete their first cold water challenge.
- User Research
- Product Strategy
- UI Design
- Interaction Design
- Usability Testing
Currently, the Wim Hof app is overwhelming for new users, with too many options presented on the home screen. Additionally, from a business perspective, Wim Hof wants to improve the rate at which new users convert to paid members.
To solve the problem from a user perspective, my teammate and I designed a more streamlined, straightforward flow for a new user starting their cold water challenge.
To solve it from a business perspective, we created multiple opportunities within the app to opt in to Wim Hof's premium plan.
The first thing my teammate and I did was conduct a usability review of the Wim Hof app. This helped us identify the app's common pain points, which would inform our ideas for solving or easing them.
At the same time, we identified "WOW" moments within the app: elements or moments that generate a positive reaction, from the illustrations used to the consistency of the colour scheme. Identifying "WOW" moments also helped us decide which elements to keep or improve in our solution.
Business & User frustrations
The app didn't give the user clear instructions for starting their cold water challenge; in other words, its onboarding was insufficient. Additionally, a lack of visual hierarchy and a confusing layout made users likely to give up, which negatively impacts the business.
Supporting information, such as the steps completed and details about the cold water challenge, was presented in a cluttered and unorganised manner, which risked users abandoning the app entirely.
Competitor benchmarking is an essential step in any user research process, as it gives us insight into the commonalities and differences between our product and competitors'. From this, we can identify trends and see how users interact with similar products.
Defining the problem space ensures we focus on a solution for the users directly affected by the problem; in this case, those new to the cold water challenge. Having conducted the usability review, competitor benchmarking and further research, we were able to summarise their specific problem:
How might we make it easier for new users to complete their first cold water challenge?
Creating a mind-map helps us put down ideas for improving existing features or adding new ones. It's a great way to involve everyone and gather ideas from multiple perspectives.
We did this so the ideas could be discussed together as a team, allowing us to collectively decide which to take forward based on business and user goals.
Using the results of the mind-mapping session, we fed our findings into a Crazy 8s session, rapidly sketching eight ideas in eight minutes, one minute per idea. Crazy 8s is a great technique for getting everyone's ideas down in visual form and producing multiple solutions to the problem.
We did this so we could generate several candidate solutions and discuss which ideas to move forward with.
The priority matrix gave us a crystal-clear view of which features to focus on first, based on effort and impact. We used it to decide which features to add to the app to help increase the conversion rate.
What can we add
A clear and prominent call-to-action (CTA) that will help the user start their first cold water challenge.
What can we improve
Improving on the session setup experience to make it as easy to navigate and configure as possible.
Creating user flows helps us visualise the path the user takes to achieve their goal. We mapped out the current user flow, identified potential improvements to the interfaces, and created an improved flow that helps the user reach their goal faster and more efficiently.
With the improved user flow finalised, we started prototyping our screens in lo-fidelity. Starting with lo-fidelity is important, as it is much easier to sketch and edit our designs at this stage than during the high-fidelity stage.
Styles & Components
Adopting the Atomic Design method allowed us to create a library of reusable interactive components. We did this using component properties, a Figma feature that lets you set different properties on components. Every component has a state: Default, Hover, Focus, Inactive and Disabled. It's important that each state is designed to give users feedback, and it also makes the handover to developers easier.
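One way a component state system like this might be handed over to developers is as a typed map from state to style values. The names and values below are hypothetical illustrations, not the actual design tokens from this project:

```typescript
// Hypothetical encoding of the five component states for developer handover.
type ButtonState = "default" | "hover" | "focus" | "inactive" | "disabled";

interface StateStyle {
  opacity: number;   // example feedback value per state
  outline: boolean;  // whether a visible focus ring is shown
}

// Example values only; a real handover would use the project's tokens.
const buttonStyles: Record<ButtonState, StateStyle> = {
  default:  { opacity: 1.0, outline: false },
  hover:    { opacity: 0.9, outline: false },
  focus:    { opacity: 1.0, outline: true  }, // focus ring for keyboard users
  inactive: { opacity: 0.6, outline: false },
  disabled: { opacity: 0.4, outline: false },
};
```

Encoding every state explicitly means developers can't accidentally skip one, which is the feedback guarantee the design system is trying to make.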
The primary palette consists of the original app's colours, as we didn't want to stray from the brand colours. It is accompanied by shades of grey that complement the palette. Additionally, we made sure to test the text colours against the primary colours following WCAG contrast guidelines.
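The WCAG contrast check can be computed directly from the relative-luminance formula in the WCAG 2.1 spec; AA requires at least 4.5:1 for normal text and 3:1 for large text. A minimal sketch (the hex colours in the test are examples, not the app's palette):

```typescript
// Linearise one sRGB channel (0–255) per the WCAG 2.1 definition.
function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

// Relative luminance of a "#RRGGBB" colour.
function luminance(hex: string): number {
  const n = parseInt(hex.replace("#", ""), 16);
  const [r, g, b] = [(n >> 16) & 255, (n >> 8) & 255, n & 255];
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter colour on top.
function contrastRatio(fg: string, bg: string): number {
  const [l1, l2] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}

// WCAG AA: 4.5:1 for normal text, 3:1 for large text.
const meetsAA = (fg: string, bg: string, largeText = false): boolean =>
  contrastRatio(fg, bg) >= (largeText ? 3 : 4.5);
```

For example, pure white on pure black yields the maximum ratio of 21:1, while a mid grey such as `#777777` on white falls just short of the 4.5:1 AA threshold.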
As for the typeface, we went with Inter, an industry standard that offers great readability for both lower- and upper-case text.
For mobile, we used a four-column grid to keep our content consistently spaced. Additionally, we followed the 4px rule, keeping every spacing value a multiple of 4px so the interface has a consistent rhythm.
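The 4px rule is often enforced by exposing spacing only through a token scale, so no off-grid value can sneak in. A minimal sketch; the token names are hypothetical, not this project's actual tokens:

```typescript
// Hypothetical spacing tokens following the 4px rule:
// every spacing value is a multiple of a 4px base unit.
const BASE_UNIT = 4;

// spacing(n) returns n × 4px, so all paddings and margins stay on the grid.
const spacing = (steps: number): number => steps * BASE_UNIT;

// A small scale a team might expose instead of raw pixel values.
const space = {
  xs: spacing(1), // 4px
  sm: spacing(2), // 8px
  md: spacing(4), // 16px
  lg: spacing(6), // 24px
  xl: spacing(8), // 32px
};
```

Because every value is derived from `BASE_UNIT`, changing the base (say, to 8px) re-scales the whole interface consistently.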
High Fidelity Prototype
With our lo-fidelity prototypes finalised, we created our high-fidelity prototype. Our goal was pixel-perfection, so that when we hand off our designs it will be much easier for developers to convert them to code.
As with every solution, usability testing is vital for finding out whether our solution actually improves the product's experience and usability. During the testing phase we chose unmoderated testing, which let us collect both qualitative and quantitative data with specific feedback in a short amount of time. This allowed us to validate and measure our hi-fidelity prototype with a low-cost approach.
Having tested the prototype, we learned that a majority of users found the sign-up process confusing, resulting in a high error rate. This is not ideal, as it causes many users to drop out of the app, leaving fewer active users and impacting the business. A better-designed sign-up process would alleviate this and encourage more users to try the app.
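Metrics like the error rate above are typically tallied from per-task session records. A minimal sketch of how that might look, assuming a hypothetical result shape (the data below is illustrative, not the study's actual results):

```typescript
// Hypothetical shape for one participant's attempt at one task.
interface TaskResult {
  taskId: string;
  completed: boolean; // did the participant finish the task?
  errors: number;     // wrong taps / wrong paths taken along the way
}

// Completion rate: share of attempts that finished the task.
function completionRate(results: TaskResult[]): number {
  const done = results.filter(r => r.completed).length;
  return done / results.length;
}

// Error rate: average number of errors per attempt.
function errorRate(results: TaskResult[]): number {
  const total = results.reduce((sum, r) => sum + r.errors, 0);
  return total / results.length;
}

// Illustrative sign-up task data for four participants.
const signUp: TaskResult[] = [
  { taskId: "sign-up", completed: true,  errors: 1 },
  { taskId: "sign-up", completed: false, errors: 3 },
  { taskId: "sign-up", completed: true,  errors: 0 },
  { taskId: "sign-up", completed: false, errors: 2 },
];
```

With this sample data, the completion rate is 0.5 and the error rate is 1.5 errors per attempt; numbers like these are what make "confusing" quantifiable across test rounds.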
Furthermore, some users found it hard to start their first cold water challenge from the home screen. Some said the button was hard to find, while others found the hierarchy of elements confusing. Improving the hierarchy and making the button stand out more would increase usability here.
Lastly, users felt the flow for changing the song during a cold water session could be improved. Most notably, they were confused as to whether to press the settings button or the music card at the bottom of the screen. This is another example of where the hierarchy of elements could be improved to give users better visibility of certain functions.
Three key learnings
1. Collecting and analysing user data is essential in creating an accessible and usable product.
2. Visual hierarchy is important for user experience and directly affects the conversion rate of your product.
3. A seamless sign up process is essential as it influences the drop-out rate of your product.
For the next steps, we would compile all the user data we have gathered and look at the major pain points that users experienced while using the app.
We would aim to implement the changes mentioned in the Test Outcomes section and ideate on them further. These changes would be based on the major pain points identified during this round of user testing.
We would then re-evaluate our hi-fidelity prototypes by going through the ideation process again and arrange further user testing with more in-depth follow-up questions. We would repeat this until the design proves more usable and accessible for our users.