Wellbeing Assessment
My role
Product Designer
Duration
3 months
Tools
Figma
Miro
Team
1 Product Design Lead
2 Product Designers
1 UX Researcher
1 Content Writer
1 Product Manager
What is the Wellbeing Assessment?
A four-pillar questionnaire designed to tailor the Alight Worklife experience to users' needs
The Wellbeing Assessment covers Mind, Body, Wallet, and Life, personalizing the platform based on responses to offer tailored content, recommendations, and action items for better work-life balance over time.
The Problem
The assessment is a lengthy questionnaire, and some pillars ask for sensitive information, increasing the likelihood of user drop-off
Given the nature of the questions, a significant percentage of users might not complete the assessment. However, gathering this data is crucial for the algorithm to deliver a more personalized and engaging experience, which enhances user satisfaction and could attract new clients.
User Research
Is this something people are willing to do? Surprisingly, yes
At first, we were hesitant about the product. The assessment was lengthy and asked questions about users' lives and habits, leading us to assume that people wouldn't complete it.
A survey with 200 participants had already been conducted. After reviewing the results, we found that despite the nature of the questionnaire, users were likely to complete it if it provided personalized content.
82%
stated they would likely take part in a finance and health survey if the reward was to receive personalized content
Strategy
The process of designing a form can be messy, so I proposed we follow a strategy based on four key pillars
The original plan was to integrate each section sequentially into a draft. However, early on, I noticed issues with some questions and with the structure. If we continued down that path, similar problems would likely keep surfacing as we progressed, leading to repeated discussions with stakeholders, revisions to already-approved mockups, and wasted time.
Building on Forms that Work by C. Jarrett and G. Gaffney, I proposed a new approach to the team: a plan based on four key pillars, with clearly defined goals and tasks for each stage.
Relationship
Encouraging people to participate and asking for the right information
Conversation
Simplifying questions, writing clear instructions, selecting appropriate form controls, and streamlining the form flow
Appearance
Refining details and making the form look easy
Testing
Testing the work from the previous stages and iterating based on feedback
Relationship and Conversation
The first two pillars required close collaboration across multiple teams, involving discussions to align business and user needs
Relationship and collaboration were the key pillars of this project, as it required working with the content and core structure of the assessment.
Throughout these stages, we worked closely with management, content, accessibility, and the design system teams to reach agreements that would benefit the final product.
Key wins:
Defined objectives and goals
Using the three rules that influence response rates, we defined key areas to focus on to enhance user experience and boost completion rates
Enhanced question quality
Although the management team was initially opposed to changing the questions, our analysis and proposals convinced them to update several of them
Contributed to the design system
Analyzing the existing controls led us to propose a new one for specific answers, resulting in its addition to the design system
Redefined assessment structure
Streamlined the form by dividing it into small topics, prioritizing anticipated questions, and placing less intrusive questions before more personal ones
Appearance
With the core content and structure defined, we assembled all the pieces and designed the assessment's UI
After refining and defining every element of the assessment, we assembled everything to design the interface, which would later be tested with users. Since the assessment used simple components and required a straightforward layout, the design was completed quickly.
Setting up expectations
For the introductory screen, we aimed to set clear expectations by communicating:
- The purpose and benefits of completing the assessment, which, based on research, would encourage participation.
- The time required to complete it.
- The topics covered.
Optimizing form usability
Each section of the form was analyzed in depth and designed to help users complete it easily and without friction.
- Sensitive questions were meant to include specific explanations, but due to complications, we used a general description of each pillar to convey its purpose.
- Insights were requested by the business and sometimes provide valuable feedback on user input. Positioned to the side, they avoid distracting from or cluttering the main content.
Granting users control
The assessment is organized logically, but we gave users the power to complete the pillars in whatever order suited them, increasing the chances of full or near-full completion.
- Each time a section is completed, the screen reinforces the benefits of finishing the assessment.
- Primary buttons highlight completing the remaining pillars as the main task.
Summarizing the next steps
The final step acknowledges the effort made and outlines the next steps and benefits received.
- Since users' information may change over time, they will see this screen when they return, along with the option to retake any pillar.
Usability Testing
With the first version of the assessment, two usability studies were conducted, each with distinct objectives
The first study focused on identifying usability issues and unclear questions, and on evaluating the clarity of navigation, the value of insights, and the reasons why users would take or complete the wellbeing assessment.
Key findings:
- Clearer benefits would increase participant motivation.
- Some questions were still not entirely clear.
- Transition pages were useful.
- Only a few insights were valuable; the rest were ignored or deemed irrelevant.
The second study explored user behavior and reactions to sensitive questions, aiming to understand why some users might not complete the entire assessment.
Key findings:
- Participants were willing to complete it, but reluctance to share personal information might affect the accuracy and completeness of their answers.
- The assessment length was deemed appropriate.
Final Design
Based on the test findings, we revised our initial version to address the identified issues
All findings were listed and prioritized by their impact. Through multiple iterations and feedback sessions, we developed the final version.
Entry point
- Removed distractions and condensed the text into a list to clearly highlight the benefits of completing the assessment.
- Clarified how information will be used to build trust and address concerns about sharing sensitive content.
The sections
- The pillar descriptions were condensed to better communicate each section's purpose.
- Unclear questions were revised based on feedback.
- Since some insights were not always useful but were still required, we reduced their prominence by shrinking them and letting users choose whether to view them.
Prefilled questions
- To reduce effort, questions that could be answered from existing user information on the platform were prefilled automatically.
Summary menu
- Enhanced the text to better clarify the benefits of completing the assessment.
- Kept the explanation of how information will be used visible, minimizing the chance of it being overlooked.
Success screen
- Increased the prominence of the main next step to make it more noticeable.
- Enhanced the description to clarify all the next steps and the benefits of completing the assessment.
Next Steps
Unfortunately, the project was deprioritized and never launched, but our work was showcased in client meetings to highlight our consumer-grade design and critical thinking
- With the second version, we planned another round of testing to confirm the effectiveness of our changes. When the project resumes, this will be the first step.
- Early on, we ideated ways to encourage completion of the assessment from outside it, but since the request covered only the assessment itself, we left those ideas for a future phase.
- We aimed to explain the reason behind each sensitive question to enhance transparency, build trust, and increase completion rates. With more supporting data and resources, this would be a valuable improvement to pursue.
What I Learned
Developing a plan before tackling the problem to minimize project friction
- Without a clear strategy, the project would have been chaotic, with wasted time revisiting earlier stages due to issues found later.
- Documenting agreements is helpful for reviewing the reasons behind decisions if needed later.
- Even if stakeholders have a clear vision of the final product, clearly communicating the rationale behind decisions helps them understand the reasoning, reconsider their opinions if necessary, and provide valuable insights for improvement.