User testing

Carena’s patient-facing portal had only been tested internally, mainly for debugging purposes. How patients experienced the intake process, and why they made the decisions they did, had gone entirely unaddressed.


The Opportunity

Knowing that we lacked qualitative data on how users were interacting with and feeling about the patient intake, I saw an opportunity to make a quick impact at low cost and arrive at a baseline understanding of how patients used our product.


Goals

The primary goal was to create a baseline understanding of how patients used and perceived the patient intake portal so that I could make recommendations for future iterations. As a company, we would be able to start using qualitative data to make informed decisions.

My role

My role spanned the entire project, from the initial proposal to recruiting participants, through conducting the testing and presenting my findings to leadership.


Methodology

I started by looking at all of our internal demographic data to learn who used telemedicine the most. I also checked Google Analytics to see how, when, and where conversions (defined as completed virtual visits) happened. I identified where we were seeing the most drop-off and chose to include that in my testing scenarios.

Next, I created a usability test plan to define the test objectives and methodology, and wrote user-test scenarios based on what I had learned from Google Analytics. Once the plan, script, and scenarios were finished, I took them to legal to make sure everything was compliant. Part of the recruitment process was offering $20 gift cards. One of the biggest considerations was that the incentive could not be seen as “bribing” testers to use a specific health system, so to avoid that issue I ran the sessions on our internal testing site, “Your Health System”.

After finding a suitable semi-public location, I set up my testing site. With $20 gift cards in hand, I lined up 7 participants matching the gender split of our usage data, roughly 70% of whom were women. I had them sign consent forms and recorded their screen interactions and voices.

Each test took about 20–30 minutes, and as moderator I asked questions about each participant’s thought process and why they made the choices they did.

 

Wow, I never even thought about how scrolling attitudes were different on mobile.
— Senior developer

Testing results

Sifting through my notes and recordings, I identified emerging themes among the testers. I compiled the data into a presentation with recommendations for changes, informed both by the testing results and by knowledge of best practices.


Next steps

After the testing, I compiled my findings into a presentation that was shown to Leadership, the Dev team, Client Operations, and the Provider and admin teams. I included my recommendations and used them as a basis for intermediate changes to the patient portal experience as well as for future iterations.