Case Study: Boosting a 36% success rate up to 60%
Summary
Overview
I led a usability overhaul for Georgia’s Department of Revenue website, focusing on information architecture.
Through card sorting, tree testing, and first-click analysis, we discovered that users were ignoring the section the client assumed was serving as an answer repository for support desk staff.
I tested again after eliminating the ignored section and redistributing its information to the places users were actually looking for it.
After restructuring the menu and validating changes with real staff, support desk task-success rates jumped from 36% to 60%.
Research
What was the issue to be solved?
The client served a wide variety of people, yet even internal office workers struggled to find the policies they needed to answer website visitors' questions. Because of the breadth of services provided, the site also housed thousands of PDFs.
We wanted to know:
What kept website visitors from completing tasks without calling the office?
Where were internal office workers struggling?
How confident were both sets of users in getting things done using the website?
Initially, the client wanted to know what could be done about the Rules & Policies section of the website, but I zoomed out to include the whole site, to see whether the Rules & Policies section might be a symptom of a larger cause.
My Role and Responsibilities
Key Responsibilities
I served as Lead UX Researcher guiding the full discovery-to-validation process by:
Planning and conducting generative and evaluative research.
Running tree testing exercises to find out where users were getting lost.
Running card sorting to refine the navigation using the users' own language.
Analyzing heat-map and first-click data to examine behavior.
Partnering with our designer to wireframe and implement design changes.
Leading usability retesting to confirm improvements and measure success.
Methods & Process
Process Overview
Heuristic Evaluation: I started here to identify any low-hanging fruit that could be fixed right away.
Analytics Review: I reviewed site analytics to determine where research should begin.
Talking with Stakeholders: We identified that the client's site served three different audiences: general citizens who pay taxes and own cars, business owners, and the client's internal employees. Although we were focused on why support desk users were not finding niche information, we didn't want any site re-organization to negatively affect the other audiences.
Recruiting: We relied on a mix of internal users provided by the client and a purchased test group of general residents and business owners within the state of Georgia.
Vetting Purchased Recruits: Whenever an individual tester's results showed a combination of unusually short answer times and incorrect answers, we marked those results as "junk data" and received replacement testers from Optimal Workshop (see the sketch after this list).
Card Sorting (Tool: OptimalSort): We discovered that users grouped content by broad categories, and that internal users did not think to create a label for the niche policies they might need to look up.
Tree Testing (Tool: Treejack): In round one we tested where users looked for answers within the current site navigation.
Tree Testing: In round two we tested where users looked for answers when we eliminated the ignored section and re-distributed all niche policies under the broader categories.
First-Click Testing / Heat Maps: In round one, these revealed that the landing pages under the broad sections were too long, and that users chose answers higher on the page even when they were incorrect.
Re-testing: This included new recruits for each segment (car owners, business owners, and internal support desk users), plus more tree testing and first-click testing.
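For transparency, here is a minimal sketch of how that junk-data rule could be applied to a tester export. The time threshold, tester IDs, and records below are hypothetical illustrations, not the study's actual criteria or data.

```python
# Hypothetical sketch of the vetting rule: flag a purchased recruit when an
# unusually short answer time coincides with an incorrect answer.
FAST_SECONDS = 5  # assumed cutoff for "unusually short"; not the study's real value

results = [
    {"tester": "p01", "seconds": 3.1, "correct": False},
    {"tester": "p02", "seconds": 41.0, "correct": True},
    {"tester": "p03", "seconds": 2.4, "correct": False},
    {"tester": "p04", "seconds": 4.0, "correct": True},  # fast but correct, so kept
]

junk = [r["tester"] for r in results
        if r["seconds"] < FAST_SECONDS and not r["correct"]]
print("request replacements for:", junk)  # -> ['p01', 'p03']
```

Requiring both signals matters: a fast correct answer can simply mean a confident tester, so speed alone was never grounds for exclusion.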
Heuristic Spreadsheet Sample
Card Sorting Results
What we learned about the client’s main menu categories
What I love about dendrogram charts is that they perfectly illustrate why UX analytics are a bit of a psychological art form, requiring interpretation and compromise.
This screen grab shows the categories that 60% of users agreed on. Set the agreement threshold any higher and the website would end up with too many categories instead of solving the problem.
Ultimately, the recommended menu labels for the site turned into:
Taxes, Motor Vehicles, Alcohol and Tobacco, Unclaimed Property, Local Services, About Us
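To make the 60% threshold concrete, here is a minimal sketch of how cutting a card-sort dendrogram at that agreement level yields category groupings. The card names and pairwise agreement values are hypothetical, not the study's data.

```python
# Hypothetical sketch: cut a card-sort dendrogram at a 60% agreement threshold.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

cards = ["Income Tax", "Sales Tax", "Vehicle Titles", "Tag Renewals", "Liquor Licenses"]
# agree[i][j] = share of participants who sorted cards i and j into the same group
agree = np.array([
    [1.0, 0.8, 0.1, 0.1, 0.2],
    [0.8, 1.0, 0.1, 0.1, 0.3],
    [0.1, 0.1, 1.0, 0.9, 0.1],
    [0.1, 0.1, 0.9, 1.0, 0.1],
    [0.2, 0.3, 0.1, 0.1, 1.0],
])

# Agreement is a similarity, so cluster on (1 - agreement) as the distance.
tree = linkage(squareform(1.0 - agree, checks=False), method="average")

# Cutting at 60% agreement means a distance threshold of 0.40.
labels = fcluster(tree, t=0.40, criterion="distance")
for card, group in zip(cards, labels):
    print(group, card)
```

Raising the threshold toward 100% splits the tree into more, smaller groups, which is exactly the "too many categories" failure mode described above.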
Tree Test User Paths
Where were users getting lost searching for pages?
We can also look at all the paths people took.
You'll notice from all the red and orange in the pie chart in the left corner of the image below that a lot of people answered incorrectly.
But the thick green line reveals something more telling: a lot of people were on the right path at some point during their journey, yet something misled them into choosing an incorrect answer.
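As a rough illustration of how those pie slices come about, here is a minimal sketch that classifies tree-test paths as direct or indirect successes and failures. The task, correct destination, and participant paths are hypothetical stand-ins for the real Treejack export.

```python
# Hypothetical sketch: classify tree-test paths the way a results pie chart does.
from collections import Counter

CORRECT_DESTINATION = "Titles"  # assumed correct answer for one sample task

paths = [
    ["Home", "Motor Vehicles", "Titles"],                    # direct success
    ["Home", "Taxes", "Home", "Motor Vehicles", "Titles"],   # indirect success
    ["Home", "Rules & Policies", "Bulletins"],               # direct failure
    ["Home", "Motor Vehicles", "Home", "Rules & Policies"],  # indirect failure
]

def classify(path: list[str]) -> str:
    succeeded = path[-1] == CORRECT_DESTINATION
    # Any revisited node means the participant backtracked at some point.
    direct = len(path) == len(set(path))
    return ("direct " if direct else "indirect ") + ("success" if succeeded else "failure")

print(Counter(classify(p) for p in paths))
```

The "indirect failure" bucket is the interesting one here: it captures exactly the behavior described above, where participants touched the right branch but ended somewhere wrong.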
Heat Map Layout Check
Where were users getting lost on individual pages?
In testing the site, we needed to also find out where users were getting lost at the page level.
We did this using a “first click test.”
In this test, we marked every area a person could click that would lead them either directly to a correct answer or one level deeper toward it.
The heat map image below shows a small section of a page where people clicked to find answers, although the full report would show us every click on the page.
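For the curious, scoring a first-click test comes down to a simple hit test: did each first click land inside any region marked as leading toward the correct answer? The coordinates and target regions below are hypothetical, not taken from the study.

```python
# Hypothetical sketch: score first clicks against marked "correct" regions.
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

# Page areas that lead directly (or one level deeper) to a correct answer.
targets = [Rect(40, 120, 300, 48), Rect(40, 1300, 300, 48)]

# Each participant's first click on the page, as (x, y) pixels.
first_clicks = [(120, 140), (90, 300), (200, 1320), (60, 700)]

hits = sum(any(t.contains(x, y) for t in targets) for x, y in first_clicks)
print(f"first-click success: {hits}/{len(first_clicks)} = {hits / len(first_clicks):.0%}")
```

Plotting the misses is also how the "pages are too long" finding shows up: correct targets buried far down the page collect very few first clicks.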
Round One: Key Takeaways
Main Points to Address in Round Two Testing
One of the most fascinating discoveries I mentioned in the summary: we were originally asked to focus on only one section, and it turned out everyone was skipping over that section completely.
Through testing, users also told us they look for “details,” but their test behavior showed they actually clicked on “general” labels.
So we tested again with a new menu information architecture catering to the three audiences.
The team’s Lead Content Strategist used my initial test results and created a new proposed menu structure pictured below:
New Page Layout Suggestions:
The new page layout suggestions were conceived by our Lead UX Designer.
We structured pages so that two steps carried users from the home page to the next stage of their journey.
These were the Category and Subject pages.
Then two more steps carried them on to complete a task.
These were the Service and Details pages.
Round Two: Did Changes Make a Difference? Yes!
Suggestions worked best for the Internal Group (our target group)
By eliminating the Rules & Policies section no one was visiting and redistributing all of the bulletins to the categories where users were looking for them, we produced a significant 24-percentage-point increase in users who took a direct path to complete tasks successfully.
This can be seen by looking at the green paths in the images below:
Bar Chart Comparison
The light blue bars represent the average success scores for round 1.
And the dark blue bars represent the average success scores for round 2.
Notice in the second set of bars the dramatic 24-point jump between the round 1 and round 2 success scores for the Internal audience, boosting an extremely low 36% success rate to 60%. The Internal audience is the same audience that triggered the usability testing, and the changes had a negligible impact on the other two audiences.
Evidence-based Recommendations
Eliminate the Rules & Policies Section
We re-distributed information to the places where users were actually looking for it, instead of where the agency assumed they would look.
Shorten the Main Menu from 8 Items to 6
The new main menu items: Taxes, Motor Vehicles, Alcohol & Tobacco, Unclaimed Property, Local Services, About The “Client”
Simplify User Journeys
Shorten the number of clicks required to find a desired page.
Examine New Content Before Creating New Pages
Is this task related?
If yes, can it be added to existing instructions or should it be a new content page?
Did this question come directly from a user?
Can you start with content that exists and find out what’s keeping it from answering the user’s need satisfactorily?
Organize Pages in a Common Flow
Categories guide the user to several appropriate subjects
Subject pages can point the user to several services
Service pages can help a user complete a task or direct the user to more details
Detail pages can help a user complete a task or read the most specific information on a topic
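To make that flow concrete, here is a minimal sketch of the four-level hierarchy with hypothetical page names, plus a check that nothing sits more than four clicks from the home page.

```python
# Hypothetical sketch of the Category > Subject > Service > Details flow.
site = {
    "Motor Vehicles": {                       # Category
        "Titles": {                           # Subject
            "Transfer a Title": {             # Service
                "Title Transfer Details": {}  # Details
            }
        }
    }
}

def max_depth(node: dict) -> int:
    """Clicks from the home page to the deepest page in the tree."""
    if not node:
        return 0
    return 1 + max(max_depth(child) for child in node.values())

assert max_depth(site) <= 4  # every answer stays at most four clicks from home
print("deepest page:", max_depth(site), "clicks from home")
```

This is the same "at most four clicks" property reported in the results below.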
Shorten Pages Where You Can
Users found things more easily on shorter pages.
Emphasize the most important links with an icon or button and place them near the top of the page.
Use graphs to shorten content if feasible.
Complex content like step-by-step instructions can be shortened with the judicious use of accordions.
Results & Impact
Success rates up, and tasks completed faster
The task success rate increased from 36% to 60% in the target group without impacting the other user groups who also use the website.
Overly long landing pages were re-organized so that any answer would be at most four clicks away on a complex website serving three audiences.
Reducing top-level navigation choices and re-distributing content immediately improved task-completion speed.
Challenges & Trade-offs
Not a true A/B test
Because multiple top-level menu items were removed after round one of testing, the information architecture itself changed, so this was not a true A/B test.
Instead, we measured success using identical task sets and the same participant demographics in both testing rounds.
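As a final illustration, comparing rounds with identical task sets reduces to grouping results by round and computing the share of successes. The records below are hypothetical placeholders, not the study's raw data.

```python
# Hypothetical sketch: compare success rates across two rounds of identical tasks.
from collections import defaultdict

attempts = [
    {"round": 1, "task": "find a niche bulletin", "success": False},
    {"round": 1, "task": "renew a vehicle tag", "success": True},
    {"round": 2, "task": "find a niche bulletin", "success": True},
    {"round": 2, "task": "renew a vehicle tag", "success": True},
]

totals = defaultdict(lambda: [0, 0])  # round -> [successes, attempts]
for a in attempts:
    totals[a["round"]][0] += a["success"]
    totals[a["round"]][1] += 1

for rnd in sorted(totals):
    ok, n = totals[rnd]
    print(f"round {rnd}: {ok}/{n} = {ok / n:.0%}")
```

Holding the task set and participant demographics constant is what makes the round-over-round comparison meaningful despite the changed architecture.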
Reflection
I’m proud of our team
I'm proud of the research I did alongside a team of content editors and designers. We reached our end goal of making rules and policies easier for the appropriate audience to find, with the bonus of giving the site editors a solid framework for when and where to add new content and a process for evaluating that content going forward.