Case Study: Department of Revenue Site Usability Testing

Method

Process Overview

We identified that the DOR site served three distinct audiences:

  • General citizens who pay taxes and own cars

  • Business owners

  • DOR’s internal employees

That meant we would run 12 tests altogether: two test types, across three audiences, each with a before-and-after variation.

But first, we started with a heuristic evaluation against accepted UX practices to identify any low-hanging fruit that could be fixed right away.

We followed that up with site analytics to guide where our usability testing should begin.

Image of heuristic evaluation spreadsheet

Research

What was the issue to be solved?

Georgia’s Department of Revenue served a wide variety of people, but even internal office workers had trouble finding the policies they needed to answer website visitors’ questions. Because of the breadth of services provided, DOR had thousands of PDFs housed on its site.

We wanted to know:

  • What kept website visitors from completing tasks without calling the office?

  • Where were internal office workers struggling?

  • How confident were both sets of users in getting things done using the website?

Initially, the agency wanted to know what could be done about the Rules & Policies section of the website, but I zoomed out to include the whole site, since trouble in the Rules & Policies section might be a symptom of a larger problem.

Rules and Policies section highlighted in a chart of DOR's menu

Round One: Testing the Current Site

Card Sorting

Too often, content is structured around what makes sense to the agency rather than what makes sense to the users.

So we recruited users to tell us how they would categorize the site and what they would label those categories.

In the image below, a test participant drags a card over to the empty space on the right and assigns it a category name, or drags the card into a category they have already named.

Screen shot of a card sort test

Card Sorting Results

What we learned about DOR’s main menu categories

What I love about dendrogram charts is that they perfectly illustrate why UX analytics is a bit of a psychological art form, requiring interpretation and compromise.

This screen grab shows the categories that 60% of users agreed on.

If I moved the dotted line all the way to the left to require 100% agreement, we would be left with one card per category. Organizing the site into 30 categories would not be helpful.

Ultimately, the recommended menu labels for the site turned into:
Taxes, Motor Vehicles, Alcohol and Tobacco, Unclaimed Property, Local Gov’t Services, About DOR

A dendrogram chart
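To make the threshold idea concrete, here is a minimal sketch of how a card-sort dendrogram can be cut at a chosen agreement level. The cards, participants, and groupings are hypothetical stand-ins, not DOR’s actual card-sort data, and the clustering uses SciPy:

```python
# Sketch: cutting a card-sort dendrogram at an agreement threshold with SciPy.
# The cards and groupings are hypothetical, not the actual DOR card-sort data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

cards = ["Pay income tax", "Renew a tag", "Liquor license", "File sales tax"]

# sorts[p][i] = the pile participant p put card i into (three hypothetical participants).
sorts = [
    [0, 1, 2, 0],
    [0, 1, 2, 0],
    [0, 1, 1, 3],
]

n = len(cards)
# Agreement matrix: fraction of participants who grouped each pair of cards together.
agree = np.zeros((n, n))
for s in sorts:
    for i in range(n):
        for j in range(n):
            agree[i, j] += (s[i] == s[j])
agree /= len(sorts)

# Cluster on disagreement; the condensed upper triangle is what linkage expects.
tree = linkage((1.0 - agree)[np.triu_indices(n, k=1)], method="average")

# "Moving the dotted line": 60% agreement means a distance cutoff of 0.4.
labels = fcluster(tree, t=0.4, criterion="distance")
print(dict(zip(cards, labels)))  # the two tax cards share a category; the rest stand alone
```

Sliding `t` toward 0.0 is the same as dragging the dotted line left: categories only form where everyone agrees, and the site splinters into one card per category.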

Menu Tests

Tree Testing

Next, a tree test was set up to see whether users were (a) starting in the correct place and (b) ending on the page that would allow them to complete their task. The image below shows a small sample of what that looks like.

First, real-world tasks were created and the correct ending pages were selected.

Then users worked through the test, marking the page where they would expect to find the answer needed to complete each task.

Tree test example
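Stripped of visuals, a tree test is just a hierarchy, a task, and a set of correct destinations. The sketch below shows that core setup with a hypothetical slice of a site tree and a made-up task; it is an illustration, not the testing tool we used:

```python
# Minimal sketch of a tree-test setup; the tree slice and task are hypothetical,
# echoing menu labels from this study rather than reproducing the real test.

# The "tree" is the site's menu hierarchy, stripped of all visual design.
site_tree = {
    "Motor Vehicles": {
        "Titles & Registration": ["Where and When to Register Vehicles"],
        "Insurance": ["Report a Lapse"],
    },
    "Taxes": {
        "Individual": ["Check Refund Status"],
    },
}

# Each task pairs a real-world scenario with the path(s) that count as correct.
task = {
    "prompt": "You just bought a used car. Find out when you must register it.",
    "correct": {
        ("Motor Vehicles", "Titles & Registration",
         "Where and When to Register Vehicles"),
    },
}

def is_success(chosen_path, task):
    """Return True if the participant's final selection was a correct page."""
    return tuple(chosen_path) in task["correct"]

# A participant clicks down through the labels and lands on a leaf:
print(is_success(["Motor Vehicles", "Titles & Registration",
                  "Where and When to Register Vehicles"], task))  # True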

Examining Tree Tests

How quickly and how directly were users failing or succeeding at tasks?

Results from the test could show us:
1. False confidence - Going directly to an incorrect answer
2. True confidence - Going directly to a correct answer
3. Slow failure - Searching back and forth through the site only to ultimately choose an incorrect answer
4. Slow success - Searching back and forth through the site to eventually choose a correct answer

We can learn a lot by investigating all four of these situations. For example, when people go directly to an incorrect page, showing confidence in their answer, it’s worth looking at why.

Pie chart showing successes and directness of answers
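As a rough sketch of how each participant’s run reduces to one of those four outcomes: the final answer is either correct or not, and the path is either direct (no backtracking) or not. The function below is illustrative logic with hypothetical paths, not the report generated by our testing tool:

```python
# Sketch: reducing one participant's tree-test run to one of the four outcomes.
# "Direct" is defined here as never revisiting a node (i.e., no backtracking);
# real testing tools may define directness slightly differently.

def classify(path, correct_pages):
    correct = path[-1] in correct_pages
    direct = len(path) == len(set(path))  # in a tree, no repeats means a straight descent
    if direct and correct:
        return "true confidence"
    if direct and not correct:
        return "false confidence"
    if correct:
        return "slow success"
    return "slow failure"

print(classify(["Taxes", "Individual", "Check Refund Status"],
               {"Check Refund Status"}))                           # true confidence
print(classify(["Taxes", "Business", "Taxes", "Individual",
                "Check Refund Status"], {"Check Refund Status"}))  # slow success
```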

Tree Test User Paths

Where were users getting lost searching for pages?

We can also look at all the paths people took.

You’ll notice, from all the red and orange in the pie in the left corner of the image below, that a lot of people answered incorrectly.

But the thick green line reveals something more interesting: a lot of people were on the right path at some point during their journey, but something confused them or made them change their mind, leading them to choose an incorrect answer.

User paths branched out to correct and incorrect answers
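That pattern can be quantified. Assuming each result is simply the ordered list of menu nodes a participant visited (the paths below are hypothetical stand-ins, not our real data), counting the people who touched the correct branch but still ended wrong takes only a few lines:

```python
# Sketch: how many participants touched the correct branch yet ended on a wrong page?
# The paths and answer set below are hypothetical stand-ins, not the study's data.

paths = [
    ["Motor Vehicles", "Titles & Registration", "Where and When to Register Vehicles"],
    ["Motor Vehicles", "Titles & Registration", "Motor Vehicles",
     "Insurance", "Report a Lapse"],
    ["Taxes", "Individual", "Check Refund Status"],
]
on_track = {"Motor Vehicles", "Titles & Registration"}  # ancestors of the correct page
correct = {"Where and When to Register Vehicles"}

lost_it = sum(
    1 for p in paths
    if p[-1] not in correct and on_track & set(p)  # touched the right branch, ended wrong
)
print(f"{lost_it} of {len(paths)} participants found the right path but lost it")
```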

Heat Map Layout Check

Where were users getting lost on individual pages?

In testing the site, we also needed to find out where users were getting lost at the page level.

We did this using a “first click test.”
In this test, we marked every area a person could click that would lead, either directly or one level deeper, to a correct answer.

The heat map image below shows a small section of a page where people clicked to find answers, although the full report would show us every click on the page.

Heat map showing user clicks on a web page
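Under the hood, scoring a first-click test reduces to point-in-rectangle checks: each click either lands inside one of the marked regions or it does not. A minimal sketch with made-up coordinates and link names:

```python
# Sketch: scoring first clicks against marked "correct" regions.
# All coordinates and labels below are made up for illustration.

# Regions marked as correct: (left, top, right, bottom) in page pixels.
correct_regions = [
    (40, 120, 260, 160),   # e.g. a "Register a Vehicle" link near the top
    (40, 300, 260, 340),   # e.g. a related button lower on the page
]

clicks = [(150, 140), (500, 90), (100, 320)]  # one first click per participant

def hit(click, regions):
    x, y = click
    return any(l <= x <= r and t <= y <= b for (l, t, r, b) in regions)

hits = sum(hit(c, correct_regions) for c in clicks)
print(f"{hits}/{len(clicks)} first clicks landed on a correct target")  # 2/3
```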

Round One: Key Takeaways

Main Points to Address in Round Two Testing

As noted in the process overview, the DOR site served three distinct audiences:

  • General citizens who pay taxes and drive cars

  • Business owners

  • Internal employees helping citizens

One of the most fascinating discoveries involved the “Rules & Policies” section we were originally asked to focus on: it turned out everyone was skipping over that section completely!

Through testing, users also told us they look for “details,” but their test behavior showed they actually clicked on “general” labels.

So we tested again with a new menu information architecture catering to the three audiences.

New proposed site menu and audience segmentation chart

New Page Layout Suggestions

We structured pages so that it took two steps to get users off the home page and on to the next stage of their journey: the Category and Subject pages.

It then took two more steps to continue on and complete a task: the Service and Detail pages.

Round Two: Did Changes Make a Difference?

Suggestions worked best for the Internal Group

We eliminated the Rules & Policies section no one was visiting and re-distributed all of the bulletins to the categories where users were looking for them. This resulted in a significant uptick in users who took a direct path to complete tasks successfully.

Round 1 and 2 tree chart paths showing more correct answers in round two.

The bar chart below is a comparison of both rounds of tree testing. As mentioned, the three audiences we tested were a general audience, an internal audience, and a business audience.

The light blue bars represent the average success scores for round 1, and the dark blue bars represent the average success scores for round 2.

Notice that in the second set of bars there is a dramatic difference between the round 1 and round 2 scores for the internal audience. This especially supports the recommendation to integrate the former Rules & Policies section into the rest of the site.

Bar chart of changes in success rates across General, Internal and Business audiences
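For anyone reproducing this kind of comparison, a grouped bar chart like the one described takes only a few lines of Matplotlib. The numbers below are placeholders showing the shape of the chart, not the study’s actual success scores:

```python
# Sketch: a grouped bar chart comparing round 1 and round 2 success scores.
# The values below are placeholders, NOT the study's actual results.
import matplotlib.pyplot as plt
import numpy as np

audiences = ["General", "Internal", "Business"]
round1 = [0.55, 0.40, 0.60]   # placeholder average success rates, round 1
round2 = [0.65, 0.80, 0.70]   # placeholder average success rates, round 2

x = np.arange(len(audiences))
width = 0.35
fig, ax = plt.subplots()
ax.bar(x - width / 2, round1, width, label="Round 1", color="lightblue")
ax.bar(x + width / 2, round2, width, label="Round 2", color="darkblue")
ax.set_xticks(x)
ax.set_xticklabels(audiences)
ax.set_ylabel("Average task success rate")
ax.legend()
plt.show()
```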

Research-based Recommendations

Eliminate the Rules & Policies Section

We re-distributed information to the places where users were actually looking for it, instead of where the agency assumed they would look.

Shorten the Main Menu From 8 Items to 6

The new main menu items: Taxes, Motor Vehicles, Alcohol & Tobacco, Unclaimed Property, Local Gov’t Services, About DOR

Simplify User Journeys

Shorten the number of clicks required to find a desired page.
For example, the longer path:
Help > FAQ’s > General Registration Information > Where and When to Register Vehicles.
The shorter path:
Titles & Registration > Where and When to Register Vehicles.

Examine New Content Before Creating New Pages

New Content Question Set One:

  • Is this task-related?

  • If yes, can it be added to existing instructions or should it be a new content page?

New Content Question Set Two:

  • Did this question come directly from a user?

  • Can you start with content that exists and find out what’s keeping it from answering the user’s need satisfactorily?

Organize Pages in a Common Flow

  • Categories guide the user to several appropriate subjects

  • Subject pages can point the user to several services

  • Service pages can help a user complete a task or direct the user to more details

  • Detail pages can help a user complete a task or read the most specific information on a topic
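
One way to keep that common flow honest as editors add pages is to encode the four page types and their allowed links as data, so anything that skips a level stands out. A hypothetical sketch, with invented page names echoing the examples above:

```python
# Sketch: encoding the Category -> Subject -> Service -> Detail flow as data,
# so new pages can be checked against it (all page names are hypothetical).

ALLOWED_LINKS = {
    "category": {"subject"},
    "subject": {"service"},
    "service": {"detail"},   # service pages may also complete a task directly
    "detail": set(),         # detail pages are the end of the line
}

pages = {
    "Motor Vehicles": ("category", ["Titles & Registration"]),
    "Titles & Registration": ("subject", ["Register a Vehicle"]),
    "Register a Vehicle": ("service", ["Where and When to Register Vehicles"]),
    "Where and When to Register Vehicles": ("detail", []),
}

for name, (kind, links) in pages.items():
    for target in links:
        target_kind = pages[target][0]
        if target_kind not in ALLOWED_LINKS[kind]:
            print(f"{name} ({kind}) should not link to {target} ({target_kind})")
# Prints nothing here: this miniature site follows the flow at every level.
```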

Shorten Pages Where You Can

  • Users found things more easily on shorter pages.

  • Emphasize the most important links with an icon or button and place them near the top of the page.

  • Use graphs to shorten content if feasible.

  • Complex content like step-by-step instructions can be shortened with the judicious use of accordions.

Lessons Learned

Not a true A/B test

The first round of testing was more of an exploratory discovery, which resulted in our removing sections of the website and re-arranging the information architecture. Therefore, at the page level, the second round of testing was not a true “B” comparison test, because some of the “A” pages no longer existed.

What was good about that:

The results from the first round of testing did illuminate areas that needed a closer examination in the second round of testing.

What I would do differently on this project:

Given how dramatically the information was shifted and re-combined, I would first test the structure in a smaller “Round 0” pilot study. Then I would use that information to create solutions that could be tested in a true two-round A/B comparison with two larger groups.

I’m proud of the research I did alongside a team of content editors and designers. We reached our end goal of making rules and policies easier for the appropriate audience to find, with the bonus of giving the site editors a solid framework for when and where to add new content, and a process for evaluating that content going forward.

Want to know more?