LATVIAN STATE CHANCELLERY
User testing the new unified government platform
The Latvian Government created a centralised platform for all government agency websites. Along with the platform, they wanted to modernise and standardise the look and feel of all government websites.
Context
Each government website looks different: each has its own design, information architecture, and way of being hosted and maintained. This creates inefficiencies in maintenance and undermines the overall cohesiveness of the government's web presence.
Objective
Create a unified user experience across all government websites that would have a consistent and responsive design language and an information architecture based on user research and testing.
Challenges
We didn't have much information on who the users were or why they visited: mostly Google Analytics data and task completion times for a few of the websites
Finding a unified information architecture that would suit both websites with just a few pages (like a small regional municipality) and ones with many pages (like the Department of Labour)
Creating tasks broad enough to give us insights across all government websites, yet precise enough to show how people interact with specific features
Recruiting enough test participants from a user base broad enough to cover most use cases
Remote organisation of user tests, especially for mobile
The time constraint of 1 month for planning, preparing the prototype, recruitment (done by the client), testing, analysis and sharing findings
Scope
For this project, I was tasked with 2 main duties:
Building a prototype from a design that was being developed in parallel
Leading the user testing efforts
My role
I was tasked with planning, preparing, and leading the second round of user testing. The first round was carried out by a consultancy firm that did the preliminary research; our job was to create a prototype and test it with real people. Some of my responsibilities:
Create a plan for the second phase of UX research
Create the prototype that would be used for testing
Prepare documents explaining what research activities to do and how to do them
Consult on organising user tests and focus groups
Help in analysing research results and extracting insights
Present research findings to the advisory board
Building the prototype
I took designs done by an agency and built a clickable prototype that matched our test scenarios.
Testing needed to cover both the desktop website and the mobile version, so I created a prototype that supported 8 different user journeys for use in user testing.
A continuous challenge was to adapt the design content to make the scenarios and tasks feel realistic.
Finding appropriate content on the existing websites was challenging at times in itself, even before the user tests reflected a similar sentiment.
Research goals
To know what to look for, you have to set out a target. In this case there were 3 main topics of research.
Test the Information Architecture
Test the new site structure to see how easily users can find the content behind some of the most common search queries, which were gathered from an earlier user survey.
Interest in new features
We were also introducing new functionality, such as side-scrolling on mobile devices and a chatbot meant to help customer service staff, since users often call one number only to be redirected to someone else.
Consolidate the new brand
With the new branding, users were able to tell which government website they were on 100% of the time, because each agency had its name and logo at the top centre of the page.
100% of users could: Find the name of the agency whose website they were on.
Technically, everybody is the user
We did not have any personas, so we turned to different municipalities and agencies for information about their users. We found that there are many kinds of users, with objectives that differ depending on the type of government agency in question.
Asking for help to do it right
Based on feedback from different agencies, we decided to cast a wide net and created a survey through which people could sign up to participate in the study.
A total of 40 users signed up, of whom 20 were selected.
We had 1 week to recruit people and 1 week to coordinate test sessions for the 3rd week. We then selected users based on how frequently they visit government websites (at least one visit in the last month) and on their current profession.
40% of the participants were government employees working at different levels, with either a lot or very little exposure to actual users seeking help.
Another 40% were private-sector workers, also at different levels, from customer support to managerial positions.
The last 20% were students, unemployed people, and seniors.
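To make the selection concrete, here is a minimal sketch of how a quota-based selection like this could be scripted, assuming hypothetical sign-up records with a self-reported occupation and the date of the last government-website visit. The field names and the select_participants helper are illustrative only, not part of the actual recruitment tooling (recruitment was done by the client).

```python
from datetime import date, timedelta
import random

# Hypothetical sign-up records from the recruitment survey.
signups = [
    {"name": "P01", "occupation": "government", "last_gov_visit": date(2021, 3, 10)},
    {"name": "P02", "occupation": "private", "last_gov_visit": date(2021, 2, 1)},
    {"name": "P03", "occupation": "student", "last_gov_visit": date(2021, 3, 20)},
    # ... 40 sign-ups in total
]

# Target split for 20 participants: 40% government employees,
# 40% private-sector workers, 20% students / unemployed / seniors.
QUOTAS = {"government": 8, "private": 8, "other": 4}

def quota_group(person):
    """Map a self-reported occupation to one of the three quota groups."""
    if person["occupation"] in ("government", "private"):
        return person["occupation"]
    return "other"

def select_participants(signups, quotas, today=date(2021, 3, 25)):
    """Screen by recency of government-website use, then fill each quota."""
    # Screening criterion: visited a government website in the last month.
    eligible = [p for p in signups
                if today - p["last_gov_visit"] <= timedelta(days=31)]
    random.shuffle(eligible)

    selected, remaining = [], dict(quotas)
    for person in eligible:
        group = quota_group(person)
        if remaining.get(group, 0) > 0:
            selected.append(person)
            remaining[group] -= 1
    return selected

participants = select_participants(signups, QUOTAS)
print(f"Selected {len(participants)} of {len(signups)} sign-ups")
```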
Only 20% of users agreed that current government websites have been useful.
In-person user testing
As the facilitator, I explained how the test would be organised. I followed a script I had prepared to make clear that we were testing the platform, not the user, and to encourage participants to think aloud.
Users answered a short pre-test survey so we could learn more about their internet usage habits:
If you can't find information about something, would you rather call the company, use the Internet or write to the company to get the answer?
Has the information on the government website you visited been enough to answer the question you had?
And other similar questions.
Tasks
Instead of being asked to find a specific piece of navigation, participants were instructed to complete real-world scenarios like:
“What would you do if you can't find what you are looking for?” This was to understand if people would be willing to use the chatbot or would default to finding the contacts section.
“You are looking to renew your passport, how would you accomplish this?” Usually, this would be a regular text page but the new website would allow you to apply for an appointment online.
90% of the time: People call to get more information instead of looking at the website.
Post-test survey
In the end, users participated in a post-test survey to help us better understand their experience and get a general sentiment about the direction we were heading. Some insights:
Liked that the menu is on the left side, not the top
Understandable navigation
Easy to find frequently needed information
Design workshop
Only 20% of stakeholders found current websites useful.
The purpose of the design workshop was to bring together the stakeholders who would be in charge of managing the new platform and to gather as many different perspectives on the necessary functionality as possible. These are the same people who currently manage content for government websites.
Organisation
We gathered 15 people and split them into groups of 3: small enough that reaching consensus within a group would be easier, while giving us more groups and therefore more possible insights.
To start, we set constraints to let people know what we expected of them:
What are we here for? To get a perspective on the necessary functionality.
To what extent are we doing that? Just on an idea level, to understand which parts of current functionalities are being utilised and to find what is missing.
What do we expect from each group? A consensus on what they want.
Finding consensus
After introductions, we split people into groups and iteratively asked each group to come up with ideas, present them, and vote on the top ideas, which were then iterated on like this (a small sketch of the vote-tallying step follows the list):
Individually write down problems with the current site
Each presents their case
Everyone votes on top ideas
Each group gets a problem to solve
The group presents their solutions
Everyone votes on the best solutions
Solutions are taken by other groups and iterated on
Changes are presented
Everyone votes
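As a small illustration of the voting steps, the sketch below tallies votes and picks the top ideas to carry into the next round. The idea names and vote counts are made up for illustration; they are not the actual workshop results.

```python
from collections import Counter

# Hypothetical votes: each entry is the idea a participant voted for.
votes = [
    "better search", "better search", "clear contacts page",
    "services on the front page", "better search",
    "clear contacts page", "mobile-friendly navigation",
]

def top_ideas(votes, n=3):
    """Tally the votes and return the n ideas with the most votes."""
    return Counter(votes).most_common(n)

# Ideas that advance to the next round of iteration.
for idea, count in top_ideas(votes):
    print(f"{count} votes: {idea}")
```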
User testing insights
The overall sentiment was very positive, and the new design approach seemed to resonate well with users.
Takeaways
Users appreciated that the new design has more white space and consistency throughout all subpages as it helps with legibility.
Current government websites can be very colourful and can lead users to unexpected other sites. In the new design, people were able to locate themselves well, and for the most part the navigation seemed to make sense.
Most users were able to perform the tasks without errors, although some did make quite a few mistakes, mostly on the mobile website.
Other findings
The originally suggested information architecture was not sufficient for larger sites and left people very confused, as the second-level navigation had 11 entries
Longer lists of documents or search results lacked some sorting and filtering options
Buttons were confusing due to a lack of consistency in the prototype
Because of the government branding, some links were ignored
Based on analytics and the pre-test survey, “Contacts” is the most used page mainly because people find it more convenient to talk with a person.
About 70% of users on mobile understood how side-scrolling works but mentioned that it should be easier to understand
Users did not want to use the chatbot, stating that it usually gives an automated response and you can never know when a real person will answer
Workshop insights
This exercise allowed us to get together and align with different stakeholders.
It was very valuable to hear what they saw as issues that we could address in the second iteration of design and testing.
Some of the main addressable points were:
Show each agency’s competencies. People should be able to tell from the home page whether they can find what they are looking for.
A powerful search feature is a must-have. It is an often-used feature.
Services provided by the government should also be findable on the front page (similar to gov.uk).
Contacts should be easy to find and in the same place for all websites.
Take a mobile-first approach, because traffic from mobile devices is increasing steadily.
An adjustable background image so each website can personalise its home page.
Design proposals
There are 3 different types of government agencies, each with different needs. After user testing, it was determined that only two types of information architecture need to be supported: one for large sites and one for small sites.
Content should be left to each agency with shared guidelines and some flexibility in what types of materials can be used. Some users enjoyed the customisation part, theming their sites for special events and seasons.
Final thoughts
User testing always shows insights that project stakeholders might have missed.
Testing with users is valuable
Taking the time to create a user research plan and sharing it with stakeholders helps keep everyone focused on the same goal. It also lets everyone experience and learn about the process. In the end, it provides clarity for any new people entering the project and builds consensus amongst the project group.
New behaviours can be tricky
Testing new features or behaviours is very useful because you not only learn whether users understand them, but also gather factual evidence that trumps stakeholder assumptions.
Continuous delivery is great
Regular deliverables keep things on track and increase accountability amongst all participants. However, having too many deliverables in a short time frame can shift the focus from a user-centric approach to a client-centric one.
Anyone can have valuable insights
Involving stakeholders in all steps of research helps raise the product IQ and yields better insights on the questions where the knowledge is held solely by the client.