Usability testing allows you to observe firsthand how users interact with your product. You can identify any challenges they encounter and understand the 'why' behind their behaviour. This improves user experience and can prevent costly redesigns later on.
Continuously improve services to meet the needs of users. GovStack applications may involve users in several different roles, affiliated with several different organizations.
Understand needs and requirements
Involve others in the design process
Cloud-based analytics solutions, such as Google Analytics, are straightforward to set up and offer robust data about your users' behaviour. They are a good choice if you want to get started quickly without much technical setup.
Setup: Begin by creating an account on the platform of your choice, then add the tracking code it provides to your website. Ensure this code is embedded on each page you wish to monitor.
Configuration: Within the platform, you'll set up goals or events that align with your Key Performance Indicators (KPIs). This allows the software to track specific user actions that are of interest to your business.
Monitoring: Once the setup is complete, you can start monitoring user behavior data through the platform's dashboard.
Remember to respect user privacy throughout this process, which involves informing users about the data you collect, why you collect it, and offering an option to opt out.
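As a rough illustration of the configuration step, the sketch below sends a KPI-aligned custom event with Google Analytics 4's gtag.js and only enables analytics storage after the user consents. The event name, parameter, and consent handler are illustrative assumptions, not a prescribed setup.

```typescript
// Minimal sketch: sending a KPI-aligned custom event with Google Analytics 4 (gtag.js).
// Assumes the standard gtag.js snippet is already embedded on the page.
// The event name "application_submitted" and its parameter are illustrative.

declare function gtag(...args: unknown[]): void;

// Respect user privacy: only enable analytics storage once the user has consented.
function onUserConsented(): void {
  gtag('consent', 'update', { analytics_storage: 'granted' });
}

// Track a specific user action that corresponds to one of your KPIs.
function trackApplicationSubmitted(serviceName: string): void {
  gtag('event', 'application_submitted', {
    service_name: serviceName, // illustrative custom parameter
  });
}

// Example usage after a form submission succeeds:
// trackApplicationSubmitted('birth-registration');
```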
Self-hosted analytics platforms, such as the open-source Matomo or Plausible, offer more control over your data and are often favoured by organisations that place a high emphasis on data privacy and security. This is particularly important if you need to keep data in-country due to regulatory or compliance requirements.
Setup: You'll need to set up these platforms on a server, which could be owned by you or rented from a hosting provider. After this, you'll add the platform-specific tracking code to your website.
Configuration: As with cloud-based solutions, you will need to define the events or goals that align with your KPIs.
Monitoring: Once configured, you can use the platform's dashboard to track user behavior and monitor your KPIs.
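If you choose Matomo, the sketch below shows what tracking a KPI-aligned event might look like, assuming the standard Matomo JavaScript tracker snippet is already embedded on your pages. The event category, action, and task name are illustrative.

```typescript
// Minimal sketch: tracking a KPI-aligned event with a self-hosted Matomo instance.
// Assumes the standard Matomo JavaScript tracker snippet is already on the page,
// which defines the global _paq command queue. Category, action, and task names
// are illustrative.

declare const _paq: unknown[][];

// Record that a user completed a key task in the service.
function trackTaskCompleted(taskName: string): void {
  _paq.push(['trackEvent', 'tasks', 'completed', taskName]);
}

// Example usage:
// trackTaskCompleted('renew-licence');
```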
Server space considerations for self-hosting depend on several factors, including the amount of traffic your website receives and the level of detail in the data you are tracking. As a starting point, 1GB of space could handle over a million simple page views, but more complex tracking would reduce this. Consulting with a server or IT professional could provide a more accurate estimate based on your specific needs.
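As a back-of-envelope illustration of that sizing guidance, the sketch below estimates monthly storage from page views, assuming roughly 1 KB of stored data per simple page view; treat the figure as a rough starting point, not a quote from any particular platform.

```typescript
// Back-of-envelope sketch for estimating self-hosted analytics storage needs.
// Assumes roughly 1 KB of stored data per simple page view; more detailed
// event tracking will use more space, so adjust kbPerView accordingly.

function estimateStorageGb(pageViewsPerMonth: number, kbPerView = 1): number {
  const kbPerGb = 1024 * 1024;
  return (pageViewsPerMonth * kbPerView) / kbPerGb;
}

// e.g. 500,000 monthly views with richer tracking at ~2 KB per view:
// estimateStorageGb(500_000, 2) ≈ 0.95 GB of new data per month
```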
It is also possible to start with a cloud-based solution for quick setup and immediate insights, then transition to a self-hosted solution once it is ready. This allows you to benefit from analytics data right away while your more robust, privacy-centric solution is being prepared.
Find opportunities to collaborate closely with users, stakeholders, and others throughout the design process.
Empower users to take an active role in co-creating and co-designing public services.
Involve stakeholders early on to understand their expectations and objectives.
Get feedback from stakeholders to review and comment on design decisions and findings.
Conduct workshops or brainstorming sessions for diverse input.
Hold peer reviews to get the perspectives of different roles on the design.
Emphasise user needs and project goals within the team.
Carry out user research activities, like interviews and surveys, to understand user needs.
Hold co-design sessions with users so they can participate directly in the design process.
Conduct usability testing with real users to identify issues and opportunities for improvement, both pre-launch and post-launch.
Start by understanding the needs and requirements of the solution, including users' needs, expectations, and pain points. Consider that the "Person" and the "Role" are not the same. For example, the same person may use a healthcare application both as a doctor and as a patient, but the needs of a doctor's UI/UX differ from those of a patient: the doctor may work with the data of multiple patients, while a patient can access only their own data. You can find more examples of how to understand user needs in the implementation playbook.
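To make the person-versus-role distinction concrete, the sketch below models how access to a record could depend on the active role rather than the person. The types and access rule are illustrative assumptions, not a prescribed GovStack data model.

```typescript
// Illustrative sketch of the "person vs. role" distinction: the same person may
// act as a doctor or as a patient, and what they can access depends on the role
// they are currently using.

type Role = 'doctor' | 'patient';

interface ActiveUser {
  personId: string;
  activeRole: Role;
  assignedPatientIds?: string[]; // patients this user treats when acting as a doctor
}

function canAccessRecord(user: ActiveUser, recordOwnerId: string): boolean {
  if (user.activeRole === 'patient') {
    // A patient can access only their own data.
    return recordOwnerId === user.personId;
  }
  // A doctor can access records of the patients assigned to them.
  return user.assignedPatientIds?.includes(recordOwnerId) ?? false;
}

// The same person, different roles, different access:
// canAccessRecord({ personId: 'p1', activeRole: 'patient' }, 'p1');                            // true
// canAccessRecord({ personId: 'p1', activeRole: 'doctor', assignedPatientIds: ['p2'] }, 'p2'); // true
```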
Understanding user needs begins with user research. This includes techniques like surveys, interviews, user testing, and analysis of usage data. The goal is to understand the tasks users are trying to complete, the problems they face, and the goals and motivations of users in specific roles.
Always question assumptions about what users need in specific roles. Just because something is commonly done or seems like a good idea doesn't mean it is what users need. Validate every assumption with data.
Before jumping into solutions, make sure you have correctly framed the problem. Ask: What user need is this solving? Why is this a problem for our users? How do we know this?
Not all user needs are equally important. Use data from your research to prioritise features and improvements based on what users need most.
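One simple way to make that prioritisation explicit is to score each need by how widely and how severely it affects users, as in the illustrative sketch below; the scoring formula is an assumption, and your research data should drive the actual weighting.

```typescript
// Illustrative sketch of prioritising user needs with research data, scoring
// each need by how many users it affects and how severe it is. The formula is
// an assumption; replace it with whatever weighting your research supports.

interface UserNeed {
  description: string;
  usersAffected: number; // from surveys, interviews, or usage data
  severity: 1 | 2 | 3;   // 1 = minor friction, 3 = blocks task completion
}

function prioritise(needs: UserNeed[]): UserNeed[] {
  return [...needs].sort(
    (a, b) => b.usersAffected * b.severity - a.usersAffected * a.severity,
  );
}
```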
Share research findings with team members, senior members or strategic leaders, and even the general public whenever practical.
The aim is not just to share information but also to generate dialogue and collaborative action based on the findings.
Organise Your Findings
Begin by grouping your user insights, key takeaways, and suggestions, for example by theme, user group, or stage in the user journey.
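If your findings live in a spreadsheet or research repository, a small helper like the illustrative sketch below can group them by theme before the playback; the field names are assumptions about how findings are recorded.

```typescript
// Illustrative sketch: grouping research findings by theme ahead of a playback session.

interface Finding {
  theme: string;      // e.g. 'navigation', 'eligibility', 'payments'
  userGroup: string;  // e.g. 'applicant', 'case worker'
  insight: string;
}

function groupByTheme(findings: Finding[]): Map<string, Finding[]> {
  const groups = new Map<string, Finding[]>();
  for (const finding of findings) {
    const existing = groups.get(finding.theme) ?? [];
    groups.set(finding.theme, [...existing, finding]);
  }
  return groups;
}
```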
Create a Simple Presentation
Document your findings. Each slide could represent a key finding or insight. Remember to use clear, simple language and include visual aids where possible to increase understanding.
Schedule a Playback Session
Invite team members and stakeholders to a meeting where you'll share your findings. Make time for discussion.
Document and Share
Share the presentation along with any additional notes from the discussion. This ensures that everyone has access to the information and can refer back to it in the future.
You might even consider publishing findings openly through a blog or similar format.
Once the service is live, continuously monitor and evaluate its performance and usability, and iterate on the design to drive continuous improvement and optimise the user experience.
At a basic level, all services should be tracking metrics such as:
User Satisfaction - Overall, how satisfied are users with your service? This can be measured through surveys, feedback forms, or by analysing user behaviour (e.g., how often they return to your service).
Task Completion Rate - What percentage of users successfully complete the tasks they set out to do on your service?
Error Rates - How often do users encounter errors or difficulties when using your service?
In addition to these basics, each service will likely have specific KPIs relevant to its unique goals and user tasks. Identify what these are and how you can measure them.
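As a simple illustration, the sketch below computes two of the basic metrics above from event counts exported by your analytics tool; the input shape is an assumption.

```typescript
// Illustrative sketch: computing task completion rate and error rate from
// event counts exported by your analytics tool.

interface JourneyStats {
  started: number;   // users who began the task
  completed: number; // users who finished it
  errors: number;    // error events encountered along the way
}

function taskCompletionRate(stats: JourneyStats): number {
  return stats.started === 0 ? 0 : stats.completed / stats.started;
}

function errorRate(stats: JourneyStats): number {
  return stats.started === 0 ? 0 : stats.errors / stats.started;
}

// e.g. { started: 1200, completed: 930, errors: 210 }
// → completion rate ≈ 0.78, error rate ≈ 0.18
```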
Run a session with your team to create a performance framework.
Use analytics tools to collect data on these KPIs. Google Analytics is a popular choice, but there are many other tools available. Make sure to set up your analytics tool to track the specific actions and events that correspond to your KPIs.
Implement a mechanism to collect user satisfaction data at the end of key user journeys. This could be a survey or a simple rating system.
See the pattern for collecting feedback
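For example, a simple rating mechanism might post a score at the end of a journey to your own feedback endpoint, as in the sketch below; the endpoint path and payload shape are hypothetical.

```typescript
// Minimal sketch of submitting an end-of-journey satisfaction rating to your
// own feedback service. The endpoint path and payload shape are hypothetical.

interface SatisfactionRating {
  journey: string;          // e.g. 'apply-for-permit'
  score: 1 | 2 | 3 | 4 | 5;
  comment?: string;
}

async function submitRating(rating: SatisfactionRating): Promise<void> {
  await fetch('/api/feedback/satisfaction', { // hypothetical endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(rating),
  });
}

// Example usage at the end of a key journey:
// submitRating({ journey: 'apply-for-permit', score: 4 });
```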
Do not wait until the end of the journey to collect feedback. Implement feedback mechanisms at key points throughout the user journey. This could be feedback forms on individual pages, live chat options, or proactive prompts asking users if they need help.
See the pattern for collecting feedback
Set a regular schedule to review your KPIs and user feedback. Use this data to identify issues and opportunities for improvement, and take action accordingly.
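A lightweight way to support that regular review is to compare each KPI against its target and flag the ones that need attention, as in the illustrative sketch below; the KPI names, targets, and values are made up for illustration.

```typescript
// Illustrative sketch of a regular KPI review: compare each KPI's current value
// against its target and flag the ones that need attention.

interface KpiSnapshot {
  name: string;
  target: number;
  current: number;
}

function flagUnderperforming(kpis: KpiSnapshot[]): KpiSnapshot[] {
  return kpis.filter((kpi) => kpi.current < kpi.target);
}

// e.g. flagUnderperforming([
//   { name: 'Task completion rate', target: 0.8, current: 0.74 },
//   { name: 'User satisfaction (1-5)', target: 4.0, current: 4.2 },
// ]) → only the completion-rate KPI is flagged
```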