3.1.3.2 Monitor performance

Once you're ready to go live, continuously monitor and evaluate the performance and usability of the service, and iterate the design accordingly to drive ongoing improvement and optimise the user experience.

Key performance indicators (KPIs)

Determine Must-Have KPIs

At a basic level, all services should track metrics such as the following (a sketch of how these might be computed appears after the list):

  • User Satisfaction - Overall, how satisfied are users with your service? This can be measured through surveys, feedback forms, or by analysing user behaviour (e.g., how often they return to your service).

  • Task Completion Rate - What percentage of users successfully complete the tasks they set out to do on your service?

  • Error Rates - How often do users encounter errors or difficulties when using your service?
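
Taken together, these must-have KPIs can be derived from a simple log of user events. The sketch below shows one illustrative way to compute them; the event shape and event names are assumptions for illustration, not part of the GovStack specification.

```typescript
// Hypothetical sketch: deriving must-have KPIs from a log of user events.
// The TaskEvent shape and event names are assumptions, not a GovStack schema.
interface TaskEvent {
  sessionId: string;
  type: "task_started" | "task_completed" | "error";
}

// Share of sessions that started a task and went on to complete it.
function taskCompletionRate(events: TaskEvent[]): number {
  const started = new Set(
    events.filter((e) => e.type === "task_started").map((e) => e.sessionId)
  );
  const completed = new Set(
    events.filter((e) => e.type === "task_completed").map((e) => e.sessionId)
  );
  // Only count completions for sessions that actually started the task.
  const finished = [...started].filter((id) => completed.has(id)).length;
  return started.size === 0 ? 0 : finished / started.size;
}

// Share of all recorded events that were errors.
function errorRate(events: TaskEvent[]): number {
  if (events.length === 0) return 0;
  return events.filter((e) => e.type === "error").length / events.length;
}
```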

Identify Service-Specific KPIs

In addition to these basics, each service will likely have specific KPIs relevant to its unique goals and user tasks. Identify what these are and how you can measure them.

Run a session with your team to create a performance framework.

Set Up Analytics

Use analytics tools to collect data on these KPIs. Google Analytics is a popular choice, but many other tools are available. Configure your analytics tool to track the specific actions and events that correspond to your KPIs.
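
For example, if Google Analytics 4 (gtag.js) is in use, a custom event tied to a KPI might be fired as in the sketch below. The event name and parameters are illustrative choices, not a prescribed schema.

```typescript
// Minimal sketch, assuming Google Analytics 4 (gtag.js) is already loaded
// on the page.
declare function gtag(
  command: "event",
  eventName: string,
  params?: Record<string, unknown>
): void;

// Fire a custom event when a user completes a key task, so the
// task-completion KPI can be tracked in the analytics tool.
function trackTaskCompleted(serviceName: string): void {
  gtag("event", "task_completed", { service_name: serviceName });
}

trackTaskCompleted("example-service"); // hypothetical service name
```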

Collect satisfaction data at the end of the user journey

Implement a mechanism to collect user satisfaction data at the end of key user journeys. This could be a survey or a simple rating system. See the pattern for collecting feedback.
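
A minimal sketch of recording such a rating, assuming a hypothetical endpoint and payload shape, might look like this:

```typescript
// Illustrative sketch of submitting an end-of-journey satisfaction rating.
// The endpoint and payload are assumptions, not a defined GovStack API.
async function submitSatisfaction(
  rating: 1 | 2 | 3 | 4 | 5,
  journey: string
): Promise<void> {
  await fetch("/api/feedback/satisfaction", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      rating,
      journey,
      submittedAt: new Date().toISOString(),
    }),
  });
}

// e.g. on the final confirmation page of a journey:
// await submitSatisfaction(4, "example-journey");
```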

Collect feedback throughout the user journey

Do not wait until the end of the journey to collect feedback. Implement feedback mechanisms at key points throughout the user journey. This could be feedback forms on individual pages, live chat options, or proactive prompts asking users if they need help. See the pattern for collecting feedback.
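
As one illustration of a proactive prompt, the sketch below asks users on a form page whether they need help after a period of inactivity; the delay and the help page are assumptions.

```typescript
// Hypothetical sketch of a proactive help prompt: if a user stays on a form
// page for a while without submitting, ask whether they need help.
// The one-minute delay and "/help" page are assumptions.
const PROMPT_DELAY_MS = 60_000;

const timer = window.setTimeout(() => {
  const needsHelp = window.confirm(
    "You have been on this page for a while. Do you need help?"
  );
  if (needsHelp) {
    window.location.assign("/help");
  }
}, PROMPT_DELAY_MS);

// Cancel the prompt once the user submits the form.
document.querySelector("form")?.addEventListener("submit", () => {
  window.clearTimeout(timer);
});
```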

Regularly review and act on data

Set a regular schedule to review your KPIs and user feedback. Use this data to identify issues and opportunities for improvement, and take action accordingly.
