Phase 3 - Test

Prototyping

After the initial creation of wireframes, our next step was to transform them into a functional Figma prototype. This dynamic prototype serves as a tangible representation of our design concept and allows us to simulate user interactions with the interface. Developing this prototype is a crucial phase in our design process, bridging the gap between static wireframes and a fully interactive user experience.

The Figma prototype grants us the ability to showcase not only the visual design but also the functional aspects of the Sandbox - Construction Permit Use Case interface. It enables us to create realistic user journeys, simulate user actions, and provide a hands-on experience for testing purposes.

Test Prototype with Users and Iterate Wireframes

User testing is a pivotal phase that follows the development of the Figma prototype. In this stage, we engage real users, bringing them into the design process to validate our assumptions and gain invaluable insights into their interaction with the interface.

In general, the user testing phase was a success: participants expressed satisfaction with the wireframes and the overall flow, found them efficient, and navigated the prototype with ease, encountering only minimal issues.

We'd like to extend our appreciation to our participants for their valuable insights and feedback during the testing phase.

After creating the first version of the wireframes, we built the Figma prototype and prepared for user testing. Testing brings feedback from real users into the equation, allowing us to validate our design assumptions and gain insight into how users interact with the interface. Through testing, usability issues, pain points, and misunderstandings were uncovered, enabling us to make informed refinements.

This iterative process not only enhances the product's user-friendliness but also reduces the risk of producing a product that fails to meet user expectations. User testing ensures that our designs align with the needs, preferences, and mental models of their intended users, ultimately leading to higher user satisfaction.

By conducting rigorous testing and gathering feedback from diverse participants, we aimed to refine and enhance the Sandbox - Construction Permit Use Case interface, ensuring it aligns with GovStack standards and offers an exceptional user experience.

During the testing phase, we had several key objectives:

  1. Validate GovStack Design Patterns: We aim to validate the application of GovStack design patterns within our prototype, ensuring adherence to GovStack's established design standards.

  2. Validate Use Case Journey: We assess the entire use case journey from various perspectives. This evaluation helps us ensure that the user's journey is logical and efficient.

  3. Validate Assumptions: We seek to confirm whether our design assumptions align with users' mental models and expectations, ensuring that the interface resonates with our target audience.

  4. Assess Usability: Our goal is to evaluate the overall user-friendliness of the prototype and pinpoint areas of difficulty or confusion that may hinder the user experience.

  5. Spot Misconceptions: We identify any misunderstandings users might have while interacting with specific tasks or elements of the prototype.

  6. Find Pain Points: We actively search for user frustrations and areas that require improvement in the user experience, ensuring a smoother interaction.

  7. Test Navigation: We evaluate how effectively users navigate through the system to locate information or features, ensuring that the interface is intuitive and efficient.

  8. Collect Feedback: Gathering qualitative insights into user impressions and emotions helps us refine the design based on real user input.

  9. Evaluate Communication: We assess how effectively the interfaces and instructions convey information, aiming for clear and concise communication.

Participants:

Identifying and recruiting participants is a critical part of our testing process. The following criteria were set for participants:

  • Demographics: We seek a diverse range of adult participants to provide a well-rounded perspective; recruiting participants from different countries and with varied backgrounds is an important part of the process.

  • Language: Participants must be proficient in English for effective communication.

  • Job: Our ideal participants include architects, city planners, and individuals who have applied for building permits in their respective countries in the past. This ensures that our testing pool reflects the real-world users of the Construction Permit Use Case.

  • Exclusion: Participants with prior knowledge or exposure to GovStack as a project are excluded to maintain impartiality.

Running the Tests:

Our testing process was structured as follows:

  • Type of Test: Remote Moderated Thinking Aloud Testing

  • Thinking Aloud Testing: Participants interact with the prototype while verbalizing their thoughts, allowing us to observe their actions and gather qualitative data.

  • Qualitative/Contextual Interview: We conduct contextual interviews to delve deeper into user feedback and impressions.

  • Number of Tests: We conduct a total of 5 tests, with 1 or 2 internal warm-up tests beforehand to ensure the testing process runs smoothly.

  • Tools/Set-up: We utilize Figma for wireframes, Microsoft Teams for communication, and Confluence for note-taking. Roles are distributed among team members to facilitate smooth testing.

  • Time Frame: 1 hour per session.

Key Takeaways from User Testing

  • Payment Fee Transparency: Participants highlighted the need for clearer information regarding payment fees, including possible estimated amounts and the automatic calculation process.

  • Identification: The process for identifying persons/entities should be improved to enhance user clarity and confidence.

  • Simultaneous Applications: Consider refining the process for canceling an individual application when several applications are submitted simultaneously, for a smoother user experience.

  • Task Flow Pages Iteration: Some users found certain task flow pages confusing, emphasizing the importance of further iteration and simplification. Consistency in language across these pages is essential.

  • Task Page Design: Add consistent design elements within task pages to ensure a cohesive user experience.

  • Task Overview Labeling: Consider changing "Task Overview" to "Application Overview" for clarity.

  • Multiple Flow Versions: Develop different versions of the flow and share them with the team for collaborative refinement.

  • Review Page Enhancement: Revise the review page to align more closely with the task overview page for consistency.

  • Parcel ID Communication: Ensure that information presented on the Parcel ID page effectively communicates its automatic filling process to users.

  • Prominence of Map Option: Make the map option more prominent to enhance its visibility and accessibility.

  • Wording Refinement: Address wording issues throughout the interface for clarity and consistency.

  • Digital Signature Clarification: Revise the language and presentation of digital signature-related elements to reduce user confusion.

  • Impact of Digital Signature on Device Choice: Consider the impact of the digital signature service on both the flow and UI, as it may influence users' device choices.

  • Add to Calendar Option: Integrate an "Add to Calendar" option when tracking scheduled field visits (a minimal sketch follows this list).

  • Application Number Inclusion: Include the application number when an application is initiated and for additional related locations.

  • Invoice Download: Incorporate additional options to download the invoice and proof of payment.

  • Toast Messages for Notifications: Implement toast messages or alternative notification/error mechanisms to improve user feedback, and replace disabled buttons with error notifications where applicable (a minimal sketch follows this list).

  • Burger Menu Enhancement: Revise the burger menu to allow users to track applications and consider adding a separate notification page.

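To illustrate the toast message takeaway above, the following minimal TypeScript sketch shows one way a toast-style notification could surface a validation error instead of leaving the submit button silently disabled. The element IDs, CSS class names, and the `showToast` helper are hypothetical and not part of the GovStack specification; this is a sketch of the idea, not a prescribed implementation.

```typescript
// Minimal, framework-free toast helper (hypothetical class names and element IDs).
type ToastKind = "info" | "error" | "success";

function showToast(message: string, kind: ToastKind = "info", timeoutMs = 5000): void {
  const toast = document.createElement("div");
  toast.className = `toast toast--${kind}`;                         // styling is assumed, not specified
  toast.setAttribute("role", kind === "error" ? "alert" : "status"); // announce to screen readers
  toast.textContent = message;
  document.body.appendChild(toast);
  window.setTimeout(() => toast.remove(), timeoutMs);
}

// Instead of disabling the submit button, validate on submit and explain the problem.
const form = document.querySelector<HTMLFormElement>("#permit-application-form");
if (form) {
  form.addEventListener("submit", (event) => {
    const parcelId = form.querySelector<HTMLInputElement>("#parcel-id")?.value.trim();
    if (!parcelId) {
      event.preventDefault();
      showToast("Please provide a Parcel ID before submitting the application.", "error");
    }
  });
}
```
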
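Similarly, the "Add to Calendar" recommendation could be realised by offering a downloadable iCalendar (.ics) file for the scheduled field visit. The sketch below uses hypothetical field names (`applicationNumber`, `location`) and covers only the basic RFC 5545 structure; it is one possible approach under those assumptions, not the defined behaviour of the use case.

```typescript
// Sketch: build and download an .ics entry for a scheduled field visit.
interface FieldVisit {
  applicationNumber: string; // hypothetical field name
  start: Date;
  end: Date;
  location: string;
}

function toIcsDate(d: Date): string {
  // iCalendar expects UTC timestamps such as 20240131T093000Z
  return d.toISOString().replace(/[-:]/g, "").replace(/\.\d{3}Z$/, "Z");
}

function buildFieldVisitIcs(visit: FieldVisit): string {
  return [
    "BEGIN:VCALENDAR",
    "VERSION:2.0",
    "PRODID:-//Sandbox Construction Permit//Field Visit//EN",
    "BEGIN:VEVENT",
    `UID:${visit.applicationNumber}-field-visit@example.org`,
    `DTSTAMP:${toIcsDate(new Date())}`,
    `DTSTART:${toIcsDate(visit.start)}`,
    `DTEND:${toIcsDate(visit.end)}`,
    `SUMMARY:Construction permit field visit (${visit.applicationNumber})`,
    `LOCATION:${visit.location}`,
    "END:VEVENT",
    "END:VCALENDAR",
  ].join("\r\n");
}

// Offer the file behind an "Add to Calendar" button.
function downloadIcs(visit: FieldVisit): void {
  const blob = new Blob([buildFieldVisitIcs(visit)], { type: "text/calendar" });
  const link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = `field-visit-${visit.applicationNumber}.ics`;
  link.click();
  URL.revokeObjectURL(link.href);
}
```
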
We deeply value the collaboration and insights from both our users and stakeholders, which guided our iterative design process to ensure that the Sandbox - Construction Permit Use Case interface meets standards of usability and user satisfaction.