Heuristic Evaluation

Heuristic Evaluation of (305) 360°, a Meta Quest 3-based mixed reality application.


Project Overview

Project Type

User Experience Research Methods Undergrad Project

Industry

Healthcare

Area of Focus

Usability Assessment, Heuristic Analysis, Issue Identification, Severity Rating, and Actionable Recommendations

Roles

UX Researcher

Responsibilities

Usability Assessment, Heuristic Analysis, Recommendations

Team

Team of 5: Sofia Perez-Baux, Ryan Lee, Olivia Van Bochove, Ausmita Barman, Mehdi Shah 

Timeline

March 2024 / 3 weeks

Project Overview

Objective

To enhance user experience and satisfaction by identifying usability problems within (305) 360°, a VR/AR application.

Scope and Target Audience

All sections of the application were tested by five participants to ensure a wide range of feedback. The testing was conducted with the scenario that the target audience is incoming healthcare professionals at the Sylvester Cancer Center.

Methodology

Conducted a user test in which each participant performed 5 tasks across the application, with one moderator and one observer assigned per session; used Rubin’s method to rate problem impact.

Key Findings

Identified 13 usability issues, ranging from non-functional visual elements to confusing interactions in immersive videos. Provided targeted recommendations aimed at improving user interaction, visual components, and overall experience.

Conclusion

Addressing these usability issues will significantly enhance user experience, aligning (305) 360° with best practices in UI design and meeting user expectations effectively.

Product Description 

(305) 360° is an AR/VR experience developed to give an interactive tour of the greater Miami area to prospective doctors coming into the Sylvester Cancer Institute at the University of Miami. The experience is developed exclusively for the Meta Quest 3 headset, and there are four main sectors to explore: Data, Sylvester Institute Info, Lifestyle Activities, and Stories.

  • Data section: Users can explore various analytics and statistical information about Miami, including demographics, climate, social services, and healthcare.
  • Sylvester section: Users can watch immersive videos from four professionals at the Cancer Center and learn about the community outreach initiative.
  • Lifestyle Activities section: Users can explore the arts and culture within Miami, popular nature locations to visit, sports events at UM, and the neighborhoods/ZIP codes of the area.
  • Stories section: Users can watch immersive videos of local citizens’ stories about Miami, each filmed in its respective environment.
  • Game Changer Bus: A navigational experience that transports the user to the official Sylvester Cancer Institute Game Changer Bus to learn about the outreach and community help achieved through it.

Image depicting an expanded Data section panel, where users can access visualizations of demographics, climate, social services, and healthcare data. 

Test Objectives

  1. Uncover the usability and visual design issues within the (305)360° VR/AR application.
  2. Determine the priority of issues found among all 5 participants within the usability testing study.

Research Process

1. Methodology

Heuristic evaluation

Heuristic evaluation (HE) is a method in which experts use established heuristics to assess the usability of user interfaces, identifying issues during independent walkthroughs. Originating from Nielsen and Molich's principles, the process helps design teams detect usability flaws early, though it relies on evaluator expertise and may miss or misidentify problems. Despite being quick and cost-effective, heuristic evaluation is not a replacement for usability testing with real users. We applied the following usability heuristics:

Image: the usability heuristics applied in this evaluation.

Source: Interaction Design Foundation - IxDF. (2016, May 25). What is Heuristic Evaluation (HE)?. Interaction Design Foundation - IxDF. https://www.interaction-design.org/literature/topics/heuristic-evaluation

Severity Rating

Severity ratings help prioritize fixing the most critical usability issues and decide whether more usability work is needed. A system with major problems shouldn't be released, but it might be acceptable if the issues are minor. Severity is based on how often a problem occurs, how much it affects users, and whether it persists.
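Combining frequency, impact, and persistence into one rating can be sketched as a small scoring helper. The 0–4 scale and the equal weighting below are illustrative assumptions, not the exact rubric from Rubin's handbook, which describes the dimensions rather than one fixed formula:

```python
def severity_score(frequency: int, impact: int, persistence: int) -> int:
    """Combine three 0-4 ratings into a single 0-4 severity score.

    frequency:   how often the problem occurs
    impact:      how badly it affects users when it occurs
    persistence: whether users keep hitting it or can work around it

    NOTE: equal weighting and rounding here are illustrative choices.
    """
    for value in (frequency, impact, persistence):
        if not 0 <= value <= 4:
            raise ValueError("ratings must be between 0 and 4")
    return round((frequency + impact + persistence) / 3)

# A problem that occurs often (3), blocks users (4), and recurs (3):
severity_score(3, 4, 3)  # -> 3, a major usability problem
```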


Image: https://measuringu.com/rating-severity/
Source: Jeff Rubin, Handbook of Usability Testing (1994)

2. Testing

Procedure

Our team scheduled visits to the University of Miami’s XR Studio (VESL). Each session began with the participant putting on the Meta Quest 3 headset and receiving the controllers. Sessions were estimated to last 15-20 minutes and involved navigating through each of the four information sectors: Data, Sylvester Institute Info, Lifestyle Activities, and Stories. During each session, I jotted down notes and recorded the session, which was then sent from the device to the student's email. The final HE results were compiled and organized in a spreadsheet for further analysis.

Breakdown

  • 5 participants met in the lab and were asked to fill out a consent form and complete a demographic questionnaire prior to the start of the usability test.
  • For all 5 tasks, we recorded completion time, success rate, number of usability issues, the locations where each task started and ended, and the task flow.
  • After completing all test tasks, participants answered the SUS questionnaire on a PDF form.
  • Participants were then asked open-ended questions about the UI elements they saw or interacted with in the application.
  • Participants completed post-study interview questions about how clear, useful, and engaging the application's content was.

Tasks

During the testing, participants were asked to complete five tasks with minimal guidance from the moderator. They would notify the moderator when they believed a task was finished and then fill out a SEQ post-task questionnaire after each task. The tasks were as follows:


Task chart, with task numbers and descriptions for the (305) 360° HE user testing.

Task Example

Each task included a title, a prompt, and a successful completion flow. The title outlined the task's objective, the prompt provided detailed instructions or a scenario, and the successful completion flow described the ideal steps to achieve the task. This structure ensured clarity and allowed for effective evaluation of user performance during testing.


Task example, showing task 1 title, prompt, and successful completion flow.

Metrics

Task success rate

The number of participants who completed the task with no assistance from the moderator.

Metrics

Task completion rate (ratio)

Failure = 0

Success with assistance = 0.5

Success = 1
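As a sketch, the 0 / 0.5 / 1 scoring above can be averaged across participants to produce a per-task completion rate. The function and variable names here are illustrative, not part of the study's tooling:

```python
def task_completion_rate(outcomes: list[float]) -> float:
    """Average per-participant task scores into one completion ratio.

    Each outcome is 0 (failure), 0.5 (success with moderator
    assistance), or 1 (unassisted success).
    """
    if not outcomes:
        raise ValueError("need at least one participant outcome")
    return sum(outcomes) / len(outcomes)

# Five participants: three succeeded unassisted, one needed help,
# one failed.
task_completion_rate([1, 1, 1, 0.5, 0])  # -> 0.7
```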

Time on task

The time it took for participants to complete the task. Participant notifies moderator when they start each task by saying “start” and when they believe they have completed each task by saying “stop.”

Metrics

Task completion time (seconds)

Expected Completion

60 seconds per task
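The start/stop timing described above might be summarized like this. The 60-second benchmark comes from the expected completion target; the helper name and sample times are illustrative:

```python
def time_on_task_summary(times_sec: list[float],
                         expected_sec: float = 60.0) -> tuple[float, int]:
    """Return the mean time on task and how many runs exceeded the target."""
    mean_time = sum(times_sec) / len(times_sec)
    over_target = sum(1 for t in times_sec if t > expected_sec)
    return mean_time, over_target

# Five participants' times (seconds) on one task:
mean_time, over = time_on_task_summary([45, 52, 90, 38, 75])
# mean_time == 60.0; two participants exceeded the 60 s target
```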

Satisfaction

The subjective level of satisfaction the participants felt about the product.

Metrics

Satisfaction scores (SEQ); 7-point rating

System Usability Scale (SUS); 7-point rating

3. Results


During our usability testing, we identified 13 unique issues with the existing platform. To better assess the design problems, we organized the findings into four categories: navigation, content, interactions, and system functionality. 
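The four-way grouping described above is easy to script once findings are exported from a spreadsheet. Here is a minimal sketch; the issue entries and severity numbers are illustrative placeholders, not the study's actual findings:

```python
from collections import defaultdict

# Illustrative placeholder issues -- not the study's real dataset.
issues = [
    {"desc": "Exit button below eye level", "category": "navigation", "severity": 4},
    {"desc": "Low default video volume", "category": "system functionality", "severity": 2},
    {"desc": "No skip/scrub in 180-degree videos", "category": "interactions", "severity": 3},
    {"desc": "Sparse labeling on doctor stories", "category": "content", "severity": 3},
]

# Group findings by category, then sort each group so the most
# severe problems surface first for prioritization.
by_category: dict[str, list[dict]] = defaultdict(list)
for issue in issues:
    by_category[issue["category"]].append(issue)
for group in by_category.values():
    group.sort(key=lambda i: i["severity"], reverse=True)
```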


1. Navigation

The navigation issues primarily involved confusion around accessing key features and difficulty locating exit options. Users struggled to find the Miami heat map and specific immersive videos due to unclear labeling and navigation paths. Additionally, exit buttons for immersive experiences were hard to find and not positioned at eye level, leading to user frustration. One user commented,

"I'm a bit confused about how to find the Miami heat map. I clicked on Lifestyle, but where do I go from here?"

 

2. Content

The main content issues were a lack of detailed information for selecting doctors, making it difficult for users to make informed choices, and insufficient space within the immersive bus experience, which hindered user movement. One user commented,

"I found it difficult to choose a doctor stories because there wasn't enough information or labeling. I think it’s a labeling issue”.

 

3. Interactions

The main interaction issues involved limitations in immersive videos and physical interface elements. Users found the 180° videos too stationary, and the videos lacked options to skip or scrub through the content. Additionally, the middle table's height was not proportionate to users, and the visibility of interactive elements like the stethoscope was too low, making interaction difficult. One user commented,

"I noticed that the height of the middle table seems a bit off.”

 

4. System Functionality

The main system functionality issues included immersive video audio that remained quiet even after users raised the volume, and long video loading times, with delays of up to 10 seconds. One user commented,

"I'm trying to click the arts and culture tab, but it's not, it's not clicking. I'm trying to pick a museum, but nothing's happening.”

 

4. Redesign

Objective 
Our team's objective for the redesign was to identify opportunities to enhance the product’s value and effectiveness, ensuring it meets current user expectations and market standards.


Redesign example, examining the immersive video exit button.

5. Takeaways

Our (305) 360° user testing report found:

  • 8 keepers within the application: Local Stories content, Hospital Accessibility Visualization, Lifestyle content, Public Transportation Information Visualization, Navigation Bar, Ancestry Data Visualization, Racial Demographic Visualization, and Cancer Visualization.
  • 13 usability issues across the five user tasks, with the highest severity scores tied to inaccurate heat map data, a failure to provide user control and freedom through an inaccessible exit, and poor spatial design that kept users from viewing content clearly on the Sylvester Cancer Institute’s Game Changer Bus.
  • (305) 360° can address these usability issues by applying the suggestions and redesigns our team provided.
  • Additional testing may be required as the (305) 360° application continues to expand.
  • Additional testing would also let our team test with (305) 360°'s intended target audience, non-local incoming medical faculty, providing more accurate data about how that audience uses the experience and takes in the information presented.
