Heuristic Evaluation
Heuristic Evaluation of (305) 360°, a Meta Quest 3-based mixed reality application.

Project Overview
Project Type
User Experience Research Methods Undergrad Project
Industry
Healthcare
Area of Focus
Usability Assessment, Heuristic Analysis, Issue Identification, Severity Rating, and Actionable Recommendations
Roles
UX Researcher
Responsibilities
Usability Assessment, Heuristic Analysis, Recommendations
Team
Team of 5: Sofia Perez-Baux, Ryan Lee, Olivia Van Bochove, Ausmita Barman, Mehdi Shah
Timeline
March 2024 / 3 weeks
Objective
To enhance user experience and satisfaction by identifying usability problems within (305) 360°, a VR/AR application.
Scope and Target Audience
All sections of the application were tested by five participants to ensure a wide range of feedback. The testing was conducted with the scenario that the target audience is incoming healthcare professionals at the Sylvester Cancer Center.
Methodology
Conducted moderated user tests: each participant performed five tasks spanning the application, with one moderator and one observer assigned per session. Rubin's method was used to rate problem severity.
Key Findings
Identified 13 usability issues, ranging from non-functional visual elements to confusing interactions in the immersive videos. Provided targeted recommendations aimed at improving user interaction, visual components, and the overall experience.
Conclusion
Addressing these usability issues will significantly enhance user experience, aligning (305) 360° with best practices in UI design and meeting user expectations effectively.
Product Description
(305) 360° is an AR/VR experience developed to give an interactive tour of the greater Miami area for prospective doctors coming into the Sylvester Cancer Institute at the University of Miami. There are four main sectors to explore: Data, Sylvester Institute Info, Lifestyle Activities, and Stories. The application was developed exclusively for the Meta Quest 3 headset.

Image depicting an expanded Data section panel, where users can access visualizations of demographics, climate, social services, and healthcare data.
Test Objectives
Research Process
1. Methodology
Heuristic evaluation
Heuristic evaluation (HE) is a method in which experts use established heuristics to assess the usability of user interfaces, identifying issues during independent walkthroughs. Originating from Nielsen and Molich's principles, this process helps design teams detect usability flaws early, though it relies on evaluator expertise and may miss or incorrectly identify problems. Despite being quick and cost-effective, heuristic evaluation is not a replacement for usability testing with real users. We applied the following usability heuristics:

Source: Interaction Design Foundation - IxDF. (2016, May 25). What is Heuristic Evaluation (HE)?. Interaction Design Foundation - IxDF. https://www.interaction-design.org/literature/topics/heuristic-evaluation
Severity Rating
Severity ratings help prioritize fixing the most critical usability issues and decide if more usability work is needed. A system with major problems shouldn't be released, but it might be okay if the issues are minor. Severity is based on how often a problem occurs, how much it affects users, and whether it is a recurring issue.

Image: https://measuringu.com/rating-severity/
Source: Jeff Rubin, Handbook of Usability Testing(1994)
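As a minimal sketch of the Rubin-style prioritization described above: a problem's criticality can be computed as the sum of a severity rank and a frequency-of-occurrence rank. The 1-4 scales and thresholds below are assumptions based on the 1994 Handbook of Usability Testing, not values taken from this study.

```python
# Sketch of Rubin-style problem criticality (assumed 1-4 scales).
# Severity: 1 = irritant, 2 = moderate, 3 = severe, 4 = unusable.
# Frequency rank is derived from the share of participants affected.

def frequency_rank(affected: int, participants: int) -> int:
    """Map the share of affected participants to a 1-4 frequency rank."""
    share = affected / participants
    if share >= 0.90:
        return 4
    if share > 0.50:
        return 3
    if share > 0.10:
        return 2
    return 1

def criticality(severity: int, affected: int, participants: int) -> int:
    """Criticality = severity rank + frequency rank (range 2-8)."""
    return severity + frequency_rank(affected, participants)

# Example: a severe issue (rank 3) hit by 4 of 5 participants -> 3 + 3 = 6.
print(criticality(3, 4, 5))  # 6
```

Ranking every issue this way gives a single number to sort the backlog by, so the most critical problems are fixed first.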
2. Testing
Procedure
Our team scheduled visits to the University of Miami's XR Studio (VESL). Each session began with the participant putting on the Meta Quest 3 headset and receiving the controllers. Sessions were estimated to last 15-20 minutes and involved navigating through each of the four information sectors: Data, Sylvester Institute Info, Lifestyle Activities, and Stories. During each session, I took notes and recorded the session, which was then sent from the device to the student's email. The final HE results were compiled and organized in a spreadsheet for further analysis.
Breakdown
Tasks
During testing, participants were asked to complete five tasks with minimal guidance from the moderator. They notified the moderator when they believed a task was finished and then filled out a SEQ (Single Ease Question) post-task questionnaire after each task. The tasks were as follows:

Task chart, with task number and description for the HE user testing for (305)360.
Task Example
Each task included a title, a prompt, and a successful completion flow. The title outlined the task's objective, the prompt provided detailed instructions or a scenario, and the successful completion flow described the ideal steps to achieve the task. This structure ensured clarity and allowed for effective evaluation of user performance during testing.

Task example, showing task 1 title, prompt, and successful completion flow.
Metrics
Task success rate
The number of participants who completed the task with no assistance from the moderator.
Metrics
Task completion rate (ratio)
Failure = 0
Success with assistance = 0.5
Success = 1
Time on task
The time it took participants to complete the task. Participants notified the moderator when they started each task by saying “start” and when they believed they had completed it by saying “stop.”
Metrics
Task completion time (seconds)
Expected Completion
60 seconds per task
Satisfaction
The participants' subjective level of satisfaction with the product.
Metrics
Satisfaction scores (SEQ); 7-point rating
System Usability Scale (SUS); 7-point rating
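The metrics above can be aggregated per task in a straightforward way. The sketch below is illustrative only: the session values are hypothetical, and only the 0 / 0.5 / 1 completion scoring, the time-on-task measure, and the 7-point SEQ come from the test plan.

```python
# Minimal sketch of aggregating per-task metrics; data values are
# hypothetical, only the 0 / 0.5 / 1 scoring scheme comes from the plan.
from statistics import mean

# One record per participant for a single task:
# (completion score, time on task in seconds, SEQ rating on a 7-point scale)
task1_sessions = [
    (1.0, 48, 6),   # success, under the 60 s expectation
    (0.5, 75, 4),   # success with moderator assistance
    (1.0, 52, 7),
    (0.0, 90, 2),   # failure
    (1.0, 44, 6),
]

completion_rate = mean(s[0] for s in task1_sessions)   # ratio, 0.0-1.0
avg_time = mean(s[1] for s in task1_sessions)          # seconds
avg_seq = mean(s[2] for s in task1_sessions)           # 7-point SEQ

print(f"completion rate: {completion_rate:.2f}")   # 0.70
print(f"avg time on task: {avg_time:.1f} s")       # 61.8 s
print(f"avg SEQ: {avg_seq:.1f}")                   # 5.0
```

Comparing the average time against the 60-second expectation and the completion ratio across tasks makes it easy to spot which of the five tasks caused the most trouble.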
3. Results
Usability testing revealed 13 unique issues in the existing platform.

During our usability testing, we identified 13 unique issues with the existing platform. To better assess the design problems, we organized the findings into four categories: navigation, content, interactions, and system functionality.
1. Navigation
The navigation issues primarily involved confusion around accessing key features and difficulty locating exit options. Users struggled to find the Miami heat map and specific immersive videos due to unclear labeling and navigation paths. Additionally, exit buttons for immersive experiences were hard to find and not positioned at eye level, leading to user frustration. One user commented,
"I'm a bit confused about how to find the Miami heat map. I clicked on Lifestyle, but where do I go from here?"
2. Content
The main content issues were a lack of detailed information for selecting doctors, making it difficult for users to make informed choices, and insufficient space within the immersive bus experience, which hindered user movement. One user commented,
"I found it difficult to choose a doctor stories because there wasn't enough information or labeling. I think it’s a labeling issue”.
3. Interactions
The main interaction issues involved limitations in the immersive videos and physical interface elements. Users found the 180° videos too stationary, and the videos lacked options to skip or scrub through the content. Additionally, the middle table's height was not proportionate to users, and interactive elements like the stethoscope sat too low to be visible, making interaction difficult. One user commented,
"I noticed that the height of the middle table seems a bit off.”
4. System Functionality
The main system functionality issues included low default sound levels in immersive videos, even when users increased the volume, and long loading times for these videos, with a delay of up to 10 seconds. One user commented,
"I'm trying to click the arts and culture tab, but it's not, it's not clicking. I'm trying to pick a museum, but nothing's happening.”
4. Redesign
Objective
Our team's objective for the redesign was to identify opportunities to enhance the product's value and effectiveness, ensuring it meets current user expectations and market standards.

Redesign example, examining the immersive video exit button.
5. Takeaways
Our (305) 360° user testing report found:
