Brachial Plexus VR

About:

The brachial plexus is a complex network of nerves with distinct spatial relationships to one another and to the structures that surround them. This virtual reality (VR) application was created to help first-year medical students better understand the anatomy of the brachial plexus.

Unique features include a custom 2D nerve map (used in the students’ curriculum) that is part of the user interface and correlates with the 3D models in the VR application. The VR module also mirrors the interface of a web-based module on brachial plexus anatomy that I had previously designed for the same group of students.

Role: Application design, UI/UX design, modeling

Skills: Wireframing, prototyping, user interface design, icon design, usability research & testing

Target audience:
First-year medical students

Software: Adobe XD, ShapesXR, Adobe Illustrator, ZBrush, Maya, Cinema 4D, Unity

Format: Executable (.exe) application on Meta Quest 3

Copyrights: © UNMC iEXCEL

The application includes a unique 2D map interface that correlates the schematic depiction of the brachial plexus nerves with their spatial positions on a 3D model. Pairing the schematic with the 3D models significantly enhances the learning experience, as the three-dimensional perspective allows students to visualize the spatial orientation and depth that a flat image cannot convey.

This combined approach helps bridge the gap between schematic representation and real anatomical complexity, improving comprehension of nerve pathways, variations, and their proximity to surrounding structures.
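
One simple way to implement such a correlation is to key both representations to a shared nerve identifier, so that selecting a nerve in either view highlights it in the other. The sketch below is a minimal illustration in Python, not the application's actual Unity code; the identifiers and the highlight function are hypothetical stand-ins.

    # Hypothetical shared IDs linking each nerve's 2D map node to its 3D mesh.
    NERVE_LINKS = {
        "musculocutaneous": {"map_node": "map_mc", "mesh_3d": "mesh_mc"},
        "median": {"map_node": "map_med", "mesh_3d": "mesh_med"},
        "ulnar": {"map_node": "map_uln", "mesh_3d": "mesh_uln"},
    }

    def highlight(element_id: str) -> None:
        # Stand-in for the engine call that applies a highlight style.
        print(f"highlight {element_id}")

    def on_nerve_selected(nerve_id: str) -> None:
        """Selecting a nerve in either view highlights it in both."""
        link = NERVE_LINKS[nerve_id]
        highlight(link["map_node"])  # 2D schematic
        highlight(link["mesh_3d"])   # 3D model

    on_nerve_selected("median")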



Tutorial and learning features

An onboarding tutorial was created so that users can operate the application independently, without facilitation or supervision. I designed the tutorial to be concise, reducing cognitive load and flattening the learning curve of navigating virtual reality environments for busy medical students.

It begins with basic motion and interaction techniques, such as grabbing and scaling, followed by more advanced functions like manipulating 3D objects and using the built-in menu options. Clear visual cues and guided prompts help users practice each action within a safe and controlled setting.

User interface design

ShapesXR was used to quickly mock up the initial layout of the VR application directly in virtual reality, ensuring that the VR user interface matched the web version of the application. ShapesXR was also used to plan the application's tutorial sequence.

Adobe XD and Illustrator were used to design all icons and interface elements with greater precision. Color palette decisions were made to meet WCAG AA accessibility standards. These assets were then imported into ShapesXR for use in the mockups. Working in ShapesXR streamlined communication with the developer and simplified the design handoff.
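
For context, WCAG AA text contrast can be verified programmatically. The following minimal Python sketch implements the WCAG 2.x relative-luminance and contrast-ratio formulas; the sample color pair is a hypothetical example, not the application's actual palette.

    def srgb_to_linear(c: float) -> float:
        # Linearize one sRGB channel value in [0, 1], per the WCAG 2.x definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    def relative_luminance(hex_color: str) -> float:
        # Relative luminance of an sRGB color given as "#RRGGBB".
        r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))
        return 0.2126 * srgb_to_linear(r) + 0.7152 * srgb_to_linear(g) + 0.0722 * srgb_to_linear(b)

    def contrast_ratio(fg: str, bg: str) -> float:
        # (L_lighter + 0.05) / (L_darker + 0.05), ranging from 1:1 to 21:1.
        lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
        return (lighter + 0.05) / (darker + 0.05)

    # WCAG AA: >= 4.5:1 for normal text, >= 3:1 for large text.
    print(round(contrast_ratio("#FFFFFF", "#595959"), 2))  # ~7.0, passes AA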

Screenshots of ShapesXR scenes showing user interaction mockups.

Screenshots from Adobe XD showing UI design planning for the VR application.

User experience design

The user experience was refined through three rounds of user testing with medical students who were new to the app, ensuring that our testers accurately represented our target end users. Testers were asked to perform a short series of tasks and were observed to gather qualitative feedback on usability.

Feedback from these sessions informed design refinements to visual and interaction effects, aimed at delivering a seamless experience. Select examples appear below.

Screenshots of the web version's UI, which the VR version was designed to follow.


To ensure sufficient feedback when hovering over an item, a combination of effects was used: a depth hover animation, color changes, sound effects, and a subtle controller vibration.
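
As a rough sketch of how such layered feedback can be dispatched from a single hover event (illustrative Python; the shipped implementation is a Unity script, and every effect below is a hypothetical stand-in):

    class HoverFeedback:
        """Runs every registered feedback effect on hover enter."""

        def __init__(self):
            self.effects = []

        def add(self, effect):
            self.effects.append(effect)

        def on_hover_enter(self):
            for effect in self.effects:
                effect()

    feedback = HoverFeedback()
    feedback.add(lambda: print("depth animation: raise item toward user"))
    feedback.add(lambda: print("tint item with highlight color"))
    feedback.add(lambda: print("play hover sound"))
    feedback.add(lambda: print("pulse controller haptics briefly"))
    feedback.on_hover_enter()

Keeping each cue as an independent channel makes it easy to add, remove, or tune one effect, for example disabling haptics, without touching the others.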

Rotation and move controls were added to give users finer control when manipulating the model.

In addition, quick-view buttons snap the model into common anatomical views, such as anterior, posterior, and lateral.
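
A minimal sketch of the quick-view behavior, assuming the model's orientation is tracked as a yaw angle in degrees and animated toward a preset per view (illustrative Python; the view angles and names are assumptions, and the shipped version is a Unity script):

    # Hypothetical yaw presets (degrees) for the standard anatomical views.
    ANATOMICAL_VIEWS = {"anterior": 0.0, "posterior": 180.0, "lateral": 90.0}

    def shortest_angle(a: float, b: float) -> float:
        """Signed shortest rotation from yaw a to yaw b, in degrees."""
        return (b - a + 180.0) % 360.0 - 180.0

    def step_toward(current: float, target: float, dt: float,
                    speed: float = 180.0) -> float:
        """Advance the yaw at most `speed` deg/s toward the target, short way around."""
        delta = shortest_angle(current, target)
        step = max(-speed * dt, min(speed * dt, delta))
        return (current + step) % 360.0

    # Pressing "posterior" with the model at 350 deg rotates -170 deg, not +190 deg.
    yaw = 350.0
    for _ in range(60):  # one second at 60 fps
        yaw = step_toward(yaw, ANATOMICAL_VIEWS["posterior"], dt=1 / 60)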

The annotations dynamically track the user’s position and adjust their placement and orientation so that text remains ergonomically comfortable to read.
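
A minimal sketch of that tracking behavior, assuming each annotation rotates only about the vertical axis to face the headset (illustrative Python with NumPy; the positions and function name are assumptions, not the application's actual code):

    import numpy as np

    def annotation_yaw(annotation_pos: np.ndarray, head_pos: np.ndarray) -> float:
        """Yaw (radians) that turns an annotation's text plane toward the user.

        Rotating about the vertical (y) axis only, rather than full 3-axis
        billboarding, keeps the text upright and comfortable to read.
        """
        to_user = head_pos - annotation_pos
        return float(np.arctan2(to_user[0], to_user[2]))  # 0 = facing +z

    # Example: annotation at eye level, user standing slightly to one side.
    yaw = annotation_yaw(np.array([0.0, 1.5, 2.0]), np.array([0.3, 1.6, 0.0]))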