AR Display for Astronauts
2x national finalists in the annual NASA SUITS Challenge
TIMEFRAME
Sept - May 2025
Sept - May 2024
Sept - May 2023
ROLE
Chief Designer
UI/UX Team Lead
UI/UX Designer
TOOLS
Figma, Illustrator, Magic Leap 2
DISCIPLINES
Design systems, Usability testing, Project management
The Problem
01
How might we design a streamlined process for executing mission-critical tasks, helping astronauts overcome the challenges of safely navigating the Moon?
Imagine you are an astronaut on a spacewalk. You made the call to explore that ominous crater. Mission control alerts you: your oxygen level is nearing 0%. You panic a little. Moondust slips through your gloves. You need to get out of there, but you're stranded at the edge of a 5-meter crater at the Moon's South Pole.
Human error accounts for approximately 70% of accidents in high-risk domains, according to NASA's Human Factors group. We can't afford to lose anyone braving the new frontier.
Project Timeline
Mission
For the third year in a row, I am competing in NASA's Spacesuit User Interface Technologies for Students (SUITS) challenge to design and develop a user interface that helps NASA astronauts establish a sustained human presence on the Moon and Mars. Extravehicular activities (EVAs), or spacewalks, play a vital role in these missions and in the pursuit of deeper space exploration.
-
Leadership: Su Hyun Ahn (PM), Linlin Yu (Chief Designer), Danielle Kim (Chief Dev), Jason Silva (Chief Dev)
Design: Anna Wang, Nina Chang, Anthony Zhang, Alayka Seputra, Waverly Huang, Anika Gupta, Catherine Huang, Rumei Zha, Ariana Kim, Kian Park, Eldoris Cai, Zhenmi Tang, Sheryl Lee, Haoxuan Huang, Ava Maghsoodlou, Anushka Parikh, Bennett Graff, Ella Goodman, Mia Haake, Olivia Petrarch, Jin Gu, Sandy Hong, Chahek Bansal, Richa Lin
Dev: Seik Oh, Ryan Lee, Roger de Mello Koch, Lin Ning Kung, Feiyue Zhang, Ahad Bashir, Shivam Higorani, Hongwei Liao, Taylor McMillon, Wilson Vo, Yue Zhou, Haohan Wen, David Man
Faculty: Prof. Michael Lye
-
Leadership: Michael Wang (PM), Keya Shah (Chief Designer), Martin Ma (Chief Dev)
Design: Linlin Yu (Web Lead), Sunjoo Park (AR Lead), Anika Gupta, Alayka Seputra, Anthony Zhang, Waverly Huang, Elaine Zhang, Richard Cheng, Amy Ai, Anna Wang, Nina Chang, Kiran Mukherjee, Sheldon You
Dev: Seik Oh, Jason Silva, Ryan Lee, Jamie Chen, Julius Beberman, Mandy He, Jiayi Fan, Feiyue Zhang, Zijing Xu, Yixuan Liu
Faculty: Prof. Michael Lye
-
Leadership: Jessica Young, Michael Wang, Ashley Fan
Design: Linlin Yu, Keya Shah, Ryan Lee, Bill Xi, Pei-Jung Hsieh, Dong Yoon Shin, Bryce Yao
Dev: George Xu, Danielle Kim, Martin Ma, Jamie Chen, Julius Beberman
Faculty: Prof. Michael Lye
Meet the Team!
Skye Ray, NASA Evaluator
Stakeholders
Our AR program is built for NASA design evaluators who assume the role of an astronaut. Evaluators test our program in a simulated environment called the Rock Yard, which mimics conditions on the Moon.
Why Augmented Reality?
Interfaces displayed in AR can provide real-time data directly in an astronaut's field of view, including navigation paths, geological points of interest, and hazard warnings. This reduces fear and removes the need to consult separate devices, allowing astronauts to stay focused on the following tasks:
Navigation
Guides the user across the lunar surface while avoiding hazards.
Lunar Sampling
Displays scientific information about the lunar geology collected during the EVA.
Rover Commanding
Monitors the autonomous rover as it surveys the lunar surface.
Egress
Prepares the suit for the transition from the pressurized home base onto the lunar surface.
Main Goals
02
Show less information, more confirmation.
Our previous interfaces required tedious button-tapping to access information, which overwhelmed users and blocked their vision during testing. Still, the previous year's research gave us a strong starting point.
Design with physical limitations in mind.
The lunar landscape limits an astronaut's ability to walk. "The current-generation suits are not designed for tasks requiring repeated bending or kneeling, causing astronauts to adopt awkward postures that increase the risk of injury during extended extravehicular activities." A HUD shouldn't be an added burden.
James N.
Former Astronaut
Steve S.
Retired Astronaut
Peter S.
Geological Sciences
Jonathan L.
Cartographer
Isabel T.
UX Designer
Alejandro R.
VR/UX Specialist
Jim H.
Geological Sciences
James R.
Planetary Sciences
Minimize life-or-death scenarios, and always have a backup ready.
Confidence drives decisions in a life-or-death situation. I studied previous interviews with eight specialists and distilled three main insights:
-
“What’s closer to the user is more important, which means it should be higher in the visual hierarchy”
-Romero
“Be as minimal as you can. It’s not great for all controls to disappear [on the AR display], but it can help with organization”
-Levy
-
“With bulky gloves, there's no tactile feedback”
-Swanson
“The main challenge is the gloves because they are airtight and large, so mobility is tough”
-Torron
“Use bigger hand movements!”
-Romero
-
“A procedure list and suit status are necessary”
-Torron
“A checklist relies on memorization and the current one looks like a poorly designed book”
-Newman
Prototyping
03
I learned how to write effective callouts on our designs and create a clickable prototype so developers could better understand our 3D decisions through our 2D lens.
01 Sketches
“How do we design a moving display in a 3D space?” Draw it on paper and hold it at arm's length as you walk.
02 User Flows
“How do we divide work but ensure a linear sequence for the user?” Brainstorm key features in 4 teams & regroup.
03 Wireframes
“How do we validate our ideas with user testers and devs?” Lay out lo-fi frames into one clickable wireframe.
04 Clickable Prototype
“How do we make a minimum viable product?” Iterate for 3 months and implement the hi-fi design in Unity.
Usability testing
04
We conducted usability research by asking college faculty to walk through our clickable prototype. We asked them to “think out loud,” and we occasionally had to intervene or skip tasks because they had not been briefed on the simulation. Since we were gathering feedback from complete newcomers to AR technology, it was impossible to have them pretend to be astronauts on the first try, but their pain points regarding the UI elements were unanimous.
Matthew B.
Senior Critic, RISD Industrial Design
Cheeny C-R.
Assistant Professor, RISD
Leah B.
Assistant Professor, RISD
Main Insights
-
“I don’t know what’s real and not real…Am I supposed to click on this?”
-Bird
“What’s the red triangle? Warning? Arrow? Volcano?”
-Celebrado-Royer
-
“It would help to merge ROVER command into the navigation map, since both you and the vehicle are navigating”
-Bird
“The key thing is about making the buttons consistent in word and icon choice to make it easily understandable every time”
-Celebrado-Royer
-
“It’s difficult to see the white icons. There’s so many”
-Beeferman
“Icons have no outline. Words are too small, line weight is too thin. Warning signs & top right notifications are too big and block my view”
-Bird
“The symbols aren’t exactly intuitive. I need a tooltip.”
“How do I keep track of what I need to do next?”
Lo-Fi to Hi-Fi
05
Below is our High-Fidelity Prototype!
I designed for AR
I worked with two designers and two developers to implement Geo Sampling. Throughout the year, I learned to create Figma assets, reviewed accessibility with our developers, revised for usability, and brought the assets into Unity.
I also designed to support AR
I led a subteam of four designers and worked with two developers to implement a mission control dashboard that supports astronauts out in the field. This high-level web application consolidated every task the EV crew member had to undertake and kept track of EVA progress.
Task Management
Map Obstacle and POI
Repair Assistance
Geo Photo and Voice Recording
Egress/Ingress
Design System
2023-2024
2022-2023
User Testing
06
“It’s very finicky”
To surface errors in our design and code, we conducted in-environment testing, called “Human-in-the-Loop” (HITL), at two local state parks that simulate the lunar environment.
“I wish there were backup options for when things fail”
My team and I were invited to Houston to test and present our design at Johnson Space Center from May 18 to 23.
Rover
Rock scanner
UIA panel
Watch us present our design!
2023 vs 2024
07
Menu
✦ Flip left hand to open shortcuts
Egress
✦ Tasks are auto-checked off once the switch is flipped
✦ Before proceeding to the next set of tasks, the user must click confirm
Nav
✦ A head tilt brings the compass into sight (sketched in code below this list)
✦ Place a waypoint or hazard on the map
Rover
✦ Drop a point of interest on the map to move the ROVER
✦ Call back ROVER on the palm menu
Geo Sampling
✦ Enter sampling session via menu
✦ Scientific info is collected upon scanning with the RFID hand tool
✦ End sampling to start navigation
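The head-tilt compass under Nav is a good example of how these interactions boil down to simple pose checks. Below is a minimal, hypothetical Unity C# sketch of that behavior, written against Unity's core API rather than the Magic Leap 2 SDK; the component name, thresholds, and compass reference are illustrative, not our production code.

```csharp
using UnityEngine;

// Hypothetical sketch: show the compass only while the user's head is pitched
// downward past a threshold, with hysteresis so it doesn't flicker when the
// head hovers near the boundary. Thresholds and names are illustrative.
public class HeadTiltCompass : MonoBehaviour
{
    [SerializeField] private GameObject compass;     // compass UI anchored low in the view
    [SerializeField] private float showAngle = 25f;  // degrees of downward pitch needed to show
    [SerializeField] private float hideAngle = 15f;  // head must rise above this to hide again

    private bool visible;

    private void Update()
    {
        // On a head-mounted display, the main camera tracks the user's head pose.
        Vector3 forward = Camera.main.transform.forward;

        // forward.y is negative when the user looks below the horizon,
        // so this yields positive degrees of downward pitch.
        float pitchDown = -Mathf.Asin(Mathf.Clamp(forward.y, -1f, 1f)) * Mathf.Rad2Deg;

        if (!visible && pitchDown > showAngle) visible = true;
        else if (visible && pitchDown < hideAngle) visible = false;

        compass.SetActive(visible);
    }
}
```

The two separate thresholds keep the compass from blinking in and out when the user's head sits right at the cutoff angle.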
Key Takeaways
01
Understand hardware and software limitations for the developers' and users' sake
The HMD reads big motions better than precise finger taps
Create a linear process to avoid confusion
02
Know my responsibilities, and take responsibility for learning my teammates' ideas
03
Note the opportunities for next year’s challenge…
Goal #1:
Add backup options - functions WILL fail during testing (e.g., we need to research hand tracking to ensure the palm menu works every time; one possible approach is sketched at the end of this page)!
Goal #2:
Implement voice commands - this could give our user more agency in the form of a hands-free tool!
Goal #3:
Divide and conquer - keep each team member accountable for their own part of the project!
RISD SUITS had a blast in 2023 and 2024. We are currently designing our 2025 interface!
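To make Goal #1 concrete, here is a rough, hypothetical Unity C# sketch of one direction the hand-tracking research could take: requiring the palm-facing pose to hold briefly before the menu opens, and tolerating short tracking dropouts before it closes. The palm Transform, thresholds, and class name are assumptions for illustration, not tested code.

```csharp
using UnityEngine;

// Hypothetical sketch for Goal #1: make the palm menu more forgiving by
// debouncing the gesture. The `palm` Transform is assumed to be driven by the
// headset's hand tracking; which local axis points out of the palm depends on
// the hand rig, so `palm.up` here is an assumption.
public class PalmMenuGate : MonoBehaviour
{
    [SerializeField] private Transform palm;                // palm pose from hand tracking (assumed)
    [SerializeField] private GameObject menu;               // the shortcut menu to show/hide
    [SerializeField] private float facingThreshold = 0.6f;  // how directly the palm must face the head
    [SerializeField] private float holdToOpen = 0.3f;       // seconds the pose must hold before opening
    [SerializeField] private float graceToClose = 0.5f;     // seconds of lost pose tolerated before closing

    private float heldTime;   // how long the palm-facing pose has been held
    private float lostTime;   // how long since the pose was last seen

    private void Update()
    {
        Transform head = Camera.main.transform;

        // The palm "faces" the user when its outward axis points back toward the head.
        bool palmFacingHead = false;
        if (palm != null)
        {
            Vector3 toHead = (head.position - palm.position).normalized;
            palmFacingHead = Vector3.Dot(palm.up, toHead) > facingThreshold;
        }

        if (palmFacingHead)
        {
            heldTime += Time.deltaTime;
            lostTime = 0f;
        }
        else
        {
            lostTime += Time.deltaTime;
            if (lostTime > graceToClose) heldTime = 0f;  // only reset after the grace period
        }

        menu.SetActive(heldTime >= holdToOpen);
    }
}
```

The grace period is the "backup" part of the idea: a brief tracking dropout no longer slams the menu shut mid-task.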