Designing web and mobile apps good enough for space exploration poses a special set of UX challenges. NASA’s interaction designer reveals how he and his team take user experience into orbit
How do you prototype software that can safely send astronauts into space? Over at NASA’s Ames Research Center, the HCI team is hard at work answering that question, constantly designing, prototyping and iterating on web and mobile apps to support space missions from launchpad to deep space and back.
Justinmind was lucky enough to talk to NASA Interaction Designer Ron Kim, who told us about how NASA’s user experience team works, the challenges of designing software for the International Space Station, and how prototyping helps him sell UX to scientists.
Can you give us an idea of an average day for a UXer at NASA? What are you working on right now?
We usually work out of our offices in the heart of Silicon Valley, just off the San Francisco Bay in between Mountain View and Sunnyvale. We have a relatively small UX design team, so we find ourselves supporting a variety of roles.
Depending on the day, my time may be spent sketching ideas, conducting user research with my clients, designing UI screen mockups and prototypes, or managing a project schedule.
I’m currently working on a team designing a new web app aimed at consolidating and dramatically simplifying the process of sending scientific experiments to the International Space Station. There’s a significant amount of data that scientists and researchers need to provide when planning to send an experiment to space, and they often find themselves entering the same data multiple times across many disparate systems in the process.
By developing this tool, we’re looking to streamline that process so that experiments can get to space more quickly, which allows NASA to make more efficient use of the Space Station.
I’m also wrapping up work on designing an iOS app, the NASA Task Load Index, intended to measure subjective workload in various environments such as space flight, aircraft cockpits, military units, supervisory and process control, and simulations and laboratory tests.
Can you explain how user experience research and design works at NASA? How is the UX/HCI team organized, for example?
We’re essentially a design consultancy inside NASA that builds user-centered software for various teams across the agency. We follow an iterative process that involves user research, interaction design, and usability evaluation. Focusing on our users and their goals enables us to build the right tool that addresses both proper functionality and the interface.
Our HCI group is made up of two teams – Mission Assurance Systems (MAS), which I work in, and the Scheduling and Planning Interface for Exploration (SPIFe) team, which develops mission planning and scheduling software for astronaut crew members and ground controllers.
The MAS UX team primarily designs web applications to ensure mission assurance for two key organizations: ground crews supporting the International Space Station program and the Exploration Systems Development (ESD) program.
The ESD program is currently developing the crewed spacecraft Orion and the Space Launch System — NASA’s next-generation deep-space rocket — which, combined, will eventually send human missions beyond the Moon.
We also develop software for other groups at Ames, such as the Arc Jet Complex that performs heat shield testing for atmospheric re-entry, to improve their day-to-day operations.
The HCI Group at NASA Ames is a creative, smart, fun, and talented group of people and I feel very fortunate for the opportunity to be working alongside them.
What are some of the UX challenges you face when designing software for NASA? Is it different from designing software for any large enterprise? If so, how?
Our clients often work in very specialized areas, so acquiring the domain knowledge needed to understand our users, their workflows and processes, and the various use cases can be quite a challenge. Talking to our users in person to gain context and understand their specific problems is, we feel, always time well spent.
Also, some of our data integration projects that share data with different systems require us to work across different NASA organizations and contractors, often with clashing cultures and differing opinions. This requires a thoughtful approach to foster cooperation and ensure that everyone’s needs are being met and the right problems are being solved.
What role does prototyping play in your design and development process? What do you look for in a prototyping tool?
Prototyping plays a key role in our design and development process. Once we iterate our designs to a high-enough level of fidelity, we walk through our prototypes with our clients, click by click, to gauge the effectiveness of our designs and identify areas that need improvement.
Flexibility in importing our screen mockups and being able to quickly generate prototypes involving many screens are features that I value in a prototyping tool. For mobile platforms, I also value support for certain interactions and gestures as well as some basic animations.
We read that data and technical requirements are paramount at NASA (obviously!). Is it hard to integrate UX and design into that environment?
It can be hard to initially convince our clients, most of whom are scientists and engineers, of the value of UX and design in a highly technical environment. But after delivering solutions that address the needs of people instead of simply meeting a set of requirements, many of them now appreciate and seek out solutions from us because our products integrate design thinking.
We’ve seen how good design can guide proper attention to entering and processing complex technical data, making it more digestible. Our design outputs also help improve process efficiency and reduce human error. Designing for a technical user base also means that our tools must balance power and usability.
Additionally, because some of our software systems support critical safety processes, we can’t release them as beta software; it’s important that we deliver highly functional systems upon deployment.
When you think of NASA, you think of the cutting edge of science and tech – does that mean that, as a UXer, you’ve got to be coming up with new protocols for usability testing all the time?
Most of our projects are web applications, so we’ve mostly been making refinements to our usability testing protocol for this kind of product. As we design for new systems or updates to existing ones, we adjust these protocols to ensure new workflows are covered and that our tools meet the needs of the various stakeholders who will use our product.
We’ve begun to explore emerging technologies such as Internet of Things (IoT) and developed initial protocols to perform usability testing, but IoT, augmented reality, and other types of next-generation physical computing technologies are still relatively new areas that need to develop a bit more before they can be widely adopted.
What does the future hold for UX and space travel? Will NUIs, augmented reality or haptics play a big part in future space software and UX?
The future for UX and space travel is incredibly exciting and bright. We’re always looking at new technologies to see how they can solve problems and improve the user experience in our domain. We have a long-standing relationship with Carnegie Mellon and have recently expanded our partnerships with other universities; over the last few years we’ve collaboratively prototyped solutions using wearables, automatic speech recognition, the Internet of Things, and augmented reality to aid procedure execution and physical tool tracking during space flight, for example.
We’ve learned a lot from exploring these HCI technologies, and once these areas reach a level of maturity for use in NASA applications, we look forward to adopting and deploying them for future solutions.