How NASA’s new AR tech will take astronauts to the next frontier
In an era when space exploration is venturing into uncharted territories, NASA engineers are pushing the boundaries of innovation by blending the virtual and physical worlds to support astronauts in their extraterrestrial missions.
Sarosh Nandwani, Human Systems Engineer at NASA’s Johnson Space Center, and Matt Noyes, Mechanical and Software Engineer, recently unveiled NASA’s groundbreaking work on augmented reality (AR) technology during Breakthrough Sessions 2024. Their presentation, “Joint AR: Discover New Extravehicular Activities,” shed light on how AR is reshaping the future of space exploration.
As NASA sets its sights on returning to the moon with the Artemis missions and preparing for an eventual mission to Mars, the stakes are high. Navigating through unfamiliar terrain in low-gravity environments presents unique challenges, and AR technology is poised to become a critical tool to help astronauts operate autonomously, manage their resources efficiently, and make mission-critical decisions without relying on real-time communication with Earth.
The Future of Spacewalks: AR-Enhanced Spacesuits
NASA’s Joint AR system is built to support astronauts during extravehicular activities (EVAs), commonly referred to as spacewalks. Nandwani and Noyes explained that the project is specifically designed to enhance astronaut navigation and decision-making on the lunar surface, an environment devoid of the familiar markers that aid navigation on Earth.
“Imagine navigating in a world where there are no roads, no signs, no distinct landmarks—just endless stretches of cratered terrain,” said Nandwani. “AR will give astronauts live, real-time information overlaid onto their field of view, allowing them to make decisions on the fly, without having to constantly check maps or data on a handheld device.”
According to Nandwani, current navigation systems for astronauts rely heavily on visual maps that take up valuable time and cognitive resources. In NASA’s field testing, astronauts spent up to 20% of their time and energy simply trying to figure out where they were and where they needed to go. With AR, this time could be cut drastically. “AR allows astronauts to see their precise location, breadcrumb trails of their path, and even straight-line traverses between points. All of this is visible in their helmet display as they move,” she added.
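To make the idea concrete, the breadcrumb trails and straight-line traverses Nandwani describes can be sketched in a few lines of Python. This is purely an illustrative assumption of how such a feature might work; the spacing threshold, coordinate frame, and function names are invented here and are not NASA’s implementation.

```python
import math

def record_breadcrumb(trail, position, min_spacing=5.0):
    """Append a position (x, y in metres) to the trail, but only if it is
    at least min_spacing metres from the last recorded breadcrumb."""
    if not trail or math.dist(trail[-1], position) >= min_spacing:
        trail.append(position)
    return trail

def straight_line_traverse(current, waypoint):
    """Return distance (metres) and bearing (degrees, 0 = +y axis,
    clockwise) from the current position to a waypoint."""
    dx = waypoint[0] - current[0]
    dy = waypoint[1] - current[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    return distance, bearing

# Walk a short path, dropping breadcrumbs no closer than 5 m apart.
trail = []
for pos in [(0, 0), (2, 0), (6, 0), (6, 7)]:
    record_breadcrumb(trail, pos)

# Straight-line guidance from the last breadcrumb to a waypoint 100 m north.
dist, brg = straight_line_traverse(trail[-1], (6, 107))
```

A helmet display would render the trail and the computed bearing as overlays in the astronaut’s field of view, rather than printing numbers, but the underlying bookkeeping is this simple.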
Innovative Solutions for a Featureless Environment
In their discussion, Nandwani and Noyes outlined the key components of the AR system they are developing. The project is built to address the specific challenges astronauts face when operating in space environments, particularly on the moon and Mars. “One of the primary functions of our system is navigation,” Noyes explained. “But it’s more than just finding your way from point A to point B. It’s about helping astronauts manage resources like oxygen levels, health conditions, and timelines in real-time.”
NASA’s Joint AR project also focuses on helping astronauts operate more autonomously, a necessity for future Mars missions, where one-way communication delays can stretch to roughly 20 minutes. “The further we go from Earth, the more autonomous astronauts will need to become. AR helps bridge that gap,” said Noyes.
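The delay Noyes refers to is simply light travel time. A quick back-of-the-envelope calculation, using the well-known approximate Earth–Mars distances at closest and farthest approach, shows why real-time guidance from mission control is impossible:

```python
SPEED_OF_LIGHT_KM_S = 299_792.458  # km/s

def one_way_delay_minutes(distance_km):
    """One-way light travel time, in minutes, over a given distance."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60.0

closest_km = 54.6e6    # approx. Earth-Mars distance near opposition
farthest_km = 401e6    # approx. Earth-Mars distance near conjunction

print(f"closest:  {one_way_delay_minutes(closest_km):.1f} min one-way")
print(f"farthest: {one_way_delay_minutes(farthest_km):.1f} min one-way")
```

Even in the best case, a question radioed to Houston takes several minutes each way; at the far end of the range, a round trip exceeds forty minutes, so an astronaut facing a time-critical decision must be able to act alone.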
In addition to navigation, the AR system integrates several advanced features designed to improve safety and efficiency. One such feature is the “pin drop” function, which allows astronauts to mark specific points of interest—such as geological features or scientific samples—on their display. This data can then be used to revisit locations or share findings with mission control back on Earth.
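A pin-drop feature of the kind described amounts to tagging a position with a label, a category, and a timestamp so it can be revisited or relayed later. The sketch below is a hypothetical illustration of that data model; the class names, categories, and sample pins are assumptions, not part of NASA’s system.

```python
from dataclasses import dataclass, field
import time

@dataclass
class Pin:
    """A marked point of interest on the surface."""
    label: str
    position: tuple               # (x, y) in a local surface frame, metres
    category: str = "general"     # e.g. "geology", "sample", "hazard"
    timestamp: float = field(default_factory=time.time)

class PinBoard:
    """Collects dropped pins so they can be revisited or shared with mission control."""
    def __init__(self):
        self.pins = []

    def drop(self, label, position, category="general"):
        pin = Pin(label, position, category)
        self.pins.append(pin)
        return pin

    def by_category(self, category):
        return [p for p in self.pins if p.category == category]

board = PinBoard()
board.drop("basalt outcrop", (12.0, -3.5), category="geology")
board.drop("regolith sample 01", (14.2, -2.1), category="sample")
```

Filtering by category is what lets an astronaut, or mission control, pull up only the geology stops or only the collected samples from a traverse.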
Pioneering New Approaches in Software Development
One of the most significant challenges the team faced was the need to develop a completely new software engine that could run reliably in space environments. As Noyes explained, NASA initially explored commercial engines but quickly found they lacked the control and safety features required for space applications. “We needed something highly reliable, low-power, and capable of handling safety-critical tasks like navigation without crashing or using up too many resources,” said Noyes.
To solve this, Noyes and his team built a custom engine from the ground up, ensuring it met NASA’s strict performance standards. “Developing the Space Technology Application Renderer (STAR) engine gave us the control we needed over memory management and performance while keeping things simple enough to work on constrained hardware,” Noyes added.
NASA’s engineers also used virtual reality (VR) environments to simulate lunar conditions and test the AR system before deploying it in the field. By recreating lunar landscapes with the HTC Vive VR system, they were able to prototype the AR experience and fine-tune the user interface in a low-risk setting, iterating quickly on designs that would otherwise have had to wait for field tests.
Collaboration and Continuous Innovation
Nandwani and Noyes emphasised that innovation thrives when experts from diverse disciplines work together. The team behind the Joint AR system included hardware engineers, software developers, user experience designers, and human factors experts—all working side by side to build a solution that would integrate seamlessly with the astronauts’ spacesuits.
“The success of our project comes down to collaboration,” said Nandwani. “Having everyone in the same room—literally the same lab—allowed us to rapidly prototype, test, and iterate on ideas in a way that would’ve been impossible if we were siloed in different departments.”
Their collaborative environment enabled the team to challenge assumptions and push boundaries, leading to breakthroughs in the design and usability of the AR system. “We didn’t just settle for what’s been done before,” Noyes added. “We believed in the vision of where this technology could go, and that’s what drove us to keep innovating—even when the existing tech wasn’t quite ready for our vision.”
Looking Ahead: AR’s Role in Future Missions
NASA’s Joint AR system is still in development, but field testing has already shown promising results. Astronauts using the AR-enhanced spacesuits were able to navigate more efficiently, complete tasks more quickly, and spend more of their time on scientific exploration rather than basic operational tasks. “It’s not just about getting from point A to point B. It’s about unlocking the full potential of astronauts to do what they do best—explore,” said Nandwani.
As NASA prepares for future Artemis missions to the moon and eventual human exploration of Mars, the AR system will play a crucial role in ensuring astronauts can work safely and efficiently in increasingly autonomous and challenging environments.
For Nandwani, the work represents a larger vision for space exploration: “We’re building the future today. AR isn’t just a tool—it’s the foundation of how we will explore new worlds.”