This paper details the development and testing of a new heads-up display (HUD) for use during extravehicular activity (EVA) maintenance tasks. Existing research indicates that augmented reality can effectively guide users in autonomously completing terrestrial tasks (such as occupational training or maintenance) and can enable collaborative work between geographically isolated users. As human exploration of space extends beyond low Earth orbit into deep space, communication latency between the spacecraft and ground support may require astronauts to make critical repairs and decisions independently of Mission Control. Given the complicated procedural nature of EVAs and the need for astronauts to maintain high situational awareness (SA) throughout a task, incorporating a HUD into future EVA systems can enable greater astronaut autonomy. One consequence of greater autonomy is that the SA workload shifts from mission controllers onto the astronauts themselves. While astronauts must maintain sufficient SA of their suit systems and health, they must do so without significantly increasing their workload or sacrificing task performance. To meet this goal of increasing astronaut SA without increasing workload or decreasing task performance, a HUD was developed on the Microsoft HoloLens platform and refined based on feedback from EVA operators and former astronauts. The HUD uses both visual and auditory interfaces to present suit parameters to the user and allows user customization of the visual layout. During testing, a set of EVA procedural tasks was presented to the user in the virtual space, incorporating animations and pictorial representations of the task steps. The augmented reality system guided the user with detailed instructions for performing maintenance during a mock ISS EVA. Results suggest that while using the HUD platform increases task completion time, it