Integrating unmanned aircraft systems into manned operations is a challenging balancing act: making the changes needed to accommodate the unmanned systems while minimizing the impact of those changes on daily operations. This is nowhere more apparent than on the flight deck of a Navy aircraft carrier, where daily operations and mission events are like a carefully choreographed dance that has evolved and been refined over the last hundred years, a dance in which a breakdown in communication can result in a slowdown of operations at best and, at worst, catastrophic damage to equipment and/or loss of life. As the Navy moves toward integrating unmanned operations into manned ones, it is particularly important to develop technology that allows the aircraft directors on deck to communicate with unmanned aircraft in as near to the same manner as they do with manned aircraft. This means using the same gesture-based lexicon with which directors communicate with pilots, and without adding more personnel on deck. In response to this need, an effort was made to develop an inertial measurement-based gesture recognition hardware/software solution. The system comprises standard signalman wands, modified by embedding an inertial measurement unit in the shaft, and machine learning-based classification algorithms that use the inertial data as input to establish the communication link between director and unmanned aircraft. The system was evaluated by four current U.S. Navy aircraft directors through a series of evaluation tasks intended to emulate basic carrier deck mission events. Quantitative assessments and the directors' opinions of the system indicated that it enabled communication between them and the unmanned aircraft to the extent that the tasks could be accomplished in a timely manner and with little change to how they guide aircraft.
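The abstract does not specify the classifier used, but the general pipeline it describes (windowed inertial data in, gesture label out) can be sketched as follows. This is a minimal illustration under assumed details: the gesture names, the six-channel IMU window layout, the synthetic signals, and the nearest-centroid classifier are all hypothetical stand-ins, not the paper's actual lexicon, data, or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def synth_window(gesture, n=100):
    """Hypothetical 6-channel IMU window (3 accel + 3 gyro) for one gesture.

    Real input would be streamed from the inertial measurement unit
    embedded in the wand shaft; here two gestures are faked as slow vs.
    fast oscillations plus noise.
    """
    t = np.linspace(0.0, 1.0, n)
    freq = 1.0 if gesture == "come_ahead" else 4.0  # "stop" oscillates faster
    sig = np.sin(2.0 * np.pi * freq * t)
    return np.stack([sig + 0.1 * rng.standard_normal(n) for _ in range(6)])

def features(window):
    """Per-channel summary statistics as a simple feature vector."""
    return np.concatenate([
        window.mean(axis=1),                       # mean per channel
        window.std(axis=1),                        # energy per channel
        np.abs(np.diff(window, axis=1)).mean(axis=1),  # roughness per channel
    ])

# Tiny training set and a nearest-centroid classifier (a deliberately
# simple stand-in for whatever ML model the actual system uses).
labels = ["come_ahead", "stop"]
centroids = {
    g: np.stack([features(synth_window(g)) for _ in range(20)]).mean(axis=0)
    for g in labels
}

def classify(window):
    f = features(window)
    return min(labels, key=lambda g: np.linalg.norm(f - centroids[g]))

print(classify(synth_window("stop")))
```

The point of the sketch is the shape of the pipeline, not the model: a fixed-length inertial window is reduced to features and mapped to one entry of the director's gesture lexicon, which is then relayed to the unmanned aircraft as a command.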