Tag: Human-Machine Teaming

RAeS Article: AI, Autonomy, and Human-Machine Teaming Become Rising Forces of Change in Aviation

By Shawn Weil, Chief Growth Officer, Aptima, and Member, AIAA SciTech Forum Guiding Coalition; and Scott Fouse, Aerospace R&D Domain Lead and AIAA SciTech Forum Executive Producer, AIAA

Originally published in the November issue of RAeS AEROSPACE

 

Aviation is where artificial intelligence (AI) and human-machine teaming with autonomy have become mainstream, beginning as terrain-avoidance assistance for fighter jets in the 1980s. Today we see it in commercial jet operations, where the autopilot acts as an automatic co-pilot.

Now it’s time to expand AI-powered, machine-learning autonomy systems in aviation. There’s an opportunity to focus on training applications that help guide human-machine teaming toward improving aircraft performance and reducing pilot risk.

Engineers are refining the relationship between pilot and AI to address concerns about autonomy in an increasingly complex aviation system. The early models of autonomy in aviation have given way to a much more nuanced and complex view, in which tradeoffs can be made in different ways for different circumstances.

While that autonomy work is still in its infancy, the development of vehicles with autonomous system operations actually dates back to the 1940s in the automobile industry. A blind automotive engineer invented cruise control, an early autonomous system that paved the way for broader, more adaptive automation in automobiles. Today, fully autonomous taxi cabs are roaming the streets of San Francisco, aggregating data and updating machine learning capabilities as they transport fares across the city.

Aviation engineers are taking a cue from the automotive industry and exploring the lessons learned about the process of AI-powered autonomy. How should it be implemented in a plane? How should a pilot interface with it? How can a pilot know and anticipate what the autonomy is actually in control of and what the pilot is responsible for?

To address those questions, engineers have honed their system interfaces, redesigned workflows, and created other externalized cognition tools built on data collection and aggregation. Ultimately, engineers want to create autonomous systems that combine high levels of automation with high levels of human control. They want to make autonomy a full-fledged member of the flight team, one that is in some sense omniscient because it receives information from more sensors than its human partner could ever monitor.

AI won’t replace people in the cockpit. But it may, in fact, amplify their efforts.

Meanwhile, aviation engineers are trying out new ways of using autonomy. One example the U.S. Air Force (USAF) is exploring is the concept of the automated wingman. Here, a piloted aircraft might fly with two or three autonomous aircraft around it, anticipating the pilot’s next move. These drones are not just reacting to the pilot’s commands; they are reacting to the pilot’s intent and can anticipate the pilot’s actions, as the sketch below illustrates. The USAF is currently ramping up plans to use 1,000 autonomous drones to assist fighter jets, calling them Collaborative Combat Aircraft.
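The behaviors involved are still experimental, but the core idea of acting on inferred intent rather than waiting for explicit commands can be shown with a toy sketch. Everything below is hypothetical — the function names, thresholds, and maneuver mappings are invented for illustration and are not drawn from any USAF system: a wingman agent guesses the lead pilot’s likely next maneuver from recent heading and altitude trends and chooses a supporting action in advance.

```python
# Toy sketch only -- all names, thresholds, and mappings are hypothetical.
# An autonomous wingman infers the lead pilot's likely next maneuver from
# recent heading/altitude trends and picks a supporting action in advance,
# rather than waiting for an explicit command.

def infer_intent(recent_headings, recent_altitudes):
    """Crude intent estimate from trends in heading (degrees) and altitude (feet)."""
    turning = abs(recent_headings[-1] - recent_headings[0]) > 15
    climbing = recent_altitudes[-1] - recent_altitudes[0] > 500
    if turning and climbing:
        return "break and climb"
    if turning:
        return "turn"
    if climbing:
        return "climb"
    return "hold formation"

def wingman_action(intent):
    """Map the inferred intent to a supporting maneuver."""
    return {
        "break and climb": "cover the lead aircraft's low side",
        "turn": "widen formation on the outside of the turn",
        "climb": "scan ahead at the lead aircraft's new altitude",
        "hold formation": "maintain station",
    }[intent]

intent = infer_intent(recent_headings=[90, 95, 110], recent_altitudes=[20000, 20200, 20900])
print(intent, "->", wingman_action(intent))  # break and climb -> cover the lead aircraft's low side
```

In a real system the inference step would fuse many more signals (flight controls, sensor tracks, mission context), but the structure — estimate intent, then select a supporting behavior — is the same.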

Aviation engineers are beginning to understand that the whole science of human-autonomy interaction may need to be rethought from a cognitive systems point of view. Training should help pilots know more than just how to fly the aircraft; it should also teach them how to manage, understand, and anticipate the autonomous system.

Now the real work in AI-powered machine-learning autonomy for aviation begins. Learn more during the 2024 AIAA SciTech Forum, 8–12 January, Orlando, Florida. A number of panel discussions and technical papers presented throughout the forum will help the aviation industry move the autonomy and human-machine teaming work forward.

Op-Ed: Human-Machine Teaming Key to Aerospace Engineering’s Digital-Driven Future

By Scott Fouse
AIAA Domain Lead for Aerospace R&D; Lockheed Martin Space Systems (Retired); Fouse Consulting Services

Originally published in the November issue of RAeS AEROSPACE

Machines that can surpass human intelligence? Today’s artificial intelligence (AI) systems promise to outmatch certain aspects of human brainpower by developing thinking skills of their own. From digital twins to digital threads, aerospace systems are increasingly complex and AI-dependent. How do we find systems engineers (SEs) equipped to manage this complexity – while ensuring our growing reliance on AI doesn’t replace us? Tapping into the full potential of the data revolution for aerospace systems requires us to leverage the best of machine learning models and human ingenuity by harnessing the strengths of each to achieve what neither can do alone.

The world has changed dramatically since I began working in AI and human decision support in 1984. The computational horsepower and the world’s access to data have been game changers – and will no doubt lead to revolutionary breakthroughs in the next 5 to 10 years. While some elements of machine perception have exceeded human performance, one area where machines lag is recognizing and managing context – an area in which humans excel.

Great SEs have been exposed to multiple kinds of systems and bring all that experience to bear when tackling a new problem. How do we tap into SEs who have the right blend of experience in today’s fast-evolving AI environment? Achieving this needed expertise in future aerospace systems will require human-machine teaming (HMT). The new AIAA Transformative System Engineering Task Force is looking at how we accelerate the skillsets of this new breed of engineers. This will be the topic of many sessions during the 2023 AIAA SciTech Forum in National Harbor, Maryland, 23–27 January 2023.

Successful collaboration between humans and intelligent machines depends largely on trust. But as you extend human trust to machines, you begin to anthropomorphize them, ascribing other human-like capabilities that aren’t there. It’s better to view AI not as a human, but rather as a different kind of contributor with unique characteristics.

Additionally, as the aerospace community embraces AI, we must ensure we don’t lose the art of design. Instead, we must leverage AI to amplify our ability to design. Back in 1998, I worked with a car manufacturer that had embraced digital design. After 10 years of relying on their automated design tools, they had lost the art of design, meaning they could only produce designs that the tools allowed. This carmaker’s experience serves as a wake-up call for the aerospace community to avoid similar mistakes on its own AI journey.

The future of HMT is about changing both the way we engineer systems and the way we operate them. AI in the cockpit is one of the factors that makes the systems engineering process so complex. Making AI systems more transparent, perhaps through explanation, will allow SEs to train alongside these intelligent systems. In the case of aircraft (or spacecraft) pilots, they’ll develop instincts about how to work more effectively with this kind of intelligent support.

As we get more effective at developing digital system models, we’ll become better at modeling and characterizing risk. We already are seeing promising potential with digital twins and digital threads, which allow us to recognize potential design issues earlier in the process.

People are now looking at designing a digital twin of a human-machine team. Instead of only monitoring sensors on an aircraft, for instance, the digital twin also would model a pilot’s workload and performance. If the digital twin sees that the pilot is experiencing overload, some tasks could be taken off the pilot’s plate, with the AI acting as an intelligent co-pilot.
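As a rough illustration of that idea, here is a minimal sketch of a digital twin for a human-machine team. All class names, task costs, and the overload threshold are hypothetical, invented for this example: the twin sums modeled workload across the pilot’s tasks and, when the total exceeds an assumed limit, hands the cheapest offloadable tasks to the AI co-pilot.

```python
# Minimal sketch (hypothetical names throughout): a digital twin that tracks
# estimated pilot workload alongside aircraft tasks and hands tasks to an
# AI co-pilot when the pilot appears overloaded.

from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    workload_cost: float      # estimated contribution to pilot workload (0-1 scale)
    offloadable: bool         # whether the AI co-pilot is allowed to take it

@dataclass
class HumanMachineTeamTwin:
    workload_limit: float = 0.8          # assumed overload threshold
    pilot_tasks: list = field(default_factory=list)
    ai_tasks: list = field(default_factory=list)

    def estimated_workload(self) -> float:
        # A real twin would fuse physiological and performance sensors;
        # here it is simply the sum of modeled task costs.
        return sum(t.workload_cost for t in self.pilot_tasks)

    def rebalance(self) -> None:
        # Offload the cheapest eligible tasks until workload drops below the limit.
        for task in sorted(self.pilot_tasks, key=lambda t: t.workload_cost):
            if self.estimated_workload() <= self.workload_limit:
                break
            if task.offloadable:
                self.pilot_tasks.remove(task)
                self.ai_tasks.append(task)

twin = HumanMachineTeamTwin()
twin.pilot_tasks = [
    Task("fly manual approach", 0.5, offloadable=False),
    Task("monitor fuel crossfeed", 0.2, offloadable=True),
    Task("manage radio frequencies", 0.3, offloadable=True),
]
twin.rebalance()
print([t.name for t in twin.ai_tasks])   # tasks the AI co-pilot has picked up
```

The interesting design questions sit outside this toy loop: how workload is actually estimated, which tasks are safe to delegate, and how the handoff is communicated back to the pilot.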

In 20 years, this will be the future of aerospace operations: leveraging human-machine teaming to accelerate design and engineering processes from conception to operational capability, resulting in lower costs for ever-more complex systems.

To learn more, attend the 2023 AIAA SciTech Forum, 23–27 January 2023, in National Harbor, Maryland, and online. Digital Day on 26 January will feature a keynote address from the Acubed team on their work in digital twins and digital engineering, followed by multiple panels tackling topics such as complex adaptive systems engineering and human-machine teaming.

About the Author
As AIAA’s Aerospace R&D Domain Lead, Scott Fouse helps ensure that the U.S. aerospace community focuses on key technologies that will affect future generations of aerospace platforms. His four decades in the aerospace industry have predominantly focused on how artificial intelligence (AI) applies to military decision making. Fouse has led three R&D organizations, including directing both the Advanced Technology Labs and the Advanced Technology Center for Lockheed Martin. Today, he serves as principal of Fouse Consulting Services, focused on helping companies identify and create technology-enabled futures.

Human-Machine Teams Need a Little Trust to Work

Panelists: Moderator Bill Casebeer, senior research area manager, Human Systems and Autonomy, Lockheed Martin Advanced Technology Laboratories; Julia Badger, Robonaut project manager, Autonomous Spacecraft Management Projects, NASA’s Johnson Space Center; Eileen Liu, research scientist, Human Systems and Autonomy, Lockheed Martin Advanced Technology Laboratories; Matthias Scheutz, director, Human-Robot Interaction Lab, Tufts University; Victoria Coleman, chief technology officer, Wikimedia Foundation; Michael Casale, chief science officer, STRIVR

By Michele McDonald, AIAA Communications Manager

Work in machine intelligence crosses disciplines such as neuroscience, cognitive science, cognitive architectures, theory of mind, user experience design, human behavior modeling, systems engineering, and explainable artificial intelligence. Across that breadth, human-machine teams hold promise for helping with space exploration, training, and boosting human performance.

The thorny issue of trust is at the heart of human-machine teams, panelists said Jan. 11 during the “Human-Machine Teaming” session at the 2018 AIAA SciTech Forum in Kissimmee, Florida.

Our own expectations of what machines can do, the roles machines play on a team and trust in them — too much or not enough — can get in the way of teamwork, panelists said.

Trust doesn’t come automatically, said Victoria Coleman, chief technology officer of the Wikimedia Foundation.

“In reality, there is no magic bullet,” she said.

Coleman said repetition helps build trust, as does insight into how machines arrive at decisions.

“Trust is something you establish … then it’s a cycle,” she said.

But machine intelligence is already finding a home at NASA and in sports.

NASA is using smart machines that will work with crew members as the agency plans space stations for future missions, said Julia Badger, Robonaut project manager with Autonomous Spacecraft Management Projects at NASA’s Johnson Space Center. These complex space stations will sometimes work with crew members and at other times operate on their own to solve problems, such as leaks, and maintain the station, she said.

Virtual reality and immersion are helping large companies train employees, said Michael Casale, chief science officer with STRIVR, a VR training company. Top-level athletes use the technology to help them improve their performance, he said.

The panelists said technical advancements are needed for humans and machines to become effective teammates. For example, humans and machines don’t speak the same language, and nuance is difficult for machines to process, explained Eileen Liu, a research scientist with Human Systems and Autonomy at Lockheed Martin Advanced Technology Laboratories.

“I might say something, but I might mean something else,” she said.

Liu said humans are good at thinking creatively and solving problems on the fly but that we fail at repetitive tasks because our minds wander. She said that’s where machines excel and can help. But first, Liu said, they need to figure us out and understand how moods affect attention spans. Sensors may help machines get a handle on our state of mind, she said.

Humans and machines need to share mental models to become effective teammates, said Matthias Scheutz, director of the Human-Robot Interaction Lab at Tufts University. But, he said, we’re not there yet. Hammers and drills are tools, not teammates, and current technology could be described the same way, Scheutz said.

He said we expect that if a machine can pick up a square, it can pick up another shape, but that’s simply not the case.

“The kind of quick inference we expect of people, we will expect of machines,” Scheutz said.

People want robots to be adaptive and better than they currently are, Badger said.

“My (son) was 9 months old, and he got better at grasping things than my robot did at the time. My robot should be better because he’s 8 years old,” she said, laughing.
