Autonomy in Aviation: Designing Safer Systems with AI/ML and Simulation

Along with electrification, autonomy is a major trend across industries ranging from manufacturing and industrial equipment to automotive and aviation. In fact, the global autonomous aircraft market was estimated at $4.56 billion in 2019 and is projected to exceed $16 billion by 2027. Accordingly, advanced air mobility (AAM) companies are developing autonomous aircraft designs to move people and cargo more efficiently. Though the terms are often used interchangeably, urban air mobility (UAM) and regional air mobility (RAM) are subsets of AAM that focus on lower-altitude air transport within urban areas and between nearby cities and regions, respectively.

Unsurprisingly, building safe, autonomous AAM systems requires complex training, engineering, development, and design. Artificial intelligence/machine learning (AI/ML) lends significant assistance in these areas by helping engineers and designers develop the critical perception and decision-making functions that are fundamental to autonomy. However, challenges and concerns arise because AI/ML alone cannot supply the realistic, representative situations needed to train and validate these autonomous functions.

Simulation offers incredible value in helping to build confidence in autonomous AAM systems, ensure their reliability, and validate their safety. In early stages, simulation provides critical insight, predictive accuracy, and thorough analyses to inform training and development. In later stages, simulation provides realistic environments and scenarios to validate and test these functions. By integrating Ansys’ solutions, AAM companies can adopt a seamless, end-to-end workflow to optimize training and validation using simulation and digital mission engineering tools for safety analysis, embedded software, sensor testing, and more.

Sharpening Perception and Decision-making in Autonomy

There are both new and classical applications of autonomy in aviation. New applications center around next-generation AAM transport, which consists of vehicles that are typically small and highly automated and that carry passengers or cargo at lower altitudes. Generally, these systems, particularly UAM systems, rely on established platforms such as helicopters or on emerging technologies such as electric vertical takeoff and landing (eVTOL) aircraft.

In contrast, classical applications are implemented in existing systems. For example, commercial aircraft manufacturers might incorporate autonomy to increase situational awareness for pilots, alleviate pilots’ responsibilities and workload, or optimize the efficiency of various flight phases. Similarly, military aircraft providers may consider autonomy to assist pilots in handling unexpected changes during a mission, such as new targets or degraded conditions.

Typically, an autonomy application must comprise three main capabilities, which influence each other as follows:

  • Perception: to observe the environment, including oncoming obstacles (for example, other aircraft, weather-related challenges, or other hindrances to the flight path). This is most commonly achieved through sensors such as cameras, lidars, and radars mounted on the aircraft.
  • Decision-making: to determine the best and safest flight maneuvers based on the perception and detection of such obstacles.
  • Actuation: to perform the desired flight maneuvers established above.

In effect, an autonomous system needs to establish reliable perception and decision-making capabilities before it can successfully execute actuation. Simulation brings significant value to both of these areas. For perception training, physics-based simulation provides raw sensor data and ground truth information, which eliminates the need for complex image processing, reduces training time, and increases accuracy. For decision-making training, simulation offers sensitivity, robustness, and reliability analyses, which help to strengthen flight performance, the safety of flight maneuvers, and collision avoidance.
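
To make these capabilities concrete, the short sketch below wires a perceive-decide-act loop to a simulated environment. It is only a minimal illustration: the environment, class names, and threshold values are assumptions for this example and do not represent any Ansys product API. In a real workflow, the perception step would be a trained model consuming physics-based sensor data, with ground truth supplied by the simulator.

```python
# Minimal perceive-decide-act loop driven by a simulated environment.
# All names (SimulatedWorld, SensorFrame, thresholds) are illustrative
# assumptions; they do not correspond to any Ansys product API.
from dataclasses import dataclass
import random

@dataclass
class SensorFrame:
    """One simulated sensor return: range and bearing to the nearest obstacle."""
    obstacle_range_m: float
    obstacle_bearing_deg: float

class SimulatedWorld:
    """Toy stand-in for a physics-based simulator that also exposes ground truth."""
    def __init__(self, seed: int = 0):
        self.rng = random.Random(seed)

    def sense(self) -> SensorFrame:
        # A real simulator would render camera/lidar/radar returns; here we
        # simply draw a plausible obstacle range and bearing.
        return SensorFrame(
            obstacle_range_m=self.rng.uniform(50.0, 2000.0),
            obstacle_bearing_deg=self.rng.uniform(-45.0, 45.0),
        )

def perceive(frame: SensorFrame) -> bool:
    """Perception: flag obstacles closer than a (hypothetical) 300 m threshold."""
    return frame.obstacle_range_m < 300.0

def decide(obstacle_ahead: bool, frame: SensorFrame) -> str:
    """Decision-making: choose the safest maneuver for the perceived situation."""
    if not obstacle_ahead:
        return "hold_course"
    # Turn away from the side the obstacle is on.
    return "turn_right" if frame.obstacle_bearing_deg < 0 else "turn_left"

def actuate(command: str) -> None:
    """Actuation: hand the chosen maneuver to the flight controller (printed here)."""
    print(f"executing maneuver: {command}")

world = SimulatedWorld()
for _ in range(5):                 # five control cycles of the loop
    frame = world.sense()          # perception input from the simulator
    threat = perceive(frame)
    actuate(decide(threat, frame))
```

Because the simulator knows the true state of its own world, every synthetic frame comes with ground-truth labels for free, which is precisely what shortens perception training.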

Sensitivity analysis for eVTOL
Through sensitivity analyses in Ansys optiSLang, an oscillation issue was identified (left) and addressed to achieve smoother flight (right).

Simulation also improves reinforcement learning (RL), which is an AI/ML training technique that enables a model to learn on its own by trial and error. In other words, unlike supervised or unsupervised learning, RL enables the AI/ML agent to learn interactively through feedback from its environment, including its own actions and experiences within that environment. For this reason, simulation greatly supports RL training by making it possible to create diverse and near-countless simulated environments, which in turn improves the quality of perception and decision-making training.
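
As a minimal illustration of this trial-and-error idea, the sketch below trains a tabular Q-learning agent in a toy simulated "corridor": the agent advances one cell per step and learns, purely from environment feedback, to climb just before an obstacle. The environment, rewards, and hyperparameters are illustrative assumptions chosen for brevity, not a representation of a production RL setup.

```python
# Tabular Q-learning in a toy "corridor" environment: the agent advances one
# cell per step and chooses an altitude (0 = fly low, 1 = fly high) to avoid an
# obstacle at a known cell. Everything here is an illustrative assumption; a
# real setup would use a physics-based simulator instead of step().
import random

LENGTH, OBSTACLE_CELL = 10, 6        # corridor length; obstacle blocks (cell 6, low)
ACTIONS = (0, 1)                     # 0 = fly low, 1 = fly high
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.2

def step(cell, action):
    """Advance one cell; penalize collisions, reward reaching the end safely."""
    nxt = cell + 1
    if nxt == OBSTACLE_CELL and action == 0:
        return nxt, -10.0, True              # collision: large penalty, episode ends
    reward = -0.5 if action == 1 else 0.0    # climbing costs a little energy
    if nxt == LENGTH - 1:
        return nxt, reward + 10.0, True      # reached the end of the corridor
    return nxt, reward, False

q = {(c, a): 0.0 for c in range(LENGTH) for a in ACTIONS}
rng = random.Random(0)

for episode in range(2000):
    cell, done = 0, False
    while not done:
        # Epsilon-greedy exploration: mostly exploit, sometimes try the other action.
        if rng.random() < EPSILON:
            action = rng.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(cell, a)])
        nxt, reward, done = step(cell, action)
        best_next = 0.0 if done else max(q[(nxt, a)] for a in ACTIONS)
        # Q-learning update from environment feedback (reward + discounted value).
        q[(cell, action)] += ALPHA * (reward + GAMMA * best_next - q[(cell, action)])
        cell = nxt

policy = {c: max(ACTIONS, key=lambda a: q[(c, a)]) for c in range(LENGTH - 1)}
print(policy)  # typically learns to climb just before the obstacle cell, stay low elsewhere
```

In practice, the toy step() function would be replaced by a physics-based simulation, which can generate the diverse and near-countless training environments described above.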

Autonomous Applications in Use

Ansys provides a complete model-based systems engineering (MBSE) workflow to assist in the training and validation of autonomous functions, spanning simulation, system architecture, sensor testing, safety assessment, and operational design domain (ODD) definition, as well as scenario creation, variation, and results analytics.

First, let’s get to know the key tools used in this workflow:

  • Ansys medini analyze: a model-based, integrated tool that supports safety analysis for electrical, electronic, and software-controlled systems. It allows for the consistent and efficient application of analysis methods tailored to industry standards such as ISO 26262, IEC 61508, ARP 4761, ISO 21448, and MIL-STD-882E.
  • Ansys optiSLang: a process integration and design optimization tool that solves challenges posed by computer-aided engineering (CAE)-based robust design optimization (RDO).
  • Ansys Systems Tool Kit (STK): a system-of-systems simulator that enables you to model complex systems inside a realistic and time-dynamic 3D simulation, including high-resolution terrain, imagery, radio frequency (RF) environments, and more.
  • Ansys AVxcelerate Sensors: a sensor testing and validation solution that enables you to use realistic scenarios to investigate radar, lidar, and camera sensor perception in a model-in-the-loop (MIL), software-in-the-loop (SIL), or hardware-in-the-loop (HIL) context.
  • Ansys SCADE Suite: a model-based development environment for reliable embedded software, which provides linkage to requirements management, model-based design, verification, certified code generation capabilities, and more.

Now, let’s explore a sample implementation of this workflow in six steps:

  1. In medini analyze, define the system architecture, environment, ODD (the conditions in which the function will operate, including weather-related phenomena), and a set of functional scenarios to traverse the ODD.
  2. Using optiSLang, refine the functional scenarios into logical (parameterized) scenarios.
  3. Translate the logical scenarios into concrete scenarios using a design of experiments (DOE), in which the parameter values follow probability distributions that correspond to real-world conditions (see the sketch after this list).
  4. Use concrete scenarios to train the autonomy function based on a combination of simulations running in STK, modeling the entire system of systems in action; AVxcelerate Sensors to test and validate sensors; SCADE for critical embedded software; and/or an external AI/ML training tool such as YOLO or OpenAI.
  5. Returning to optiSLang, assess the resulting neural network through sensitivity and robustness analysis.
  6. Lastly, integrate the full autonomy function (perception, decision-making, and actuation) and assess the simulation results using optiSLang for a reliability analysis based on adaptive sampling to efficiently explore the design space.
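
As a hedged illustration of steps 2 and 3, the sketch below turns a logical (parameterized) scenario into concrete scenarios by sampling each parameter from an assumed probability distribution. The parameter names, ranges, and distributions are placeholders chosen for this example; they are not outputs of optiSLang or any other Ansys tool.

```python
# Generate concrete scenarios from a logical scenario: each parameter of the
# logical scenario gets a probability distribution, and a design of experiments
# is created by drawing samples from those distributions. Parameter names,
# ranges, and distributions below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(seed=42)
N_SCENARIOS = 200

# Logical scenario: "approach a vertiport with one intruder aircraft nearby."
concrete_scenarios = {
    # Wind speed in m/s: most days are calm, a few are gusty (right-skewed).
    "wind_speed_mps": rng.gamma(shape=2.0, scale=3.0, size=N_SCENARIOS),
    # Visibility in km: clipped normal centered on good visibility.
    "visibility_km": np.clip(rng.normal(8.0, 3.0, N_SCENARIOS), 0.5, 10.0),
    # Intruder initial range in m: uniform over the sensor's nominal envelope.
    "intruder_range_m": rng.uniform(200.0, 2000.0, N_SCENARIOS),
    # Intruder bearing in degrees relative to own heading.
    "intruder_bearing_deg": rng.uniform(-60.0, 60.0, N_SCENARIOS),
    # Time of day in hours, to vary lighting for camera perception.
    "time_of_day_h": rng.uniform(6.0, 20.0, N_SCENARIOS),
}

# Each index i is one concrete scenario ready to hand to the simulator.
for i in range(3):  # show the first three concrete scenarios
    scenario = {name: round(float(values[i]), 2)
                for name, values in concrete_scenarios.items()}
    print(f"scenario {i}: {scenario}")
```

Each sampled set of values becomes one concrete scenario to execute in simulation, and the same distributions later support the robustness and reliability analyses in steps 5 and 6.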

By integrating some — or all — portions of the sample Ansys workflow outlined above, engineers and designers in the aviation industry are developing and validating safer and more reliable autonomous systems.

Train and validate your autonomy function using system-of-systems simulation in Ansys STK.

In one case study, an aircraft manufacturer is integrating Ansys solutions to ensure collision avoidance for unmanned aerial vehicles (UAVs). An automated eVTOL must fly to a waypoint while avoiding collisions with obstacles. Already confident in the aircraft's perception capabilities, this team is most concerned with the eVTOL's decision-making capability for determining the best flight path.

In another example, an aviation unit is adopting a similar Ansys workflow to conduct formation flying. A fleet of four automated eVTOLs must fly in formation following a piloted eVTOL. This example is concerned with both perception (detecting the lead eVTOL and other vehicles) and decision-making (following the lead eVTOL while avoiding any collisions).

Get Ready for Next-Gen AAM

Ansys’ simulation solutions enable customers to safely train, test, and validate critical AAM applications, building confidence around AI/ML-assisted software and autonomy in embedded systems. Further, by combining Ansys’ high-fidelity simulation and digital mission engineering tools, customers can develop and validate these systems within a realistic and time-dynamic 3D environment.
