AI is starting to enter almost every aspect of our lives, automating all the tasks we don’t want to do. When it comes to transportation, driverless cars have been around for a long time and continue to improve; but what about going one step further – helping us fly? One company – Daedalean – is on a mission to do just that.
In February, Daedalean, in partnership with Intel, published one of a series of upcoming white papers detailing its certified AI system, and how the technology could advance toward fully autonomous flight across the Aerospace and Defense (A&D) industry in the future.
But to focus on its current technology, TechRadar Pro spoke with the company’s founder, CEO, and CTO, Dr. Luuk van Dijk, to find out more about its all-seeing AI and its aim to be the first certified for flight control applications in civil aviation.
Plane spotting
Daedalean’s inaugural AI product is called PilotEye, which Dr. van Dijk describes as a “visual situational awareness suite”; it uses hardware manufactured by the avionics firm Avidyne.
Currently it can detect airborne traffic, but the full version, when released – known today as VXS (“visual everything suite”) – will be able to detect all kinds of airborne dangers, “including birds and drones”, and will provide “GNSS-independent navigation and positioning, and landing guidance for helicopters, fixed-wing aircraft, and eVTOL.” As Dr. van Dijk describes it, “VXS is an AI copilot working as a pilot assistant.”
VXS is equipped with several multidirectional cameras and a unit that calculates information using Daedalean algorithms:
“The software is based on computer vision and machine learning (“AI”) and works on the plane without connection to any ground infrastructure. The algorithms analyze the video feed from each camera, [and] recognize and interpret what they “see”.”
These include: “the landscape below, to compare it with maps and provide the pilot’s coordinates and altitude…; the traffic in the environment – what intruders are flying where, and what is the distance to them; the runways or landing spots below, and the directions for a safe landing.”
Currently, VXS works most effectively under optimal viewing conditions, with Dr. van Dijk noting that while it can operate in moderately bad weather or reduced sunlight, it is not yet at full capability in those conditions.
However, Dr. van Dijk says that future developments “will enable the suit to work in any conditions, including complete loss of visibility.” This will be achieved by “adding night vision sensors, radar, and other sources of information.”
By parsing the input from the VXS cameras and other incoming sensors, the AI creates a “situational map of the environment”. This is what sets it apart from other avionic systems. The neural network analyzes these inputs and “gives answers to a question specified for it: for example, is there something interfering with the image, such as an airplane or a drone; or is there a runway below suitable for landing?”
Its job is to “process complex information with high uncertainty, such as an image with a landscape, objects on it, sky, clouds, and objects in the sky – and explain what is there, which objects, where they are, and at what distances.”
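To make the idea of a “situational map” concrete, here is a hypothetical sketch of the kind of data structure such a system might emit per frame. The field names and structure are purely illustrative assumptions, not Daedalean’s actual output format:

```python
from dataclasses import dataclass, field

# Illustrative sketch only -- field names are assumptions, not Daedalean's API.
@dataclass
class TrafficTrack:
    bearing_deg: float   # direction to the detected intruder
    distance_m: float    # estimated range
    kind: str            # e.g. "aircraft", "drone", "bird"

@dataclass
class SituationalMap:
    position: tuple                          # (lat, lon, alt_m) from visual positioning
    traffic: list = field(default_factory=list)       # detected intruders
    landing_spots: list = field(default_factory=list) # candidate runways/pads

# Example: one frame's map with a single detected aircraft.
m = SituationalMap(position=(47.45, 8.56, 650.0))
m.traffic.append(TrafficTrack(bearing_deg=230.0, distance_m=5500.0, kind="aircraft"))
```

Each downstream question the article mentions (“is there traffic?”, “is there a runway below?”) then becomes a query against this per-frame structure.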
But what exactly is the advantage of having an AI copilot over a human? Dr. van Dijk explained:
“Adding it to the cockpit will increase flight safety: the tested performance results show that it outperforms human vision.” Unlike a human, it can scan the entire sky at once, and can “identify a Cessna at a distance of 3 nautical miles when it is just a dot in the sky.”
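As a rough sanity check on why that target is “just a dot”, a back-of-the-envelope calculation shows how small such an aircraft appears. The wingspan and camera parameters below are illustrative assumptions, not figures from Daedalean:

```python
import math

# Assumptions (not from the article): a Cessna 172 wingspan of ~11 m,
# and a hypothetical camera with a 60-degree FOV across 4000 pixels.
WINGSPAN_M = 11.0
DISTANCE_M = 3 * 1852.0   # 3 nautical miles in metres
FOV_DEG = 60.0
SENSOR_PX = 4000

# Angular size of the wingspan as seen from the camera.
angular_size_deg = math.degrees(2 * math.atan(WINGSPAN_M / (2 * DISTANCE_M)))
# Apparent width in pixels under the assumed optics.
apparent_px = angular_size_deg * (SENSOR_PX / FOV_DEG)

print(f"angular size: {angular_size_deg:.3f} deg")
print(f"apparent width: {apparent_px:.1f} px")
```

Under these assumptions the aircraft spans only a handful of pixels – about a tenth of a degree – which is the regime where a human scanning the whole sky will plausibly miss it.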
Dr. van Dijk also made the point that is usually made in favor of automated systems in general: “unlike a person, it is never tired or distracted. Its purpose is to allow a pilot to concentrate on their mission.”
Training
Like all AI systems, the algorithm must undergo training. Dr. van Dijk described the process by which the complex statistical models used in AI are developed:
“This is done by… analyzing millions of similar images that have been annotated by people. Humans have a seemingly magical ability to figure out the answer to the question ‘is there a plane in this image’ just by looking at it – but a powerful statistical model can capture the statistical patterns in their decisions, finding millions of parameters that distinguish the images for which a person answered ‘yes’ from those for which they answered ‘no’. This process is called machine learning.”
Dr. van Dijk detailed the many layers involved in this training process, which ensure that the AI is sufficiently equipped to identify traffic with a high degree of accuracy.
The first step mentioned above – analysis of millions of images – is given special attention, with two teams of “specially trained data annotators” – one in Riga and one in Zurich – assigned to the task. Dr. van Dijk says that Daedalean “invested a lot in this because the process of how they work with the data is also subject to certification requirements.”
These annotated images are then used as training data fed to the neural network, which “analyzes each pixel in them and finds statistical dependencies between the states of each of the pixels” in order to determine whether a plane is there and, if so, its exact location within the image.
“There are millions (again, literally) of parameters it computes in its statistical equations, so no one can monitor it directly and influence what it does. This is the task of another special computer program (the training algorithm). It fiddles with the parameters, querying the neural network over and over again and comparing its answers to the answers humans gave (in the form of annotated images), until the NN’s answers reliably match the known answers.”
The next phase is testing, where the newly trained algorithm is fed a large number of images it has never seen before – also annotated by humans – and tasked again with making recognition decisions.
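The annotate-train-test cycle described above can be sketched in miniature. This is a toy single-layer classifier, not Daedalean’s actual system: the “images” are synthetic pixel vectors, and a simple brightness rule stands in for the human annotator’s yes/no answer.

```python
import numpy as np

# Toy sketch of the train-then-test cycle -- NOT Daedalean's system.
rng = np.random.default_rng(0)

def make_annotated_set(n, pixels=64):
    """Synthetic 'images' plus the 'human' yes/no annotation."""
    images = rng.normal(size=(n, pixels))
    labels = (images.mean(axis=1) > 0).astype(float)  # stand-in human rule
    return images, labels

def train(images, labels, epochs=300, lr=0.5):
    """The training algorithm 'fiddles with the parameters': it repeatedly
    queries the model and nudges weights toward the human answers."""
    weights = np.zeros(images.shape[1])
    bias = 0.0
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-(images @ weights + bias)))
        error = preds - labels
        weights -= lr * images.T @ error / len(labels)
        bias -= lr * error.mean()
    return weights, bias

def accuracy(weights, bias, images, labels):
    preds = (1 / (1 + np.exp(-(images @ weights + bias)))) > 0.5
    return (preds == labels).mean()

train_x, train_y = make_annotated_set(2000)
test_x, test_y = make_annotated_set(500)   # images never seen in training
w, b = train(train_x, train_y)
print(f"held-out accuracy: {accuracy(w, b, test_x, test_y):.2f}")
```

The key structural point matches the article: the model is scored only on annotated images it never saw during training, and a poor score sends it back for another development cycle.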
Once it passes this stage, it is installed on “data collection planes or drones (if we are testing lighter versions of the hardware).”
“Usually we have two planes flying – one with the VXS on board and the second to act as a target. They play out different scenarios, meeting from different directions, at different altitudes, headings, and speeds. Everything seen by the cameras during the flight is recorded in the system.”
From here, the AI is assessed after its flight: “we have special algorithms and people who analyze it second by second and image by image. We analyze where the neural network went wrong and why, and use this knowledge for the next cycle.”
And so, “after several cycles of lab testing, development, real flight testing, and development again – the entire application is frozen and released.” What that release looks like – and when it happens – may depend on certification by the appropriate authorities.
Cleared for takeoff
Daedalean is looking to have its AI certified by the FAA (Federal Aviation Administration) and EASA (European Union Aviation Safety Agency): “We are working with European and American regulators simultaneously to obtain a certificate that covers a selection of aircraft models in 2023.” If achieved, this would be a world first for AI.
But Daedalean does more than try to get its own product off the ground. It actively conducts and publishes research on the use of AI and ML in aviation, working with the two aforementioned authorities to ensure the safe use of AI in the industry.
“EASA and Daedalean have collaborated on and published two parallel reports on Concepts of Design Assurance for Neural Networks (CoDANN) (2020, 2021). The reports discuss how classical software design assurance can be adapted for ML in safety-critical situations. The results of this research partly led to EASA’s first guidance for Level I AI/ML in aviation.”
Then, in 2021, “the FAA, in collaboration with Daedalean, began studying how to apply CoDANN’s findings to a real application. This project resulted in a report published by the FAA in 2022.”
Although there is a lot to go through to become certified, Dr. van Dijk is optimistic about the future, confident that growth will come:
“Based on the demand we see now for our Eval Kit (the demonstrator we offer to selected customers with permission to install experimental equipment), we expect around a hundred installations per year in the first and second years (an important number for the General Aviation industry), and exponential growth after that, as the air taxis get their operating permits and come to market at the scale of thousands.”