
Dr. Luuk van Dijk, Daedalean: “the rise of autonomous solutions does not mean human pilots will be out of a job”

Originally published at
https://cybernews.com/security/dr-luuk-van-dijk-daedalean-the-growing-use-of-autonomous-solutions-does-not-mean-human-pilots-will-be-out-of-a-job/

We are getting more comfortable with ideas like self-flying planes. Rising levels of automation in modern aircraft have already become the new normal.

Autopilot is usually engaged a few minutes after takeoff and disengaged shortly before landing. But this isn’t always the case: some small passenger airplanes equipped with systems like Garmin Autoland can even land themselves in an emergency if the pilot is unable to control the aircraft.

Today, Cybernews investigators decided to shine a light on the often-invisible autonomous innovations in the aerospace industry. To do so, they contacted Dr. Luuk van Dijk, CEO and Founder of Daedalean – a startup developing flight control software for autonomous flight, with the eventual goal of creating an AI pilot that measurably outperforms human pilots in all their functions.

How did Daedalean come about? What has the journey been like since your launch in 2016?

We started in 2016, when we saw the first eVTOL projects becoming public and realized that for these things to be successful, they would have to be fully autonomous in the long term. Since then, we've realized that there's already a lot of headroom in aircraft flying today. When you hear that commercial air traffic is safe, what you don't hear is that in small aircraft, the incident and accident rate is 200 times as high.

We realized early on that there's no choice but to get to full autonomy, because there's already a shortage of pilots. So we set out to develop a fully autonomous flight control system that clearly outperforms humans on every measurable dimension and can fly within the system of rules and regulations for civil aviation that has evolved over the past 120 years, without having to completely change all the rules for certification.

So we took the commercial pilot license skill test for helicopters, a small booklet published by the FAA. It describes what humans are actually supposed to prove they can do well in their check-ride, which provides a nice roadmap of what humans actually do in flight. A pilot with eyes open and hands on the controls can fly an aircraft legally, effectively, and safely from A to B by using those eyes to see where they are, where they can fly, where other aircraft are that they shouldn't fly into, and where they can land (arguably the most critical part of flying). We build systems that act just like this: they use a camera to look outside and recognize where we are without relying on GPS, see other traffic, find a suitable runway or helipad, and land there.

Can you introduce us to the Eval Kit? What are its key features?

We offer it to aircraft and helicopter operators for assessing the performance of computer vision technologies in a real-world environment. It’s a prototype product that allows them to explore the capabilities of computer vision as a source of redundant flight information. They get a set of equipment – cameras, the computational box, a tablet for the human interface – to install onboard a flight test vehicle and watch the system work in real time during a piloted flight.

Nothing provides a better understanding and feel for how the system actually works than the opportunity to watch it in flight and compare what you see with your own eyes against the camera view. You witness the system detecting and tracking intruders around you that you can hardly spot yourself. You observe your trajectory accurately displayed on a moving map without any GPS input. You receive all the inputs required for landing on either a runway or a helipad based on camera feeds alone. And of course, the Eval Kit records raw and benchmark data for extensive post-flight analysis.

By the time the actual product is certified and launched to the market, the early adopters who worked with the Eval Kit will already have a long history of testing it, trying to break it in all possible ways, assessing the risks, and integrating it with their flight control equipment – and thus will have strong arguments for obtaining all necessary operational approvals and flight clearances.

Where can we expect to see autonomous solutions used more often in the next few years?

In air transportation, autonomous solutions address the current demand for urban and regional passenger and cargo transport. When Urban Air Mobility overcomes the challenges it faces (technological, regulatory, public acceptance, air traffic management, physical infrastructure, and others), it is expected to become a hundred-billion-dollar market. Three problems for this market that autonomy addresses are safety, economics, and capacity.

Safety: The FAA cites ‘loss of control in-flight’ as the top cause of small aircraft crashes. Autonomy could be an excellent tool for assisting human pilots, helping them detect potential collisions and land more safely.
Economics: Removing the pilot doesn’t just eliminate the expense of the pilot’s salary; your asset also becomes available 24/7. As a result, the viability of the business case is immediately much higher.
Capacity: With human-to-human pilot-ATC communication over a voice channel, the system today can tolerate maybe 10 or 20 aircraft over a large city. Flying air taxis or smaller cargo transports at a higher density, at an equal or better level of safety, will be feasible only with autonomy.

How did the recent global events affect your field of work? Were there any new challenges you had to adapt to?

The global supply chain blockage has caused some headaches, but nothing we haven't been able to cope with. Also, Switzerland has opened its doors to Ukrainian refugees, and we try to do our part by providing meaningful employment.

Since the aviation industry is your main field of focus, what predictions do you have for the future of this sector?

The industry is actively discussing switching to single-pilot operations. My personal belief is that this will turn out to be harder and less safe than going directly all the way to full autonomy on board. The technology to make that possible and the related rules do not exist yet, but they are being worked on. Give it a decade, and I think these systems will exist, along with the data to prove that they are safer, cheaper, and can fly at higher traffic densities. (If that data does not materialize, it would be industry-wide criminal negligence to force the change.)

However, that does not mean human pilots will be out of a job. With increased volumes, higher safety, and stronger economic incentives, the job will change. With decisions on the timescale of minutes and hours safely delegated to a machine, human pilots can manage risk more effectively on larger scales – and perhaps gradually move off-board. As in many professions, the job of a pilot will be different over time, but it will not cease to exist.

Since autonomous airplanes and vehicles are relatively new technology, there are still some concerns and misconceptions surrounding them. What myths do you notice most often?

Oh yes, there are a number of misconceptions – in the industry at large, but also well beyond it – that modern artificial intelligence techniques are fundamentally un-understandable. “Nobody knows how they work.” “They are black boxes, they do magic, therefore they’re fundamentally uncertifiable.” Phrases like “non-deterministic,” “untraceable,” and “unpredictable” get thrown around. These are legitimate concerns, and they have to be addressed. Fortunately, many of them rest on misconceptions.
If you have an automatic system dealing with a real-world situation, there is uncertainty in the environment. As Sebastian Thrun wrote in his book about probabilistic robotics, “you have to deal with uncertainty.”

Currently, the way uncertainty is dealt with in avionics is by outlawing it entirely and passing it to the human: anything unexpected goes to the human, who then works through a checklist. So we have no choice: if we want autonomy, we need systems that can deal with this uncertainty in the environment. The non-determinism often attributed to the system is really a property of the environment that has to be dealt with anyway. Calling a machine-learned system non-deterministic misattributes the source of uncertainty, which really lies in the environment from which the system gets its inputs.
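To illustrate the point in a toy setting: an estimator that handles uncertainty can itself be perfectly deterministic. The sketch below is a minimal, hypothetical one-dimensional Bayes/Kalman update (not Daedalean’s actual system); all the randomness lives in the measurements it receives, while the filter always produces the same belief for the same inputs.

```python
# Minimal 1-D Kalman-style update: the algorithm is fully deterministic;
# uncertainty enters only through the noisy measurements it is fed.

def kalman_update(mean, var, measurement, meas_var):
    """Fuse a prior belief (mean, var) with one noisy measurement."""
    k = var / (var + meas_var)           # Kalman gain: how much to trust the measurement
    new_mean = mean + k * (measurement - mean)
    new_var = (1 - k) * var              # belief variance shrinks after each update
    return new_mean, new_var

# The same sequence of inputs always yields the same belief:
belief = (0.0, 4.0)                      # prior: mean 0, variance 4
for z in [1.2, 0.9, 1.1]:                # noisy position readings from the environment
    belief = kalman_update(*belief, z, meas_var=1.0)
print(belief)                            # identical on every run
```

The variance shrinking with each update is the system quantifying, not hiding, the environment’s uncertainty – which is the opposite of the “unpredictable black box” picture.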

In this age of ever-evolving technology, what do you think are the key security measures everyone should implement on their devices?

I don’t think there are simple answers, but for starters, aircraft should not be part of the Internet of Things. The attack vectors are too many and too subtle. Flight computers should not have a live connection to the internet.

Share with us, what’s next for Daedalean?

We are working hard on receiving a supplemental type certificate (STC) for our first product. In collaboration with the avionics manufacturer Avidyne, we have been developing a product called PilotEye™ – the first AI-based onboard pilot aid system for General Aviation.
PilotEye works as a pilot assistant during a piloted flight. It serves as an additional set of eyes in the cockpit, providing visual detection of airborne hazards, including non-cooperative traffic such as aircraft not equipped with ADS-B, drones, and birds.

Our joint target with Avidyne is to get STC approval from the FAA for a fully integrated system by the end of 2022. This will be a certification at the safety level called DAL-C: the middle of the Development Assurance Level scale, which runs from A to E, with DAL-C corresponding to a “Major” failure condition. As far as we know, this will be the first system with a non-trivial safety case and a machine-learned component certified to DAL-C – what EASA calls Level 1 Artificial Intelligence (“assistance to human”).