Vigilant Aerospace’s CEO recently made a 30-minute presentation on testing onboard automatic detect-and-avoid systems for unmanned aircraft at the 2022 SAE AeroTech Conference in Pasadena, California.
Titled “Developing and Testing Fully Onboard Automatic Detect-and-Avoid for Unmanned Aircraft Systems and Advanced Air Mobility,” the presentation covered a variety of topics:
- Introduction: What is Detect-and-Avoid?
  - What is detect-and-avoid and why is it important to the UAS industry?
  - Why is detect-and-avoid critical for safety and the future of the industry?
- Background on Vigilant Aerospace & FlightHorizon
  - Introduction to the company
  - Discussion of the detect-and-avoid system in development and testing, FlightHorizon
- Typical DAA Process & Cycle
  - How do DAA systems typically work?
  - What is a DAA cycle?
  - The steps a DAA system takes while functioning
- Background on DAA Technology
  - Diagram of a DAA system
  - Pictures of DAA systems in the field
  - User interface of a DAA system
  - Components of a DAA system
  - Typical sensing equipment
  - Industry technical standards
- Prior Field Testing and Use
  - How to test a DAA system
  - Sensor testing
  - Overall system testing
- New Field Testing
  - Testing with onboard and ground-based DAA
  - Encounters with crewed general aviation aircraft and helicopters
  - Beyond visual line-of-sight testing
  - Projects and programs
- Emerging Trends
  - Automatic DAA advantages
  - New FAA rulemaking for BVLOS
  - New technologies, platforms
  - AAM & UAM industry developments
- Roadmap Forward & Next Steps
  - New regulations forthcoming
  - Moving from AAM to UAM
  - Emergence of UTM
  - Smaller radars
  - Better connectivity
About SAE AeroTech
The SAE AeroTech Conference is an annual event that brings together industry executives, academics, regulators, engineers, technicians and the leading minds in aerospace research and development to provide knowledge and experience sharing for the industry. Learn more here: Attend – AeroTech® (sae.org)
Video (26 minutes):
Developing and Testing Fully Onboard Automatic Detect-and-Avoid for Unmanned Aircraft Systems and Advanced Air Mobility
Transcript of the presentation:
I’m Kraettli Epperson, I’m the CEO of Vigilant Aerospace, and I’m going to be presenting about onboard detect-and-avoid systems for unmanned aircraft, focused on advanced air mobility, urban air mobility. I wanted to thank SAE AeroTech for the opportunity to present this today.
I’m going to try to leave a little bit of time for Q and A at the end, but I will be going pretty quickly, because we have a lot of material that I’m going to present.
So, the topics I'm going to cover: I'm going to focus on what detect-and-avoid is, so a little bit of an introduction; background on our company, our product, and the testing that I'm going to be discussing; the typical detect-and-avoid process and cycle, why that matters, and how it works; background on the technology; prior field testing; and some of the new field testing that we've been doing, which will really be the heart of the presentation. Then we'll review some emerging trends, and then our future roadmap for development of this technology. So, a little bit of an introduction.
Introduction: What is Detect-and-Avoid?
Detect-and-avoid is a foundational part of autonomy for uncrewed aircraft systems and is expected to be a foundational technology for Advanced Air Mobility (AAM) and Urban Air Mobility (UAM) in the future, allowing these aircraft to fly safely with a heightened level of assurance.
It allows uncrewed aircraft to detect other aircraft beyond visual line-of-sight (BVLOS). So that’s really the fundamental job. It is a requirement for safety. It is designed to help aircraft remain well-clear of manned aircraft.
It's typically designed to detect both cooperative and non-cooperative aircraft. So, when we say cooperative aircraft, those are aircraft with transponders that are transmitting into an air traffic control system. Non-cooperative aircraft are those without a transponder, which would otherwise have to be detected using some sort of active sensor, but we'll talk about that a little bit.
On-board detect-and-avoid (DAA), specifically, is very important for the future of the industry because it makes long-range, beyond visual line-of-sight uncrewed flight practical without the need to, for example, set up significant amounts of ground infrastructure, have existing radars in just the places you want to travel, or rely on other infrastructure to remain safely aware of and away from other aircraft. So on-board is what we feel is a very important frontier.
It also needs to be scalable. On-board detect-and-avoid is highly scalable because it goes on the aircraft and does not require a lot of preparation ahead of time. The systems must also be trustworthy and well-tested; that's part of what we'll talk about today. These are critical safety systems, and it's understood in the industry that a great degree of testing and trust is asked of these systems.
A little bit about Vigilant Aerospace Systems: our product is called FlightHorizon. We're focused on developing detect-and-avoid and airspace management systems, with both ground-based and onboard detect-and-avoid versions.
FlightHorizon is based on two NASA patents that we've licensed, and is in active use now at NASA. We've also used it in FAA projects with the UAS test sites, which are spread across the United States, in their development programs.
We're focused on highly-integrated multi-sensor systems. So, as I talk a little bit more about what we've been testing, you'll see some of those sensors. It comes in both ground-based and on-board versions, and it uses local sensors in addition to the online data that's available.
We're very interested in unmanned aircraft traffic management (UTM) development and the collaborative systems that are emerging. We really focus on industry technical standards and current market needs as we develop our DAA system.
Typical Detect-and-Avoid Process
This is a diagram of our system. The system is designed to allow unmanned aircraft to detect-and-avoid other aircraft beyond visual line-of-sight. The software, in the middle of that orange rectangle, is the core of the system; that's really FlightHorizon. It takes in information from multiple other systems and sensors.
We don't develop our own hardware; we are focused on the software, and we buy the hardware off the shelf. We partner with companies who develop the various sensors and transponders.
On the left-hand side of the diagram you'll see an autopilot that is providing telemetry. We'll pick that information up either directly from the autopilot or from the ground control station, if there is a ground component. I'll show you some pictures and diagrams of the system onboard and on the ground.
In the middle there is a portable radar. We integrate small portable radars like this one, which is used for field testing, as well as larger radars.
Then on the right-hand side there you'll see a receiver for transponder signals from larger aircraft, meaning general aviation (GA) and commercial aircraft. Most of them are required to have transponders, and so that's a baseline technology we can use to provide awareness.
All of that comes into our system and is updated live to provide an airspace picture, trajectory predictions, and then, ultimately, resolution advisories, or avoidance maneuvers, provided to the pilot or to the autopilot.
The Detect-and-Avoid Cycle
As we develop the system, we're very much focused on autonomy, so that all these things can come together in the software and provide what's needed either to the pilot or the autopilot.
A typical DAA cycle will be to collect the information from the sensors, parse the data, index the targets, build the trajectories and predict the potential collisions, iterate strategies to issue an avoidance command, and then monitor and maintain that self-separation. This is going on multiple times a second in our software; that's why I put that up there, just to give you an idea of what it's doing, and it's doing that continuously.
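The cycle described above can be sketched in code. The following Python fragment is an illustrative simplification only, not FlightHorizon's actual implementation: the per-sensor track keys, the well-clear distance, and the one-step linear prediction are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str
    x: float  # east offset from ownship, meters
    y: float  # north offset from ownship, meters

def daa_cycle(detections, tracks, well_clear_m=600.0):
    """One pass of the cycle: parse detections, index targets,
    extend trajectories, predict conflicts, and issue an avoidance
    advisory if a predicted position falls inside well-clear."""
    # 1. Parse & index: group detections under a per-sensor track key
    for det in detections:
        tracks.setdefault(det.sensor, []).append((det.x, det.y))
    # 2. Build trajectories & predict: linear extrapolation from the last two fixes
    advisories = []
    for key, fixes in tracks.items():
        if len(fixes) < 2:
            continue
        (x0, y0), (x1, y1) = fixes[-2], fixes[-1]
        px, py = 2 * x1 - x0, 2 * y1 - y0   # one step ahead
        if (px ** 2 + py ** 2) ** 0.5 < well_clear_m:
            # 3. Issue an avoidance command for this intruder
            advisories.append((key, "maneuver to maintain well-clear"))
    return advisories
```

In a real system this loop runs many times a second against live sensor feeds; here each call to `daa_cycle` stands in for one iteration.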
The Technology: Basic Components and DAA System Function
Typical components… We often get questions about this. So, basic components would be the autopilot itself for control and telemetry information, and then non-cooperative aircraft sensors. These might be a radar, like the ones that I've shown here, or an EO/IR, an electro-optical/infrared camera system; those are also used for shorter-range detection in some cases. Then acoustic systems and RF systems. RF is mostly used for counter-UAS detection, when you're detecting rogue drones. But all of those are used as non-cooperative sensors.
Cooperative sensors are typically going to be your transponder receivers, on-board, so that you can detect a cooperative aircraft that is participating in air traffic control. Then increasingly, of course, we're very interested in integrating UTM as more information is being broadcast into a network system. Remote ID I don't include here yet, but we have partner projects this year where we're working to integrate Remote ID, which is going to be quite important for UAS over the next year.
Testing Detect-and-Avoid Systems
Here are some pictures of us setting up to do some field testing. On the left-hand side is a typical small-scale ground-based test. We have the ground control station for the uncrewed aircraft on the left-hand side, a telemetry radio, and our software, FlightHorizon COMMANDER, in the center. On the right-hand side there you have the ADS-B In receiver, so that's your transponder receiver, and you've got the portable radar. Those three data sources are informing the software so it can perform its job continuously and provide that live picture.
On the right-hand side of the screen here you can see a picture of the same set up, but fully on-board. This is work that I’ll describe in a few moments.
This was a live beyond visual line-of-sight flight that we did, and this has the ADS-B In receiver, it has a connection to the autopilot on-board, has an on-board radar, and has our software running on a single-board computer that is integrated with all those inputs, continuously performing its function on-board.
DAA System User Interface & Function
This is a little bit of information about the user interface that we typically use, and gives you an idea of what the software’s really doing. I’ll show you a quick video in a moment.
So, airspace visualization is being provided. You can see here in the middle there is a small white arrow; that is your ownship. It is surrounded by two hockey pucks, or cylinder shapes. The yellow one is your well-clear distance; this is the safety distance you keep from other air traffic. If you do have air traffic that is threatening to get into that, then the system is going to issue an advisory to avoid that encounter. The red cylinder in the middle there is your NMAC, your near mid-air collision zone; you definitely don't want aircraft in there.
On the left-hand side you have a detected aircraft that is coming in; there's a blue line, that's the projected trajectory of that aircraft. Then the system is issuing a warning and issuing a resolution advisory; that's the red box at the top there. That will do a countdown through the entire encounter. It will also be tracking other aircraft and be able to provide that command. It's also producing waypoints that can be sent to the autopilot. So this is a representation of a fully autonomous process that can send these waypoints for avoidance to the autopilot.
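The two "hockey puck" volumes described above amount to a simple cylinder-containment test: an intruder violates a volume when it is within the horizontal radius and within the vertical half-height of ownship. This Python sketch is illustrative only; the radius and height thresholds shown are made-up example values, not the figures defined by any standard.

```python
import math

def in_cylinder(own, intruder, radius_m, half_height_m):
    """True if the intruder (x, y, z in meters) is inside a cylinder
    centered on ownship: horizontal range within radius AND altitude
    difference within the cylinder's half-height."""
    dx = intruder[0] - own[0]
    dy = intruder[1] - own[1]
    dz = intruder[2] - own[2]
    return math.hypot(dx, dy) <= radius_m and abs(dz) <= half_height_m

# Example thresholds (hypothetical values for illustration):
WELL_CLEAR_R, WELL_CLEAR_H = 600.0, 120.0   # yellow well-clear cylinder
NMAC_R, NMAC_H = 150.0, 30.0                # red NMAC cylinder
```

An intruder can be inside the well-clear cylinder (triggering an advisory) while still outside the much smaller NMAC cylinder.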
This is a video [view at 9:33] I'll play for you very shortly. You'll see the white aircraft is engaging in an encounter with an intruder aircraft that's been detected. The system is alerting on that, providing a resolution to turn left and speed up immediately in order to avoid that intruder aircraft coming into that yellow zone.
That maneuver is then executed, and the encounter is avoided. There’s no loss of well-clear, and the system is able to continue monitoring and the UAS returns to its original flight path. That’s an example of a full encounter in just a few seconds. Gives you an idea of how the system works.
Alternative User Interfaces: TCAS/ACAS & UTM Multi-Ownship DAA
These are some other interfaces, alternatives that can be used with our system. The one on the right-hand side is for multiple ownships. So, you can add any number of uncrewed aircraft that you're tracking in this system and be able to do collision avoidance for all of them. Then on the left-hand side is more of a traditional TCAS/ACAS display. This is something that we've been asked for; many pilots are familiar and comfortable with this interface. So, we've implemented this, and we'll continue adding features to it.
Industry Technical Standards
So, very briefly I’ll mention the industry technical standards. These are standards that we track, these are developing all the time, and there are new ones emerging. Obviously one of the reasons that we’re very interested in coming to this event is because we’re very interested in standards that SAE is developing.
ASTM has developed several relevant standards that have been published now, and is working on several additional standards. The F3442 standard is the detect-and-avoid standard, version 1.0, published in 2020. I served on the working group, and am now serving on the version 2.0 working group. There are also other standards for assurance of software credibility and for interaction with ground-based radars through a UTM system. Then there are standards emerging around how you test your system to determine that it meets these standards; I'm also on a working group for that, and that's very important. It's been, obviously, very useful to us to understand the types of reports and artifacts we'll be generating over time to demonstrate that we can fly under these standards.
RTCA has standards as well. Typically ASTM is focused on smaller UAS; RTCA tends to be focused on traditional aircraft and larger UAS. The DO-365B standard is the primary standard that we track; it's the detect-and-avoid systems standard. The algorithms that we use, for example, are compliant with that standard; in fact, the algorithms that we use are the reference implementation for that standard. So that gives us great compatibility. It provides assurance to our users that we have implemented a system they can use, with the right configuration and the right equipment, to meet these kinds of standards. There are several other standards that we support, depending on what equipment is available, including sensors that we've implemented for use with systems like ours.
How do we test DAA systems?
When we test detect-and-avoid systems, how do we do that? What are we testing?
There are really two types of test that we typically engage in: sensor testing, and overall system testing. Sensor testing involves a variety of radars, EO/IR systems, and acoustic detection. This gets tested all across the industry. We typically focus on two or three things, which I'll show you in these diagrams.
Typically, you're looking for range of first detection, so when you are first able to identify an intruder aircraft that might present a threat to your aircraft, and your ability to create a track out of those detections. Is the detection consistent enough that you can create a track and generate a resolution advisory if needed? Those are things that we watch very closely.
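Range of first detection and track formation can be illustrated with a minimal confirmation rule: a track is declared once enough consecutive detections line up. The `confirm_n` and `gate_m` parameters below are hypothetical values chosen for the sketch, not figures from the testing described here.

```python
import math

def first_detection_range(detections, confirm_n=3, gate_m=150.0):
    """Scan time-ordered (x, y) radar detections of one target and
    return the range (from the sensor at the origin) at which a
    confirmed track first forms: `confirm_n` consecutive detections,
    each within `gate_m` of the previous one. Returns None if no
    track ever confirms."""
    run = []
    for x, y in detections:
        if run:
            px, py = run[-1]
            if math.hypot(x - px, y - py) > gate_m:
                run = []          # gap too large: restart the run
        run.append((x, y))
        if len(run) == confirm_n:
            return math.hypot(x, y)
    return None
```

Inconsistent detections (large jumps between returns) reset the run, which is why a sensor with sporadic returns confirms a track much later, at a shorter range, than one with steady returns.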
We log all of our data. This is actually a picture from some of the testing that we did under an FAA contract, with lots of different systems being logged. Then we look at track filtering and classification; this is a very important function in order to determine that you're not tracking a bird, that you're tracking the things you're supposed to be tracking, and to figure out what to do about them quickly.
Then system testing. As a system integrator, we really focus on how the overall system works. That's absolutely critical for being able to meet the industry standards. The F3442 standard has a risk ratio as a target, and it also has a timing budget that you've got to be able to meet so that you can provide safety adequately with your system. So we look at the traffic alerting process and the warning process. We look at whether the algorithms are doing what they're supposed to do.
We look at track correlation. Track correlation is the process of detecting an aircraft with multiple sensors, telling that it's the same aircraft, and only tracking, displaying, and using for warnings one of those tracks. It's very important; otherwise you'll continuously get false alerts on your own aircraft, for example, when you have telemetry coming in as well as a radar detection. You have to be able to ignore the radar detection so long as your algorithms have determined that it's your ownship.
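The ownship case described above, the simplest instance of track correlation, can be sketched as a distance-gated filter: any radar track that sits on top of the telemetry-reported ownship position is suppressed. The dictionary fields and the 50 m gate are assumptions for illustration, not the correlation logic of any particular product.

```python
import math

def suppress_ownship(radar_tracks, ownship, gate_m=50.0):
    """Drop any radar track whose latest position falls within
    `gate_m` of the ownship telemetry position, so the system does
    not alert on the vehicle's own radar return. All other tracks
    are kept for display and warning."""
    keep = []
    for track in radar_tracks:
        d = math.hypot(track["x"] - ownship["x"], track["y"] - ownship["y"])
        if d > gate_m:
            keep.append(track)
    return keep
```

A production correlator would also compare velocity and history, not just instantaneous position, before merging or suppressing tracks.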
We look at the resolution advisories to make sure that they're working correctly, and then at the ability to carry out the avoidance maneuvers. Our software, for example, allows you to characterize your aircraft, so you have the performance of your aircraft: if you have a minimum speed or stall speed, an ascent or descent rate, a maximum speed, all of those things go into the software so that it can design a resolution advisory that your aircraft can carry out. It's not very useful if you have a resolution advisory that you can't execute.
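Characterizing the aircraft so advisories stay flyable can be modeled as clamping a raw command to a performance envelope. This sketch uses hypothetical field names and units; it is not the actual FlightHorizon configuration schema.

```python
from dataclasses import dataclass

@dataclass
class Envelope:
    min_speed: float    # m/s, e.g. stall speed for a fixed-wing aircraft
    max_speed: float    # m/s
    max_climb: float    # m/s
    max_descent: float  # m/s, given as a positive number

def clamp_advisory(env, speed_cmd, vs_cmd):
    """Limit a raw resolution advisory (commanded airspeed and
    vertical speed) to what the characterized aircraft can actually
    fly, so the system never issues a maneuver the airframe cannot
    perform."""
    speed = min(max(speed_cmd, env.min_speed), env.max_speed)
    vs = min(max(vs_cmd, -env.max_descent), env.max_climb)
    return speed, vs
```

With this in place, an advisory asking for more speed or descent rate than the airframe has is automatically reduced to the envelope's limits.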
Recent Testing: Beyond Visual Line-of-Sight Flights – Trans-Alaska Pipeline
This is some information on some recent tests that we've done. This was a beyond visual line-of-sight flight test under a Part 107 waiver, so this was fully beyond visual line-of-sight under Part 107.
This was carried out as part of an FAA research contract along the Trans-Alaska Pipeline with ACUASI at the University of Alaska Fairbanks. They're a great partner for us and many other developers of these types of systems, and they're one of the federally recognized test sites. So this was an FAA-sponsored project where we were flying FlightHorizon PILOT, which is our on-board system, over multiple miles with no visual observers. We had a variety of sensors in place, including radars.
We were testing multiple predicate radars for this project, for the FAA, and for integration into our system. This was the first beyond visual line-of-sight flight of a small UAS under Part 107 with an on-board radar and detect-and-avoid system, so we're really proud of that. We did that in the middle of last year, and it was a fantastic project for us. We provided reporting to the FAA and to the test site program on that project. This is a little bit more about that:
We were testing encounters against helicopters and against GA aircraft. This is the Cessna 172 aircraft, with the on-board system and the ground-based systems. We had three different radars we were testing, and we were tracking and logging all of this, including the flight data recorders, UAS telemetry, radar logs, ADS-B In logs, and all the conditions. It was very cold in Alaska, which presents additional special challenges for us.
This is a little bit of the technical information that we collected.
We did analysis on the radar tracks versus actual tracking of the aircraft, and we also measured the range of first detection of the GA aircraft, and of the UAS in some cases. So in some cases we were actually using a small UAS as a test aircraft in addition to the GA aircraft. We spent a lot of time on filtering and filtering effectiveness; this is very important for eliminating clutter and removing false tracks. Then we did an overall analysis of the system effectiveness with that filtering in place. We did a lot of analysis afterwards comparing these various data sets to each other to be able to provide this reporting to the FAA and, ultimately, hopefully, to be able to impact the types of rules and regulations that will become available for the industry to use as these types of systems reach the marketplace. Ours is in the marketplace now.
Recent Testing: Detect-and-Avoid Trials at MAAP
This is another set of testing we did recently in 2021. This was with the Mid-Atlantic Aviation Partnership (MAAP) at Virginia Tech, another one of the federal UAS test sites.
There were multiple companies testing detect-and-avoid systems at the MAAP farm. We had 80 encounters with small general aviation aircraft over a couple of days.
We were using a radar mast. We were using ADS-B In. We were using direct connections, and in fact, in that picture we're sitting right next to the pilot, so they could provide direct feedback on our resolution advisories. We had offsets at various times, so this testing is done in a way that is very safe and very controlled, so we know where the crewed aircraft and the unmanned aircraft are at all times. We tracked and logged all of this, as we always do, in order to provide that information in real-time to the evaluators. We were very pleased with these tests.
Other Projects and Programs
More about some of our other work: we've been involved with the Alaska UAS test site and their BEYOND Program, the new program that is the successor to the IPP Program, the Integration Pilot Program, which ran from 2018 to 2020.
We also have a system that we're installing with the Northern Plains UAS Test Site. This is a third of these federally recognized test sites, and that's us handing a system over and getting ready to start using it with them.
We’ve been involved in many NASA programs. NASA uses our system particularly with their Supersonic program. They do a lot of their flight tracking in our software, because they log it all, and then they correlate our logs with all the other supersonic data they’ve collected about sonic booms and other things. It also provides safety for them, so they can make sure nobody’s violating their TFR while they’re flying very fast. So that’s been a great experience for us.
Advantages of Automatic Detect-and-Avoid for Safety & Autonomy
Some information about the advantages of automatic detect-and-avoid for safety and autonomy, and that’s really, obviously, one of the focuses for this event, this autonomy in AAM.
So a system like ours provides integration with the autopilot telemetry and lots of logging, so it really provides better safety than a system that is not fully integrated. It's aware of its own location, tracks distances to other aircraft, and predicts trajectories. So it can provide a high degree of autonomy.
It automatically calculates these resolution advisories to maintain separation. So it's not pinging on one screen while somebody grabs a joystick, or otherwise takes some action based on what they perceive to be going on on one of these screens, which is how a lot of detect-and-avoid has been accomplished over the years.
This system is fully integrated and closed-loop, so it is receiving all the information and can send the exact resolution advisory out. In the field that's very important. It's really a baseline for autonomy; to be able to move autonomy forward, you have to have this automatic safety. It continues to recalculate multiple times a second to cut down on the delay or latency that you might have.
So it takes a deterministic approach to maneuvers. It’s using maneuvers that are a part of the existing standards. It uses those tables. So, you know what it’s going to do. You know that it’s going to take a maneuver that is essentially approved.
It follows industry standards and uses those algorithms. With all of these standards and all DAA systems, ultimately what they're trying to do is manage the closest point of approach with that intruder aircraft after it's been detected and determined that it might be some sort of threat, that it might come into your well-clear volume. So that's ultimately what it's doing, and that's why these systems can be proven to be effective: because you can measure that CPA, and that's a typical method used in this type of testing.
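For straight-line motion, the closest point of approach is a standard kinematic result: project the relative position onto the relative velocity. The sketch below computes the time and distance at CPA in 2D for simplicity (a real system would work in three dimensions and update continuously as tracks change).

```python
import math

def cpa(rel_pos, rel_vel):
    """Closest point of approach for straight-line relative motion.
    rel_pos: intruder position minus ownship position (meters).
    rel_vel: intruder velocity minus ownship velocity (m/s).
    Returns (t_cpa, d_cpa): time to CPA in seconds (clamped to >= 0,
    since a receding intruder is closest right now) and the
    separation distance at that time in meters."""
    rx, ry = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    t = 0.0 if v2 == 0 else max(0.0, -(rx * vx + ry * vy) / v2)
    return t, math.hypot(rx + vx * t, ry + vy * t)
```

Comparing the predicted CPA distance against the well-clear threshold, before and after an avoidance maneuver, is one way to quantify a DAA system's effectiveness in flight testing.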
We really believe that these types of systems need to become a baseline function in autopilot systems, to enable autonomy and to enable the next stage of AAM in the US; these types of systems are critical for that to happen.
Important Emerging Trends
Some important emerging trends in the industry:
There are regulatory changes coming that are very exciting. I recently served on the FAA's beyond visual line-of-sight Aviation Rulemaking Committee (ARC). That report was just published a couple of weeks ago. It contains recommendations for new rules for beyond visual line-of-sight flight of uncrewed aircraft, specifically focused on low population-density areas. But it has a lot of trade space, a lot of different ways that you can use those recommendations when they become rules.
We think that's going to be fantastic for the industry, and we certainly expect it to influence the issuance of waivers and other sorts of things that the FAA is able and willing to do now that it's been published.
There's also, obviously, a lot of emerging excitement in the market around UAM and AAM flights. We believe that interest is very high right now in the market because of the [FAA] ARC, because of other developments, and because of the maturity of the technology. So that's really creating opportunities for this type of technology.
There is a lot of critical development, testing, and prototyping going on across the industry, a lot of which has been highlighted at this conference. Then there are a lot of ML/AI opportunities, particularly around vision systems, to add additional layers of safety. All of this is multi-layered safety: multiple systems and detections. So machine learning and AI will certainly be able to contribute to that.
All right, finally, a little bit about what we see as the roadmap forward for the industry. Regulatory experience helps to, you know, control the speed at which these things are developed.
We know that these initial beyond visual line-of-sight recommendations, moving to rules, will be a very important waypoint for AAM. We have seen this before: the FAA will implement one set of rules with the expectation that they will inform the next set of rulemaking. So we are very excited that we are now in that process. With the recommendations going to the FAA, rulemaking will then take place, and the FAA will be able to begin to watch lots of beyond visual line-of-sight flights, see what level of safety can be achieved and what technology can help do that, and then begin to inform the next set of rules that will enable AAM.
Then, ultimately, UAM for higher population-density areas and much more advanced, sophisticated flights, moving eventually into air taxis and things like that.
We’re also excited about and see a lot of promise with the collaborative approaches to air traffic management, like UTM. That, we believe, is going to emerge as a very important part of UAM particularly.
There's also a lot of changing technology. This is something we have to track because we're trying to use and implement this technology all the time: better, smaller, lower-powered sensors, including radars (radar-on-a-chip is something being talked about in the industry); better, faster, and more reliable machine-learning-based classification; more computing at the edge, to allow you to take large amounts of very sensitive, discrete data and push forward only what's required and necessary; more powerful on-board computing; and ubiquitous 5G connectivity, which is very important.
Then finally, more of these collaborative technologies as they emerge and become a part of what can be used to provide additional data.
That is my presentation. Thank you all very much for listening.
Let’s continue the discussion! Contact us to discuss your detect-and-avoid operational needs today!