Vigilant Aerospace Systems’ CEO Kraettli Epperson participated in an AUVSI presentation on “Enhancing Operations with Sensors and Other Instruments.” We are happy to share the video of the presentation and a full transcript, below. Please contact Vigilant Aerospace for more information on how we can enhance your flight operations with sensors, air traffic tracking and automatic warnings.
Enhancing Operations with Sensors and Other Instruments
Watch the video:
Highlights from Mr. Epperson’s presentation:
I will go pretty quickly through some of these slides so we can get to some questions. My agenda is to focus on the types of problems that we use sensors to solve at Vigilant Aerospace.
We are a company that develops safety systems for unmanned aircraft. That is our focus.
We build technology that's based on a couple of licensed NASA patents invented by Ricardo [Arteaga], who works at NASA Armstrong. We love working with NASA.
We’re going to talk about the importance of sensors and safety for unmanned aircraft operations. We’re going to talk specifically about a function called detect-and-avoid, why sensors are critical to it and why it’s critical to the industry. And, we’re going to talk about integration of unmanned aircraft into the US national airspace, which is a really exciting topic right now.
Let me talk about what detect-and-avoid systems usually have as components and talk about some examples including our particular products. We’ll talk about some recent projects that we’ve done to develop our detect-and-avoid system and to utilize and test a variety of sensors. Then we’ll talk about some future trends going on with these types of sensors.
To start with it’s important to understand the problem that we’re trying to solve with sensors. The industry right now for unmanned aircraft of all sizes is really focused on being able to fly what’s called beyond visual line-of-sight [BVLOS].
Generally, the rules, particularly for smaller commercial unmanned aircraft and really for all unmanned aircraft, either require a lot of systems and mitigation or don’t allow flight of the aircraft beyond the line-of-sight of the pilot at all. This is a real bottleneck in the development of the industry and of companies like ours.
The FAA and many industry manufacturers and developers are working on overcoming this to have safety systems that allow you to fly beyond visual line-of-sight. Unmanned aircraft are required to remain well clear of other aircraft at all times.
Even though there’s not a pilot on board you have to be able to use sensors to detect other aircraft and remain well clear. This is typically called detect-and-avoid. So, DAA systems are the types of systems that allow you to do this.
You are particularly worried about detecting what are called non-cooperative aircraft. These are aircraft that tend to be smaller and they do not have a transponder.
There is a certain number of aircraft in the US, and worldwide, at all times that are not required to carry a transponder and may not carry one. So, it’s important that your unmanned aircraft use sensors to avoid these.
In addition, onboard detect-and-avoid allows you to make very long-range flights as opposed to being limited to a particular area where you might have a ground-based sensor, for example.
So, there are different types of sensors and different operations. Because these systems are safety-critical, they have to be trustworthy and well tested. They’re really critical not only to safety but to the advancement of the whole industry right now.
So, how do we use sensors? In particular, the ones that I’m going to talk about today are direct integration of radar systems into a detect-and-avoid system. We use them to detect non-cooperative aircraft.
Our system is based on software that calculates trajectories, the tracks of aircraft, then detects a potential conflict using the sensor data and helps you avoid it.
It is important that these sensors provide sufficient warning to be able to calculate and carry out the avoidance maneuver. So, when we evaluate sensors, that’s really, really critical.
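As an illustration only, and not Vigilant’s actual algorithm, the kind of calculation a DAA engine performs can be sketched as a closest-point-of-approach (CPA) check: given two constant-velocity tracks, how soon and how close will they pass? All names and numbers here are hypothetical.

```python
import math

def time_and_distance_at_cpa(own_pos, own_vel, intr_pos, intr_vel):
    """Closest point of approach (CPA) for two aircraft with constant
    2D velocities. Positions in meters, velocities in m/s. Returns
    (time to CPA in seconds, separation at CPA in meters)."""
    rx = intr_pos[0] - own_pos[0]          # relative position
    ry = intr_pos[1] - own_pos[1]
    vx = intr_vel[0] - own_vel[0]          # relative velocity
    vy = intr_vel[1] - own_vel[1]
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                          # no relative motion
        return 0.0, math.hypot(rx, ry)
    t_cpa = max(0.0, -(rx * vx + ry * vy) / v2)
    dx = rx + vx * t_cpa
    dy = ry + vy * t_cpa
    return t_cpa, math.hypot(dx, dy)

# Intruder 2 km east, closing head-on at 50 m/s relative speed:
t, d = time_and_distance_at_cpa((0, 0), (20, 0), (2000, 0), (-30, 0))
# t -> 40 s, d -> 0 m: a conflict if, say, the well-clear radius is 500 m
```

If the predicted separation at CPA falls inside the well-clear volume, the system would flag a conflict and start computing an avoidance maneuver.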
These are some photos from some of the developmental work that went into this technology, particularly some of the working testing at NASA.
So, typical DAA system components include sensors: usually a radar, either onboard or ground-based. Sometimes they include an EO/IR system. Cameras and acoustic systems are sometimes used, especially to direct other sensors to an area that needs to be closely observed because there might be air traffic coming in. Then other sensors like lidars are occasionally used.
These systems usually involve a display and so human factors become important in how you communicate information to the remote pilot.
Autonomous processes are also emerging as very important. This is an area that we do a lot of work in and our system provides an autonomous voting avoidance process.
I’ll show you some screenshots that will give you a little bit of an idea of how that works, but that involves algorithms and, increasingly, AI and machine learning to help you with that process.
Then you may have access to other non-sensor data sources, like weather data, that are being sensed elsewhere, by federal agencies for example, to give you information that’s also important to the safety of your flight.
So, this is a little bit of information on our system. I won’t cover it in great depth here, but I’ll just mention that it does use this information to detect, track and avoid conflicts with other aircraft. [FlightHorizon] provides a 2D or a 3D display of the airspace and the air traffic, and it fuses data from multiple sensors.
That’s very important because, in addition to your own aircraft in the air, you’ll have a variety of other aircraft that you’re detecting, maybe in a couple of different ways. So, being able to deconflict that in your display and your airspace model, so you can understand what’s where and what is approaching, is very, very important.
On the left-hand side you have your unmanned aircraft. On the right you have an aircraft that may be entering your airspace – a manned aircraft.
You will detect that with one of these sensors, a radar for example, or, if it does have a transponder, you might detect it that way. All that information goes into the orange box in the center and is used to create a model of the air traffic, so you know what’s going on and can maintain situational awareness. You then receive specific commands if you need to perform an avoidance maneuver.
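The fusion step described above, deciding that a radar return and a transponder report are the same aircraft rather than two, can be sketched very crudely as nearest-neighbor association. This is an illustrative toy, not FlightHorizon’s fusion engine; real systems use far more sophisticated track correlation.

```python
import math

def associate(radar_tracks, adsb_tracks, gate_m=150.0):
    """Naive nearest-neighbor association: merge a radar track and a
    transponder (ADS-B) track into one target if they fall within
    `gate_m` of each other; otherwise keep them as separate targets.
    Positions are (x, y) in meters. Illustrative sketch only."""
    fused, unmatched_adsb = [], list(adsb_tracks)
    for r in radar_tracks:
        best, best_d = None, gate_m
        for a in unmatched_adsb:
            d = math.hypot(r[0] - a[0], r[1] - a[1])
            if d < best_d:
                best, best_d = a, d
        if best is not None:
            unmatched_adsb.remove(best)
            # merged position: simple midpoint of the two reports
            fused.append(((r[0] + best[0]) / 2, (r[1] + best[1]) / 2))
        else:
            fused.append(r)
    return fused + unmatched_adsb

# One aircraft seen by both sensors, plus one radar-only target:
associate([(1000, 0), (3000, 500)], [(1050, 20)])
# -> two targets in the airspace model, not three
```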
Here’s a picture [below] to help you understand what the system is doing a little bit. So, in this case, in the middle here this white aircraft you have your own ship – your own unmanned aircraft. On the right-hand side in the center here you’ll see there’s a small detected blue aircraft. That is an aircraft that’s coming into your airspace and is presenting a potential conflict.
The software has used the sensors to detect that, calculate the conflict, calculate an avoidance maneuver, and then it’s telling you to avoid that. As you can see, these sensors are absolutely critical to the software being able to perform this function and allow you to have a safe flight and deconflict from other air traffic.
We have three versions of this. If you’re interested in this, you can reach out to me afterwards.
We have a ground-based version called FlightHorizon COMMANDER. That’s intended for airspace management. We have our FlightHorizon PRO version, which would run on a laptop or a tablet for field use on the ground. And, then we have the FlightHorizon PILOT version, which is our onboard version.
You can see a picture of it here flying with a radar underneath on a small hexacopter that’s being used for some testing.
So, what are some considerations in the use and selection of these sensors?
The first thing, obviously, is these sensors have to be able to detect small, manned aircraft. That’s most of the traffic that you’re worried about when you’re flying an unmanned aircraft. Most of the larger aircraft will have a transponder, particularly as you are flying at higher altitudes.
If you have a larger unmanned aircraft, you’ll be able to possibly utilize air traffic control, but in areas where you cannot do that, it’s absolutely critical that you have sensors to be able to provide detection of the air traffic.
It’s got to have sufficient range that it’s detecting the aircraft that you need to avoid far enough out that you can calculate and perform the avoidance maneuver. That is a very important factor of the sensor.
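As a rough, hypothetical back-of-envelope for what “sufficient range” means here: the sensor has to see the intruder before the closure distance during decision and maneuver time eats into the well-clear buffer. The numbers below are illustrative assumptions, not requirements from any standard.

```python
def required_detection_range(intruder_speed, ownship_speed,
                             decision_time_s, maneuver_time_s,
                             well_clear_m):
    """Rough minimum detection range: the distance the two aircraft
    close during decision + maneuver time, plus the well-clear buffer.
    Speeds in m/s, times in s, distances in m. Worst case assumed is
    a head-on encounter. Illustrative only."""
    closure = intruder_speed + ownship_speed
    return closure * (decision_time_s + maneuver_time_s) + well_clear_m

# A 50 m/s GA intruder vs. a 15 m/s drone, 10 s to decide,
# 15 s to maneuver, keeping a 500 m well-clear buffer:
required_detection_range(50, 15, 10, 15, 500)   # -> 2125.0 m
```

Even with these modest assumed numbers, the sensor needs to detect the intruder more than 2 km out, which is why range is such a dominant factor in sensor selection.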
Then there are the size, weight and power requirements, so that it can fit on the aircraft or otherwise be placed easily where you need it to be placed. It needs to be able to filter out clutter, so it’s detecting aircraft and can differentiate non-aircraft targets. This is extremely important. We spend a lot of time being sure that we can do that with the sensors that we use. Then operating conditions are very important.
You’ll see that some of the pictures I’m going to show you, from some of the testing and operations that we do, are in very cold environments. So, [operating conditions are] very important: weather, basic suitability issues of the sensor, temperature, moisture and other things like that.
Unit price versus mission value. We’re very aware of this. You have to be able to use sensors that are suitable for the value of the types of commercial or other types of operations you’re carrying out.
We’re always working on affordability and trying to develop systems that are suitable for the particular operation.
Then the integration path. So, how the software talks to the sensor and the sensor talks to the software is very important in our process.
When we test sensors we’re looking at two things. We’re looking at the basic sensor performance. Does it have the range? Does it have the resolution to be able to perform the function that we need it to perform? Then we will use it in specific encounter testing in the field where we will have multiple either unmanned or manned aircraft that are encountering the unmanned aircraft in order to test and demonstrate that the system is able to do what it needs to do.
We will look at sensors that are either ground-based or onboard. Depending on what we’re doing, the radar will need to be able to filter out clutter; this is really a software function. Then rejecting false tracks is very important.
When we do this testing, we’re really establishing that baseline performance to be able to do this.
These are some photos from some actual recent flight testing that we have done with a variety of sensors. I’m going to flip to the next slide here where we’ve got some labels.
This was a project that was carried out with support from the FAA and was done with University of Alaska Fairbanks and the Alaska Center for Unmanned Aircraft Systems Integration [ACUASI].
This was a project, which you can read about on our blog, in which we were testing a variety of sensors. We’ve got the small radar that I showed in some prior slides here. This is an EchoGuard radar from Echodyne. We have a GA-9120, which is a larger, longer-range radar we’re also testing.
We’re capturing data and doing a lot of observations with these radars going into the software.
I’ll show you another couple of projects we’ve done here.
This is another picture of the radar and the detect-and-avoid system flying onboard so you have a single board computer here that has our software running. It’s accepting data from the radar. It’s also got a transponder receiver. It has direct connection to the telemetry including the GPS and other information from the aircraft.
The IMU, so the inertial measurement unit that is integrated into the radar, for example, is directly feeding the software so it knows where the radar is pointing which is very important when you’re flying around. We’re very excited about this system.
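To illustrate why the IMU feed matters, here is a hypothetical 2D sketch of turning a radar detection (range and bearing relative to the radar boresight) into world coordinates using the heading the IMU reports. A real onboard system also needs pitch, roll and altitude; the function and values below are assumptions for illustration.

```python
import math

def radar_to_world(detection_range_m, bearing_deg, radar_heading_deg,
                   radar_pos):
    """Convert a radar detection (range and bearing relative to the
    radar boresight) into world coordinates, given the heading the
    IMU reports for the radar. 2D sketch only; a flying system also
    needs pitch and roll. Angles in degrees, clockwise from north."""
    absolute_bearing = math.radians(radar_heading_deg + bearing_deg)
    north = detection_range_m * math.cos(absolute_bearing)
    east = detection_range_m * math.sin(absolute_bearing)
    return radar_pos[0] + north, radar_pos[1] + east

# Target 1000 m dead ahead while the radar points due east (090):
radar_to_world(1000, 0, 90, (0, 0))   # -> roughly (0 north, 1000 east)
```

Without the IMU heading, the same return would be plotted 1000 m north instead of 1000 m east, which is why knowing where the radar is pointing is critical when the aircraft is maneuvering.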
Typically, these are the types of aircraft that we’re flying against and testing against to be sure that we can detect them: small GA aircraft, like the small helicopters that are often used in the field at these altitudes.
This is a picture from a set of flight tests that we did with Oklahoma State University and the Unmanned Systems Research Institute there. This is very interesting.
We were able to fly these two small drones close to each other in order to do some very interesting testing. You can see some screenshots here of the radar and other systems tracking both ownship and the intruder aircraft, as we call it (the other aircraft), with the system providing some specific maneuvers that have to be carried out to perform the avoidance.
This is a very interesting project and we are doing ongoing work with OSU and others.
This was some work to test the range of what’s called the GA-9120 radar, which is the larger radar here. This is a radar that is good for small air parks, for example. It is portable; you can put it in that box, and we take it where we need it.
I’ll just point out a couple things in this picture [below]. This is similar to the systems I’ve shown in the prior pictures. We have here a medical supply delivery drone from Oklahoma State University that was used in some testing. We did some tracking with that aircraft with our system. We’re excited about those kinds of projects.
Our system is really designed to allow these long-range flights and that’s really where the industry is headed. So, getting an opportunity to work with these really practical, emergent aircraft is important to us.
Just another slide here. There are some interesting trends in sensors that have a big impact on unmanned aircraft, and particularly on detect-and-avoid, which is really at the cusp of the industry right now.
I currently serve on the FAA’s Aviation Rulemaking Committee, which is writing rules for beyond visual line-of-sight drones in the US. There is movement right now in the regulations in the US which is very exciting. We’re very happy to be able to contribute to that effort.
There are emerging trends with new radars: smaller, lighter, and with longer-range detection, which is always important. We watch that closely. Sensors, in our case, are used as part of a multi-sensor and multi-layered safety system.
There’s a certain amount of procedural safety and then you have strategic safety – where you’re going to fly, when you’re going to fly, and coordinating those efforts. Then ultimately the sensors are used for tactical safety, so that you can detect aircraft that otherwise you will not be able to know are there.
Better algorithms are emerging for trajectory prediction. In particular, machine learning is emerging as a way to filter and do target classification, so you can better identify whether you’re looking at a bird or some other target that shouldn’t be classified as an aircraft and that you don’t necessarily need to worry about. That is a great area of development.
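A very simple, pre-machine-learning version of this kind of filtering is a rule-based gate on track properties. The thresholds below are purely illustrative assumptions, not values from any fielded system; ML classifiers are increasingly replacing hand-tuned rules like these.

```python
def plausible_aircraft(ground_speed_mps, rcs_sqm, track_age_s):
    """Crude rule-based gate for whether a radar track is worth
    treating as an aircraft. All thresholds are illustrative
    assumptions, not from any real DAA system."""
    if track_age_s < 2.0:           # too new: wait for confirmation
        return False
    if ground_speed_mps < 15.0:     # birds and ground clutter are slow
        return False
    if rcs_sqm < 0.05:              # very small radar cross-section
        return False
    return True

plausible_aircraft(45.0, 1.2, 5.0)   # -> True (aircraft-like track)
plausible_aircraft(8.0, 0.01, 5.0)   # -> False (bird-like track)
```

The weakness of rules like these, slow-flying aircraft or large soaring birds break them, is exactly why learned classifiers are a promising area of development.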
Then millimeter wave radar on a chip is something that we’re watching closely. There are emerging smaller radars that are based on some 5G technologies that have overlap with other emerging millimeter technologies. We think that that’ll be very useful for the industry.
Then, as I mentioned, there are new technical standards. It was mentioned in my bio that I serve on several technical standard writing groups. Then there are FAA regulations that are really pushing the industry forward.
There’s my contact information. Thank you very much again for the opportunity to speak.
We are happy to take any questions and to hear from you to answer additional questions about anything I’ve presented. Thank you.
To view this webinar and other on-demand webinar content from AUVSI, visit: Enhancing Operations with Sensors and Other Instruments | Association for Unmanned Vehicle Systems International (auvsi.org)