Unmanned aerial vehicles (UAVs) may seem like a new advancement on the battlefield, but today’s modern UAVs – or drones, as they’re commonly known – are the evolution of a technology that was used in the Vietnam War. Back then, UAVs were deployed to act as decoys in combat and to launch missiles against fixed targets. Since those initial deployments, however, UAVs have become more sophisticated – generating essential Intelligence, Surveillance, and Reconnaissance (ISR) data for the warfighter and delivering effectors on the battlefield.
Today, modern UAVs are being used for a variety of use cases across the government. They’re monitoring climate change, detecting natural disasters, taking photographs, filming, and delivering medicine and goods to military bases.
But the advancement and evolution of the UAV is still in its infancy. Through the implementation of artificial intelligence (AI) and machine learning (ML), these essential military platforms are gaining new capabilities that will increase their importance to the mission.
How can the integration of UAVs and AI/ML benefit the warfighter? How can AI and UAVs work together to assist with natural disasters? And what barriers stand in the way of the full adoption of AI in the battlespace?
These questions – and others – were answered during a panel discussion entitled “How AI/ML are Improving Drones” at Red Hat’s recent Annual Government Symposium. The panel featured two leading artificial intelligence and machine learning experts:
- Justin Taylor, Vice President of Artificial Intelligence at Lockheed Martin
- Chris Edillion, Specialist Solution Architect at Red Hat
During their discussion, these experts went in-depth on how drones are changing the nature of battle, why AI and ML are beneficial in wildfire detection, and the current barriers to AI adoption.
How drones are changing the nature of battle in theater
As discussed above, UAVs have become more sophisticated over the years, bringing more capabilities to the battlefield. As a result, aerial reconnaissance drones are becoming the new face of military power.
When the mission means everything, intelligent imaging ensures success. Instant access to real-time intelligence of the situation – especially when beyond the line of sight – is vital. The right information, at the right time, could mean the difference between lives saved and lives lost.
According to Taylor, “…Now [we] envision an actual development pipeline from being a software developer or a practitioner, all the way through to the platform itself. The process of doing over-the-air updates to the platforms in order to get them ready for their next mission.”
With the evolution of AI and ML, these capabilities have skyrocketed, making missions safer for the warfighter. Companies with deep AI expertise, like Lockheed Martin, have been working to extend these capabilities for the men and women in the field.
“From an AI and ML perspective, you can think about it by just looking at each individual human operator. How can we make the most out of what’s in front of them to squeeze the most out of the resources and limited assets that they have on the platform and on the drones and UAVs?” Taylor asked. “We can think about each of the individual sensors that they have and how AI can help them to be able to process what we call cognitive sensing. We can think about this across any sensor modality. So there’s a breadth of applicability of bringing AI techniques to each of those sensory modalities. And it allows them to then make the most out of the limited resources that you have on each platform.”
Leveraging advanced AI tools, industry partners are working to make it possible to shift between UAV payloads while in flight – enabling the military to get the most out of every UAV and every mission. If the fog is too thick for surveillance, the UAV would switch to another sensor – ensuring that every mission generates some valuable ISR data.
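To make that payload-switching logic concrete, here is a minimal sketch in Python of how an onboard mission manager might pick a sensor modality from current conditions. The sensor names, thresholds, and `Conditions` fields are illustrative assumptions for this example, not details of any vendor’s actual implementation.

```python
from enum import Enum, auto
from dataclasses import dataclass

class SensorMode(Enum):
    ELECTRO_OPTICAL = auto()   # daytime camera imagery
    INFRARED = auto()          # thermal imaging, works in darkness
    RADAR = auto()             # synthetic aperture radar, penetrates fog and cloud
    RF_SENSING = auto()        # passive radio-frequency collection

@dataclass
class Conditions:
    visibility_km: float       # estimated optical visibility
    daylight: bool             # whether there is usable ambient light
    rf_tasking: bool           # whether the current mission step calls for RF collection

def select_sensor(cond: Conditions) -> SensorMode:
    """Pick the payload most likely to return usable ISR data right now."""
    if cond.rf_tasking:
        return SensorMode.RF_SENSING      # mission has shifted to RF sensing
    if cond.visibility_km < 1.0:
        return SensorMode.RADAR           # fog or cloud too thick for cameras
    if not cond.daylight:
        return SensorMode.INFRARED        # no light, fall back to thermal
    return SensorMode.ELECTRO_OPTICAL     # clear daytime: full-resolution imagery

# Example: heavy fog during a daytime surveillance pass
print(select_sensor(Conditions(visibility_km=0.4, daylight=True, rf_tasking=False)))
# -> SensorMode.RADAR
```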
But being able to switch between payloads and sensors on the fly is just the beginning. Industry partners are also working to make swarms of heterogeneous UAVs and manned vehicles viable and manageable for the military – effectively giving them access to a massive, varied ecosystem of sensors that can deliver comprehensive, real-time situational awareness of the entire battlefield.
According to Taylor, “When you have limited compute, even on a single asset, being able to shift in mission would be beneficial. Having that particular UAV focused on radio frequency sensing versus now needing to shift to computer vision— that’s just the beginning. It’s expanding to multiple homogeneous UAVs. It’s also expanding to multiple heterogeneous platforms— manned and unmanned, across multiple domains. So there’s a lot of foundational technology that’s necessary to make that possible.”
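As a rough illustration of what tasking a heterogeneous mix of platforms could involve, the sketch below greedily matches sensing tasks to whichever free asset has the right sensor in the right domain. The platform names, domains, and sensor labels are hypothetical, and real mission-management systems are far more sophisticated than this greedy assignment.

```python
from dataclasses import dataclass

@dataclass
class Platform:
    name: str
    domains: set      # e.g. {"air"}, {"sea_surface"}
    sensors: set      # e.g. {"eo", "sar", "rf"}
    tasked: bool = False

@dataclass
class SensingTask:
    name: str
    domain: str           # where the target is
    required_sensor: str  # modality needed to cover it

def assign_tasks(tasks, fleet):
    """Greedy assignment: give each task to the first free platform that can perform it."""
    assignments = {}
    for task in tasks:
        for platform in fleet:
            if (not platform.tasked
                    and task.domain in platform.domains
                    and task.required_sensor in platform.sensors):
                assignments[task.name] = platform.name
                platform.tasked = True
                break
        else:
            assignments[task.name] = None  # no suitable asset available
    return assignments

fleet = [
    Platform("uav-1", {"air"}, {"eo", "ir"}),
    Platform("uav-2", {"air"}, {"sar", "rf"}),
    Platform("usv-1", {"sea_surface"}, {"radar"}),
]
tasks = [
    SensingTask("coastal-imaging", "air", "eo"),
    SensingTask("emitter-geolocation", "air", "rf"),
    SensingTask("surface-track", "sea_surface", "radar"),
]
print(assign_tasks(tasks, fleet))
```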
But the benefits of pairing AI/ML with UAV platforms don’t end with the military. There are other use cases across the government where AI-enabled UAV platforms could help increase operational efficiency and even save lives.
How AI is assisting in wildfire detection
On average, there are approximately 70,000 wildfires per year in the United States, and each year an increasing number of acres are destroyed by these fires. According to the National Interagency Fire Center (NIFC), in 2022 alone, 66,255 fires burned 7,554,403 acres of land – and that was still, surprisingly, a better year than most.
Climate challenges are an existential issue that also affects national defense. With the breakthroughs in AI and ML that have transpired in recent years for national defense, experts in the field have been working to transfer those advances to environmental problems like wildfires, where the technology currently in use is badly outdated.
According to Taylor, “…The technology that our brave men and women have been using is fairly dated. The algorithms are from the 70s and they have to use this outdated software to detect the perimeter and attempt to figure out how to marshal resources to suppress wildfires. So, what we’ve been able to do is show – in less than a year’s time – how quickly we can apply AI and ML techniques to first detect that there is a wildfire, and then do what we refer to as perimeter generations, and generate the actual perimeter so that those commanders that are trying to decide how to prosecute the fire are able to do so in an efficient way.”
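To illustrate what “perimeter generation” can look like in code, here is a minimal sketch that thresholds a thermal image to detect hot pixels and wraps them in a convex hull as a coarse fire boundary. The threshold value and the convex-hull simplification are assumptions for this example; they are not the algorithms Lockheed Martin described.

```python
import numpy as np
from scipy.spatial import ConvexHull

def generate_perimeter(thermal_image: np.ndarray, hot_threshold: float = 500.0):
    """Return a rough fire perimeter as an ordered list of (row, col) pixel vertices.

    thermal_image: 2-D array of brightness temperatures (e.g. Kelvin).
    hot_threshold: pixels above this value are treated as actively burning.
    """
    # 1. Detect: which pixels are hot enough to be fire?
    fire_pixels = np.argwhere(thermal_image > hot_threshold)
    if len(fire_pixels) < 3:
        return []  # not enough detections to form a perimeter

    # 2. Generate the perimeter: wrap the detections in a convex hull as a
    #    coarse boundary a commander could overlay on a map.
    hull = ConvexHull(fire_pixels)
    return [tuple(fire_pixels[v]) for v in hull.vertices]

# Synthetic example: a 100x100 scene with a hot patch in the middle
scene = np.full((100, 100), 300.0)
scene[40:60, 45:70] = 800.0
print(generate_perimeter(scene))
```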
From structure fires and wildland firefighting to vehicle crashes and chemical spills, each call a firefighter receives presents new dangers. Data is their ally, both in eliminating the hazard and in giving firefighters the greatest chance of coming through unscathed. To give these brave men and women the edge they deserve, AI experts have been working to add more capabilities to wildfire drones to help mitigate these fires at a faster pace than current technology allows.
“…We’ve also been able to apply AI to predict the fire front over the expansion of that fire – because time is of the essence here. The quicker that you can get the result of an order of magnitude—switching from days to hours, compared to hours to minutes – the more efficient you will be [in] suppressing the fire. So we’ve shown how AI can help us there already with perimeter generation, and firefight behavior prediction. The next step is more on the reinforcement learning side to…plan for those resources. That’s the mission management part of this. And the wildfire problem has incredible analogies to our defense.”
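Fire-front prediction itself can range from physics-based models to learned ones; the toy cellular-automaton sketch below only illustrates the idea of stepping a fire perimeter forward in time under a wind bias. The spread probabilities and wind bonus are invented for this example and are not the predictive models discussed on the panel.

```python
import numpy as np

rng = np.random.default_rng(42)

def step_fire(burning, fuel, wind=(0, 1), spread_p=0.3, wind_bonus=0.4):
    """Advance a toy fire-spread model by one time step.

    burning: boolean grid of cells currently on fire.
    fuel:    boolean grid of cells with unburned fuel remaining.
    wind:    (dr, dc) direction the wind pushes the fire front.
    """
    new_burning = burning.copy()
    rows, cols = burning.shape
    for r, c in zip(*np.nonzero(burning)):
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr, dc) == (0, 0) or not (0 <= nr < rows and 0 <= nc < cols):
                    continue
                if fuel[nr, nc] and not burning[nr, nc]:
                    # Downwind neighbors ignite more easily.
                    p = spread_p + (wind_bonus if (dr, dc) == wind else 0.0)
                    if rng.random() < p:
                        new_burning[nr, nc] = True
    return new_burning

# Project the fire front a few forecast intervals ahead on a small grid.
fuel = np.ones((20, 20), dtype=bool)
burning = np.zeros((20, 20), dtype=bool)
burning[10, 2] = True                                 # ignition point on the western edge
for _ in range(5):
    burning = step_fire(burning, fuel, wind=(0, 1))   # wind blowing east
print(burning.sum(), "cells predicted to be burning")
```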
The barriers to AI adoption
In today’s digital world, technology is rapidly advancing and changing, bringing more capabilities to defense and civilian users alike. However, new technologies take time to fully implement in the field, and in this case, that means getting approvals across the military to deploy them.
According to Taylor, “…Each of the individual DoD services – down to the individual customer offices – have initiatives in place to make it easier, and to some degree faster, to work through those approvals and certifications. But when our big picture vision is blurring lines across domains— space, air, land, sea surface, subsurface, that’s different offices, programs, and services. So, being able to get lift from one to the other is really key, so that the AI breakthroughs that we have for one mission, although it represents a specific implementation, are applicable to other domains and other sensor phenomenologies and sensor modalities.”
Taylor later explained, “Other barriers include, whenever we’re talking about AI and ML, it’s a journey of trust. It’s a journey of trust with each individual user, for that program and mission. So we recognize how important it is to have the end user and end-user community, be part of that development process upfront. So they can understand the interface between the human and machine, and be able to trust that machine by having the humans provide input into the process to begin with. And how does the human build trust? They understand what the machine is doing.”