by Chris Thatcher

 

How do you train an artificial intelligence algorithm? Feed it data: lots and lots of data.

“For the AI to work well, it needs variety,” explained Dr. Mélanie Breton, a defence scientist with the Tactical Surveillance and Reconnaissance section of Defence Research and Development Canada (DRDC), as first a Light Armoured Vehicle (LAV 6) and then a Tactical Armoured Patrol Vehicle (TAPV) executed circles and tight figure eights in front of her and Major Rick Parent, project director for Joint Fires Modernization.

As the combat vehicles cut patterns through the snow in a clearing at the edge of the Canadian Forces Base Valcartier training area, multiple cameras tracked their movement. On a platform attached to a large van, two Coral-C and Coral-CRC handheld thermal imaging sensors, six infrared (IR) and electro-optical (EO) cameras, and a half dozen low-cost commercial cameras – “most under $30, one on eBay for $200, and all with very nice images,” said Breton – panned back and forth while, overhead, two commercial quadcopters with IR and EO lenses dipped and arced around the vehicles.

The elaborate dance of vehicles and cameras was part of a data collection exercise for a DRDC Valcartier project known as Joint Algorithmic Warfighter Sensors (JAWS) under a larger Canadian Army science and technology portfolio called Empowered Dispersed Operations in the Digital Age.

As a light fog interspersed with snow descended over the training area that mid-February morning, scientists gradually built a dataset of Army vehicles and personnel imagery. Once the LAV and TAPV from 1er Battalion Royal 22e Régiment and 5e Régiment d’artillerie légère du Canada completed their circuits, a G-Wagon traced a similar pattern, followed by a snowmobile and eventually dismounted soldiers wearing snowshoes and pulling a sled. 

The process would be repeated in the afternoon, giving Breton and her DRDC Valcartier Research Centre colleagues an array of images of each platform from multiple angles and under different light and weather conditions. The soldiers from 1erR22eR repeated their circuit through the clearing and against a treeline in winter white gear and in CADPAT, changing from in-service to foreign weapons as they went to present a different profile.

“What we want to know is whether we can differentiate between friend and foe,” explained Breton. 

By amassing imagery of Canadian vehicles and teaching the AI algorithm to recognize and correctly identify the make and model, even when weapons are covered or fuel and water cans are strapped to the sides, DRDC is creating a dataset of “friendly” images.

The imagery of soldiers clambering over snow banks and adopting watch positions is intended to teach the AI to distinguish human forms that are walking, kneeling or lying prone – at present, a prone soldier is almost indistinguishable from an animal to an AI algorithm. 

“All of these cameras have different image qualities and resolutions,” said Breton. “In real life, we don’t always have a perfect lab image, so if we want the AI to work on these types of images, we need them in the datasets.” 

Despite the fluctuating temperatures that periodically disrupted the performance of the sensitive equipment, she said the field experience was invaluable for the researchers. “In AI, we don’t want to be the disconnected scientist back in our lab. We need to be close to the real data, close to the user need. The AI field goes so fast, so we have to be talking to everyone.”

The next step is labelling the information in each image, a “mind-numbing” task of manually identifying vehicles, soldiers and weapons. An AI algorithm won’t recognize something it hasn’t seen before, said Valerie Lavigne, a defence scientist who has worked with teams of military personnel that have spent hours painstakingly drawing boxes around image objects and labelling them. “It can extrapolate, but you won’t know for sure.”
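For readers unfamiliar with that labelling work, a single annotated image in the widely used COCO-style convention might look like the sketch below. The file name, category list, capture metadata and pixel values are all hypothetical illustrations, not DRDC’s actual schema.

```python
# Hypothetical COCO-style record for one labelled image. Every name and
# value here is illustrative; DRDC's real annotation schema is not public.
annotation_record = {
    "image": {
        "id": 1042,
        "file_name": "ir_cam3_tapv_frame_00217.png",  # hypothetical file
        "width": 640,
        "height": 512,
        "sensor": "IR",           # capture metadata preserves the variety
        "weather": "light_snow",  # of sensors and conditions Breton describes
    },
    "annotations": [
        {
            "image_id": 1042,
            "category_id": 2,              # e.g. 2 = "TAPV"
            "bbox": [212, 148, 196, 117],  # [x, y, width, height] in pixels,
        }                                  # the box a soldier drew by hand
    ],
    "categories": [
        {"id": 1, "name": "LAV 6"},
        {"id": 2, "name": "TAPV"},
        {"id": 3, "name": "soldier_prone"},
    ],
}
```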

Soldiers with expertise in each platform have spent a few weeks at a time helping with the machine learning process – “no more than two weeks, we care about their mental health,” she quipped. As the algorithm computes an output, the scientists are able to adjust its response with the soldiers’ correct answer. “We do that a couple million times, back and forth, until it gives the right answer every time.”
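What Lavigne describes is supervised learning: each “back and forth” is one pass of a training loop that compares the algorithm’s output to the human-provided answer and adjusts the model’s weights to shrink the difference. A minimal sketch in PyTorch, with a toy classifier standing in for a real object-detection network (all names and sizes here are illustrative):

```python
import torch
from torch import nn

# Toy stand-in for the real model: a fielded system would use an
# object-detection network, not a single linear layer over 64x64 images.
model = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 4))  # 4 classes
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def training_step(image_batch: torch.Tensor, correct_labels: torch.Tensor) -> float:
    """One 'back and forth': score the model's answer against the
    soldiers' correct answer and nudge the weights toward it."""
    optimizer.zero_grad()
    predictions = model(image_batch)             # the algorithm's output
    loss = loss_fn(predictions, correct_labels)  # how wrong it was
    loss.backward()                              # compute the adjustment
    optimizer.step()                             # apply it
    return loss.item()

# Repeated millions of times over the labelled dataset, this loop is the
# "couple million times, back and forth" until the answers converge.
```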

Lavigne has been developing a way to “gamify” the process, perhaps like Pokémon, so that soldiers earn points or rewards as they box and label the data. 

Ultimately, though, the Army will need a less manpower-intensive way to tag data. “The most capable commercial AI applications have teams of people manually labelling millions of images on a daily basis. We simply do not have the capacity to do that,” said Lieutenant-Commander Mike Nelson, an innovation officer for the Directorate of Land Requirements. “We will have to figure out how to train AI efficiently so that it frees up our soldiers to work on higher order tasks, rather than increase the burden on an already resource-strained army.”

Photo: DRDC

WATCHING THE WATCHERS

As the drones recorded the vehicles and soldiers navigating the training area, off to the side, five EO cameras in two additional boxes tracked the drones, panning and tilting with their movements. With the abundance of unmanned systems now flying above the modern battlefield, DRDC is exploring ways to autonomously and automatically identify friend from foe – or even fowl – and “took the opportunity to be here to capture and track the signatures of the drones,” said Guillaume Gagne, a defence scientist on a project called Defeat Autonomous Systems.

Like Breton and Lavigne, his small team was nestled in a van, fine-tuning the tracking software so it would consistently select one of the two drones and lock on to its flight. “We can switch from one feed to another to perform tracking and select the target we want to track,” said Gagne as he toggled between sensor feeds. 
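DRDC’s tracking software is not public, but the select-and-lock-on workflow Gagne describes resembles the single-object trackers in open-source computer vision libraries. A rough sketch using OpenCV’s CSRT tracker (requires the opencv-contrib package; the feed indices, window names and key bindings are assumptions for illustration, and the exact constructor name varies across OpenCV versions):

```python
import cv2

# Two sensor feeds watching the same airspace; device indices are illustrative.
feeds = [cv2.VideoCapture(0), cv2.VideoCapture(1)]
active = 0  # which feed the operator has selected

def lock_on(frame):
    """Operator draws a box around the drone; the tracker locks on."""
    bbox = cv2.selectROI("select drone", frame)
    tracker = cv2.TrackerCSRT_create()  # name differs in some OpenCV versions
    tracker.init(frame, bbox)
    return tracker

ok, frame = feeds[active].read()
tracker = lock_on(frame)

while True:
    ok, frame = feeds[active].read()
    if not ok:
        break
    found, bbox = tracker.update(frame)  # follow the target frame to frame
    if found:
        x, y, w, h = map(int, bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    key = cv2.waitKey(30)
    if key == ord("s"):  # switch feeds and re-acquire, as Gagne describes
        active = 1 - active
        ok, frame = feeds[active].read()
        tracker = lock_on(frame)
    elif key == 27:  # Esc to quit
        break
```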

The data will be incorporated into a counter-UAS system to address the threat of mini and micro drones; the Army’s Ground Based Air Defence project will likely tackle the threat of larger UAS. Such a system could include jamming via electronic warfare, taking control through cyber systems, or kinetic countermeasures to disable the platform such as dispersed shot, air bursts or high-energy lasers, he said. An AI-trained algorithm able to identify and track drones with a passive EO sensor rather than active radar could mean a distinct advantage.

 

LINKING SENSOR TO SHOOTER

The application of artificial intelligence to help reduce the cognitive burden on soldiers tasked with detection and identification responsibilities has attracted considerable interest across the Canadian Armed Forces (CAF). The data collection exercise, conducted under the JAWS project, was sponsored by the Joint Fires Modernization (JFM) project to understand and validate how AI can be incorporated into joint fires, and to support other sensor-centric projects such as ISR Modernization.

“It is the first step in automating the sensor-to-shooter link and the fires decision-action cycle,” said Parent, a recent artillery battery commander and forward observation officer (FOO). “This is the first task our operators conduct: acquire the target. They are on the battlefield supporting manoeuvre forces, and by getting a location, type, size, activity and degree of protection of the adversary’s systems, those become targets, spurring into action the analysis and coordination to execute a strike. That is our bread and butter as fire supporters, the observation role to then decisively coordinate lethal and non-lethal effects.”

However, the process of observing, correctly identifying, and then communicating a target is prone to human error. At present, a FOO, whether dismounted with all of their kit on their backs or mounted in an observation post vehicle, will flip through an Aide-Memoire booklet with pictures of vehicles and weapons, geo-locate the target on a paper map, and relay this target information as a call for fire to a signaller within the party, who will in turn copy it down prior to sending it by voice to the artillery unit for prosecution. There, the call for fire is written down again and manually entered into a ballistic computer.

“That is four air gaps for one piece of tactical information,” noted Parent. “Those are all steps that risk introducing human error, resulting in delays or worse – rounds landing not on the intended target.”

“And that presupposes that the communications are working well, that there is a common language without accents,” added Christian Légère, a former signaller and now the JFM deputy project manager in the Assistant Deputy Minister Materiel Group. “The potential for introducing errors at each of those steps is multiplied every time you have to manually do one of these tasks.” 

Said Parent: “If we can use artificial intelligence technology to augment the operators in their detection, recognition, and identification tasks, we are going to be that much more efficient, accurate and lethal. And it’s not only artillery observers; it’s reconnaissance, snipers, radars – anything that is looking out into a sector of the battlespace. Why not have the machine augment the user to drop at least one of the air gaps?”
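Dropping an air gap means, in practice, replacing a voice relay with a structured digital message that carries the target elements Parent lists – location, type, size, activity and degree of protection – from the observer’s sensor straight to the ballistic computer. A hypothetical sketch of such a payload (this is not an actual CAF or NATO message format):

```python
from dataclasses import dataclass

@dataclass
class TargetReport:
    """Hypothetical structured call-for-fire payload. Field names mirror
    the target elements Parent lists; the format is invented for
    illustration, not an actual CAF or NATO standard."""
    grid: str          # location, e.g. a grid reference from a laser range finder
    target_type: str   # e.g. "APC", from the observer or an AI-assisted cue
    size: int          # number of vehicles or personnel observed
    activity: str      # e.g. "stationary", "moving west"
    protection: str    # e.g. "in the open", "dug in"
    confidence: float  # 0.0-1.0, how sure the observer or algorithm is

# Built once at the sensor and passed machine-to-machine, a record like
# this removes the transcription steps where voice relays invite errors.
report = TargetReport("18T WL 8244 9059", "APC", 3, "stationary", "in the open", 0.92)
```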

Photo: Stephen Berry

WITH AND WITHOUT DATA

To date, the focus of AI algorithms in target acquisition has been primarily on visual-spectrum applications such as automated EO full motion video (FMV) analysis for close-range intelligence collection on civilian-pattern vehicles and individuals. With the proliferation of thermal sensor capabilities across the Army, and the difficulty of “hiding or masking” human and vehicle signatures, AI could dramatically change even the most basic tasks. “It is incredibly expensive and takes advanced technology and discipline to hide in the IR spectrum. This is why we absolutely rely on our modern thermal optics on operations,” noted Parent.

The tedious task of conducting turret watch within an armoured vehicle would be less fatiguing for crews if AI were able to assist with detection and trigger an audible alert when it detected an anomaly, he suggested. “The thermal in a LAV is really good. With that initial detection, it could cue the observer to look in, and if he or she wasn’t able to tell exactly what it is, then they could alert another asset like a UAS or vehicle on another hill.”
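In software terms, that turret-watch aid amounts to running a detector over every thermal frame and sounding an alert when a detection crosses a confidence threshold. A minimal sketch, with a crude hot-spot heuristic standing in for a model trained on DRDC’s labelled thermal datasets (the file name and thresholds are illustrative):

```python
import cv2
import numpy as np

ALERT_THRESHOLD = 0.8  # illustrative confidence cut-off

def detect_anomalies(thermal_frame: np.ndarray) -> list[tuple[float, tuple]]:
    """Placeholder detector: flag unusually hot regions by thresholding.
    A fielded system would run a trained AI model here instead."""
    _, hot = cv2.threshold(thermal_frame, 220, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(hot, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [(0.9, cv2.boundingRect(c)) for c in contours if cv2.contourArea(c) > 50]

cap = cv2.VideoCapture("thermal_feed.mp4")  # hypothetical recorded feed
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for confidence, box in detect_anomalies(gray):
        if confidence >= ALERT_THRESHOLD:
            # "\a" is the terminal bell: the audible cue that tells the
            # crew to look in, rather than stare at the screen all watch.
            print("\a ALERT: possible contact at", box)
```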

In fact, with AI assistance, ground-based EO/IR target acquisition systems that are typically collocated with soldiers could be dispersed as unattended ground sensors to enhance force protection.

While AI machine learning and computer vision algorithms currently depend on large quantities of data, the JAWS team isn’t assuming unfettered access to data. In fact, they are actively considering the realities of a conflict with a near-peer adversary and a battlespace where the CAF is not necessarily “swimming in sensors, drowning in data,” but rather starved for data due to adversarial counter-ISTAR capabilities and unknowns regarding the physical environment, said Nelson.

“Our adversaries have robust Anti-Access Area Denial (A2AD) capabilities in the space, air, land, and maritime domains, as well as the information domain. In this reality, we are not likely to have the ability to freely collect the large quantities of data that we have become accustomed to over the last 20 years. As we look to build AI into our in-service and future sensor suites, we have to take this into account. How can AI operate reliably in these environments given the ‘fog’ of the future operating environment?”

For projects like JFM and ISR Mod, leveraging DRDC’s work in embedded AI algorithms and deep machine learning could be a way to future-proof the sensor equipment each will procure in the coming years. 

Among other things, JFM will replace the Coral-CRC with a solution that Parent hopes will consolidate a modern thermal imager, an EO camera, a laser range finder, a laser marker and fire control software into a single form factor, while also reducing the number of batteries and cables strapped to the observer. At the same time, a project to modernize the night vision systems for infantry, snipers, armoured reconnaissance and others is exploring many of the same capabilities for the larger Army.

Both projects are talking to the same vendors, Parent noted, which could result in a common solution for all or variants of a common system that can be tailored for each community. Though a final decision is still a few years off as the two projects move through the procurement process, one question they will have to answer soon is how to incorporate artificial intelligence applications: Is it embedded in the system or tethered/edge-computed? The former might provide instant analysis but would be a large power draw.

“For JFM, we might give the users the option,” said Parent. “They could have it embedded within their optic, but they also could have it on their tablet, almost like a picture-in-picture displayed as an overlay on our touch-driven command and control software (similar to Google Maps). This way you can pop up a UAS feed and it would have the AI algorithm running while you leverage the tablet for its processing power.”

Whatever thermal sensor systems the Army eventually acquires, the datasets built by DRDC will be part of an AI solution. “When they are ready, we will make sure to provide the datasets,” said Breton of future vendor interest. “It’s multi-use. If a private company comes to us and says, ‘we have a good AI model,’ we’ll have a dataset to test it against. We have a huge library of datasets. No one has that yet. And we are learning through the process. Even now we are doing things differently than in the fall.” 

Like most industrial sectors struggling to understand how best to apply AI, the Army will face a number of challenges. But both Nelson and Parent believe it’s critical the Army not wait to solve every one before deploying AI operationally. 

“Typically, we are concerned with how we employ at scale across the entirety of the Army. This is important to consider, but the current operational tempo allows us an opportunity to learn by doing,” said Nelson. “So, let’s get the technology into the hands of our soldiers on exercises and operations now, and have our scientists and engineers work with them in real time to improve the technology, which, even in its current state, is sufficiently mature to deliver value. It is only through active experimentation that we will gain the knowledge required to future-proof the sensors we are procuring; to ensure that not only are our systems AI-ready or AI-enabled, but that they can evolve with best-of-breed AI technologies over time.”