Pentagon seeks AI target recognition for counter-drone weapons

The Pentagon is seeking AI-enhanced target recognition for close-in counter-drone systems, initially focused on remote weapon stations including CROWS. The effort links computer vision, sensor fusion, prototype testing, land and maritime firing, and future small-arms integration.


IN Brief:

  • The Pentagon is seeking AI-enhanced aided target recognition for close-in counter-UAS weapons.
  • The first phase focuses on remote weapon stations, including the Common Remotely Operated Weapon Station.
  • Prototype requirements include land and maritime firing, drone detection, tracking, engagement, and future pathways for small arms.

The Pentagon is seeking AI-enhanced target recognition for close-in counter-drone systems, moving artificial intelligence deeper into the weapon engagement chain for land, maritime, and potentially dismounted operations.

The C-UAS Close-In Kinetic Defeat Enhancement project is focused on aided target recognition, or AiTR. The programme is intended to use artificial intelligence, machine learning, and computer vision to help systems detect drones, distinguish them from non-threats such as birds, and accelerate engagement decisions. The first phase centres on remote weapon stations, including the Common Remotely Operated Weapon Station, or CROWS, already fitted to a wide range of vehicles.

The requirement is being managed through the Defense Innovation Unit, with responses due by May 15. Prototypes must improve the ability of current remote weapon stations to detect, track, and engage Group 1 and Group 2 UAS, covering drones weighing 55lb and under. The system must detect at ranges beyond 600m, engage at a minimum of 100m, and address drones moving at speeds of at least 30 metres per second. It must also support firing in land and maritime environments.

That fielding requirement separates the project from laboratory counter-drone demonstrations. The C-UAS market is crowded with sensors, jammers, interceptors, directed-energy concepts, and kinetic systems, many of which perform well in controlled trials. Fewer have proven themselves across dust, vibration, clutter, moving platforms, ship motion, weather, electromagnetic interference, safety constraints, and mixed-threat environments. The Pentagon is looking for target recognition that can be integrated with weapon systems already in service.

CROWS offers an attractive starting point because it is widely deployed. The turret allows personnel to operate weapons from inside protected vehicles and has been integrated across multiple platform types. Adding AI-assisted recognition to that installed base could provide a scalable retrofit route without buying a new counter-UAS vehicle for every unit. Integration will still require sensors, processing hardware, software, fire-control interfaces, power, cabling, ruggedised housings, and safety interlocks to fit within existing weapon-station architectures.

The engagement problem is technically difficult. Small drones present low visual and radar signatures, irregular flight profiles, rapid manoeuvres, and short engagement windows. They can be confused with birds, debris, background clutter, or benign aircraft. An AI model has to support speed without eroding discrimination standards. False negatives allow drones through; false positives create safety, legal, and rules-of-engagement problems.

The project also shows AI moving from intelligence analysis into tactical equipment. IN Defence recently covered Pentagon agreements to bring AI vendors into classified environments in "Pentagon opens classified networks to AI vendors", and a separate US Army effort around AI cyber defence in "US Army accelerates AI cyber defence". The C-UAS effort places AI much closer to weapons, operators, and fast-moving targets.

That shift changes the production standard. A model hosted in a secure cloud can be updated, monitored, and governed through network controls. An AI system mounted on a vehicle or ship must survive vibration, temperature extremes, electromagnetic exposure, unreliable connectivity, and battlefield maintenance. It needs a controlled update process, operator interface, confidence scoring, and safe fallback behaviour when sensors are degraded.
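The discrimination tradeoff and fallback behaviour described above can be sketched in a few lines. This is a purely illustrative outline, not the programme's actual fire-control logic: the thresholds, field names, and action labels are all hypothetical, and the key design choice it encodes is that the model only recommends or cues, with a human retaining fire authority and a manual fallback when sensors degrade.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- illustrative values, not programme requirements.
ENGAGE_THRESHOLD = 0.95   # high bar: false positives carry safety and ROE costs
CUE_THRESHOLD = 0.60      # lower bar: cue the operator rather than miss a threat
MIN_SENSOR_HEALTH = 0.5   # below this, hand control back to the operator

@dataclass
class Track:
    drone_confidence: float  # model's belief the track is a Group 1/2 UAS
    sensor_health: float     # 0..1 summary of sensor degradation

def recommend_action(track: Track) -> str:
    """Map a fused track to a recommendation; the human retains fire authority."""
    if track.sensor_health < MIN_SENSOR_HEALTH:
        return "fallback_manual"       # degraded sensors: safe fallback
    if track.drone_confidence >= ENGAGE_THRESHOLD:
        return "recommend_engage"      # high confidence: present for approval
    if track.drone_confidence >= CUE_THRESHOLD:
        return "cue_operator"          # uncertain: flag, but do not recommend fire
    return "ignore"                    # likely bird, debris, or clutter
```

Setting the two confidence thresholds apart is what separates "too slow" from "too trigger-happy": lowering ENGAGE_THRESHOLD speeds engagement but raises the false-positive risk the article flags.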

Maritime firing creates an additional test case. Shipboard counter-drone engagements involve horizon effects, wave motion, salt corrosion, crowded waterways, radar clutter, and deck safety. A system that works on a static vehicle may need adaptation for patrol craft, support ships, or naval installations. A common aided-recognition layer across land and maritime systems would need strong calibration, configuration, and environmental qualification processes.

A later small-arms phase would be more ambitious again. Integrating aided target recognition with dismounted weapons raises weight, power, optics, display, latency, safety, and human-control questions. Soldiers already carry heavy loads, and any AI targeting aid would need to be compact, intuitive, rugged, and reliable enough to earn its place. It would also need to support the operator without creating distraction during fast engagements.

For manufacturers, the opportunity spans a large installed base. Remote weapon stations, tactical vehicles, base-defence systems, and naval platforms could all become retrofit candidates if the technology proves reliable. Production demand would fall across cameras, thermal imagers, processors, weapon-station interface kits, software licences, training simulators, range testing, and sustainment support.

The counter-drone market is also moving away from single-solution architectures. Jamming remains useful, but it cannot always be used. Missiles and interceptors can be expensive against cheap drones. Guns and remote weapon stations offer lower-cost kinetic defeat if detection and fire control can keep pace. AI-assisted recognition links existing weapons to a threat set that is becoming faster, cheaper, and more numerous.

Field realism will determine the value of the prototypes. The Pentagon’s land and maritime firing requirement sets a useful threshold, because close-in drone defeat has little margin for elegant but fragile systems. Suppliers now have to prove that software can turn existing weapon stations into reliable drone-defeat tools without creating an integration and sustainment burden that units cannot absorb.

