OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Trusted AI and Autonomy

The technology within this topic is restricted under the International Traffic in Arms Regulations (ITAR), 22 CFR Parts 120-130, which control the export and import of defense-related materials and services, including the export of sensitive technical data, or the Export Administration Regulations (EAR), 15 CFR Parts 730-774, which control dual-use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with the Announcement. Offerors are advised that foreign nationals proposed to perform on this topic may be restricted due to the technical data under U.S. Export Control Laws.

OBJECTIVE: DTRA seeks an innovative AI-driven solution for detecting, tracking, and guiding an unmanned aerial system (UAS) through airborne plumes generated during explosive tests. The system should integrate data from onboard and ground systems, including LIDAR and radar, to autonomously identify plume characteristics, maintain situational awareness in changing environmental conditions, and provide precise steering capabilities that optimize data collection. The software should be platform and sensor suite agnostic. The innovation seeks to improve data collection during testing, with potential expansion to operational use cases requiring plume tracking. Recent advances in AI/ML and computer vision have significantly improved autonomous tracking and classification capabilities, making real-time operation in hazardous environments feasible [1,2].

DESCRIPTION: Current methods for tracking plumes in explosive test environments rely on manual UAS operation, which is inefficient and susceptible to human error. Explosive tests incorporate numerous other sensor systems, such as LIDAR and radar, that are currently unused for UAS operations.
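The topic does not prescribe a fusion method, but one minimal, hypothetical sketch of how currently unused ground-sensor tracks could be combined with onboard estimates is inverse-variance weighted fusion of plume-centroid positions, with a heading command derived from the fused estimate. All names here (`Estimate`, `fuse`, `steer_heading`) are illustrative assumptions, not part of the topic:

```python
import math
from dataclasses import dataclass

@dataclass
class Estimate:
    # 2-D plume-centroid estimate (meters, local frame) with a scalar variance.
    # One such estimate might come from each source: onboard EO/IR, ground
    # LIDAR, ground radar. (Illustrative assumption, not specified by the topic.)
    x: float
    y: float
    var: float

def fuse(estimates):
    """Inverse-variance weighted fusion of independent centroid estimates.

    Lower-variance (more trusted) sensors dominate the fused result; the
    fused variance is smaller than any single input's, reflecting the
    benefit of combining onboard and ground sources.
    """
    wsum = sum(1.0 / e.var for e in estimates)
    x = sum(e.x / e.var for e in estimates) / wsum
    y = sum(e.y / e.var for e in estimates) / wsum
    return Estimate(x, y, 1.0 / wsum)

def steer_heading(uas_xy, fused):
    """Commanded heading (radians) from the UAS position toward the fused centroid."""
    return math.atan2(fused.y - uas_xy[1], fused.x - uas_xy[0])
```

Because each sensor contributes only a position and an uncertainty, this shape of interface is one way to keep the fusion layer platform and sensor agnostic, as the topic requires; a fielded system would replace the scalar variances with full covariance tracking (e.g., a Kalman filter).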
Furthermore, rapidly changing weather conditions and unpredictable plume behavior make consistent data collection challenging. Advances in AI/ML and sensor fusion techniques allow for real-time plume detection and classification, leveraging computer vision algorithms trained on EO/IR, LIDAR, and hyperspectral imaging data [3,4]. The proposed system should:

- Provide a platform- and sensor-agnostic algorithm that fuses multiple data inputs quickly enough to inform UAS operations.
- Utilize AI/ML to distinguish relevant plumes from background interference (e.g., dust, clouds, smoke) [5].
- Enable autonomous UAS flight control for real-time plume tracking and sampling.
- Provide robust data security and transmission capabilities for real-time operational use.
- Be adaptable to various UAS platforms and modular for integration into existing explosive test frameworks.

PHASE I: Proof of Concept. Create a technical roadmap for integrating AI/ML-driven plume tracking capabilities with UAS platforms. The proposal should also include a computer vision model for real-time plume identification capable of discriminating visually similar features, such as clouds or ground terrain. The framework will explain how additional sensor data (LIDAR, radar) will be integrated to inform UAS operations. A defined data security framework is needed to protect collected information, which may later be used for model development. Initial development may be based on simulation testbed activities that validate the AI-driven tracking algorithms. Deliverables will include a proof-of-concept report, feasibility study, preliminary AI/ML model, and system architecture plan. A plan outlining the approach for scaling the system to meet Phase II requirements should also be submitted.

PHASE II: Prototype Development. Demonstrate a scaled-down working prototype that integrates AI/ML algorithms, onboard and ground sensors, and real-time UAS control.
This should be done through live-scale testing in a controlled environment, demonstrating tracking capabilities across varied plume conditions. Metrics for AI performance and optimization are required, along with proposed timelines and approaches for improvement. Considerations for different environments and weather conditions will be presented. Finally, data collection and transmission from the UAS to ground control systems, as well as data management for model training, will be demonstrated. Deliverables will include a fully functioning prototype (using a UAS of the performer's choice), live demonstration results, a detailed AI model evaluation, and a sensor integration report. A design plan should also be submitted outlining the approach for scaling the system to meet Phase III requirements.

PHASE III DUAL USE APPLICATIONS: Demonstrate system capabilities in a full-scale test event at either a DTRA explosive test site or a suitably similar test site and environment. Showcase finalized system integration for seamless operation within existing test range infrastructure and UAS equipment. Develop commercialization pathways, including applications in hazardous material detection, industrial emissions monitoring, and emergency response. Deliverables: fully operational system, final validation report, and commercialization and transition plan.

REFERENCES:
1. Zhang, H., & Li, X. "In-depth review of AI-enabled unmanned aerial vehicles." Springer, 2024. https://link.springer.com/article/10.1007/s44163-024-00209-1
2. Smith, J., & Doe, A. "Recent Advances in Artificial Intelligence and Computer Vision for Unmanned Aerial Vehicles." ResearchGate, 2021. https://www.researchgate.net/publication/355241822_Recent_Advances_in_Artificial_Intelligence_and_Computer_Vision_for_Unmanned_Aerial_Vehicles
3. Brown, T., & Green, L. "Systematic literature review of AI algorithms applied to unmanned aerial vehicles." Taylor & Francis, 2024.
   https://www.tandfonline.com/doi/full/10.1080/19479832.2024.2382737?af=R
4. Redmon, J., & Farhadi, A. "Deep Learning Computer Vision Algorithms for Real-time UAVs On-board Camera Image Processing." arXiv preprint arXiv:2211.01037, 2022. https://arxiv.org/abs/2211.01037
5. Patel, R., & Kim, J. "Deep Learning for UAV-based Object Detection and Tracking: A Survey." arXiv preprint arXiv:2110.12638, 2021. https://arxiv.org/abs/2110.12638

KEYWORDS: predictive analytics, information environment, force protection