Search Contract Opportunities

Visual Position and Navigation Capability Using Computer Vision for SUAS in GPS-Denied Environments

ID: AF254-D0815 • Type: SBIR / STTR Topic

Description

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Advanced Computing and Software

The technology within this topic is restricted under the International Traffic in Arms Regulations (ITAR), 22 CFR Parts 120-130, which control the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulations (EAR), 15 CFR Parts 730-774, which control dual-use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with the Announcement. Offerors are advised that foreign nationals proposed to perform on this topic may be restricted due to the technical data under US export control laws.

OBJECTIVE: The primary objective is to develop a robust and reliable visual navigation system that ensures uninterrupted drone operations in environments where GPS signals are unavailable, degraded, or subject to jamming. The proposed technology should use advanced computer vision algorithms to analyze visual data from the drone's onboard camera, detect and recognize skylines and terrain features, and match these features against a precomputed and preprocessed satellite data repository. The system must deliver high accuracy, with geolocation precision within five meters, ensuring mission success in challenging operational scenarios.

DESCRIPTION: This topic seeks to develop a software-only visual position and navigation capability using computer vision, tailored for deployment on commercial off-the-shelf (COTS) drones operating in GPS-denied environments. The desired solution should leverage existing cameras, storage, and computational resources on these drones to provide accurate, real-time navigation and positioning without the need for additional hardware.
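The core matching step the objective describes, comparing onboard imagery against a precomputed satellite repository to recover position, can be sketched in simplified form. The snippet below is a hypothetical illustration, not part of the solicitation: it assumes a nadir camera view already aligned in scale and rotation with the reference raster, uses a brute-force normalized cross-correlation search, and applies a flat-earth pixel-to-coordinate conversion. A fielded system would instead use rotation- and scale-invariant or learned features. All function and parameter names are illustrative.

```python
import numpy as np

def ncc_locate(reference, patch):
    """Find the (row, col) offset in `reference` where `patch` best matches,
    using a brute-force normalized cross-correlation search."""
    rh, rw = reference.shape
    ph, pw = patch.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-9)  # standardized patch
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(rh - ph + 1):
        for c in range(rw - pw + 1):
            win = reference[r:r + ph, c:c + pw]
            w = (win - win.mean()) / (win.std() + 1e-9)
            score = float((w * p).mean())  # NCC in [-1, 1]
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc, best_score

def pixel_to_geo(row, col, origin_lat, origin_lon, m_per_px):
    """Simplified flat-earth conversion from a pixel offset in the reference
    raster to latitude/longitude (~111,320 m per degree of latitude)."""
    lat = origin_lat - (row * m_per_px) / 111_320.0
    lon = origin_lon + (col * m_per_px) / (111_320.0 * np.cos(np.radians(origin_lat)))
    return lat, lon
```

For example, extracting a 16x16 patch from a reference tile at offset (20, 30) and passing it to `ncc_locate` recovers that offset, which `pixel_to_geo` then converts to coordinates using the tile's known origin and ground sample distance.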
PHASE I: To substantiate that the proposer's technology is currently at an acceptable stage for a Direct to Phase II (D2P2) award, evidence of a previously completed feasibility study is expected. This study should have demonstrated the technology's ability to address key requirements: compatibility with a wide range of COTS drones; terrain feature detection and matching; data security and resilience against cyber threats; and feasibility demonstrated through simulations and field tests.

PHASE II: Develop the software-only visual position and navigation capability described above, leveraging the existing cameras, storage, and computational resources of COTS drones operating in GPS-denied environments. Deploying a visual navigation system on COTS drones significantly enhances the operational capabilities of the Air Force by providing a resilient alternative to GPS-based navigation. A software-only solution allows rapid integration across drone platforms, eliminating the need for specialized hardware modifications. The capability to maintain accurate positioning and navigation in GPS-denied environments is crucial for reconnaissance, surveillance, and logistics missions, particularly in contested or remote areas. By leveraging existing drone sensors and computing power, the proposed technology ensures cost-effective scalability and operational flexibility.
The proposed solution must be compatible with a wide range of COTS drones, utilizing their onboard cameras and computational resources to minimize additional weight and power consumption. The system should employ machine learning and computer vision techniques to achieve terrain feature detection and matching. It must be capable of operating under diverse environmental conditions, including urban canyons, dense foliage, and varied lighting. Additionally, the software should provide easy integration through an API, supporting rapid deployment and updates, and ensure data security and resilience against cyber threats. The solution should demonstrate feasibility through simulations and field tests, showcasing the system's performance and reliability in relevant operational scenarios, as well as integration with the Android Tactical Assault Kit (ATAK).

PHASE III DUAL USE APPLICATIONS: The expected Phase III effort would involve further development, testing, and refinement of the software-only visual position and navigation capability. This entails optimizing the software to leverage existing cameras, storage, and computational resources on COTS drones, ensuring compatibility with a wide range of drone platforms; employing machine learning and computer vision techniques for terrain feature detection and matching under diverse environmental conditions such as urban canyons, dense foliage, and varied lighting; providing easy integration through an API that supports rapid deployment and updates; and ensuring data security and resilience against cyber threats. The expected TRL at Phase III entry is around TRL 6-7, indicating that the technology has been demonstrated in a relevant environment and is ready for deployment in an operational environment.
Phase III would also entail successful integration of the software with various COTS drone platforms, as well as compatibility with the Android Tactical Assault Kit (ATAK). Transition planning would need to address regulatory compliance, such as ensuring the software adheres to data privacy and security regulations, along with the development of a business or transition plan outlining the strategy for commercialization or broader adoption of the technology. This includes identifying potential markets, partners, and customers, as well as a plan for ongoing support, maintenance, and updates. Collaboration with drone manufacturers and operators would be crucial to ensure seamless integration and adoption.

Potential commercial and private-industry applications include precision navigation, reconnaissance, search and rescue, and commercial vision metadata tagging. The technology could guide autonomous drones for surveillance and tactical support in military operations, improving situational awareness and mission success. For first responders, it could monitor and guide autonomous search and rescue equipment, improving safety and efficiency. In commercial vision metadata tagging, location accuracy within feet or meters is required for applications such as image geotagging and object tracking. By addressing these needs, the proposed technology could provide a resilient alternative to GPS-based navigation across multiple industries. Improved vision and location capability without GPS could also yield significant energy savings, reduce costs, and help operators avoid sensitive areas such as airports or comply with altitude restrictions.

REFERENCES:
1. J. Kim, T. Gregory, J. Freeman and C. M. Korpela, "System-of-Systems for Remote Situational Awareness: Integrating Unattended Ground Sensor Systems with Autonomous Unmanned Aerial System and Android Team Awareness Kit," SPIE Defense + Security, Baltimore, MD, USA, 2014, pp. 90750A-90750A-12.
2. F. Cappello, S. Ramasamy and R. Sabatini, "A low-cost and high performance navigation system for small RPAS applications," Aerospace Science and Technology, vol. 58, pp. 529-545, 2016, doi: 10.1016/j.ast.2016.09.017.
3. A. Appleget, J. Watson, J. Gray and C. Taylor, "Navigating a sUAS without GNSS," Inside GNSS, May 29, 2023. [Online]. Available: https://insidegnss.com/navigating-a-suas-without-gnss/
4. J. Kim, K. Lin, S. M. Nogar, D. Larkin and C. M. Korpela, "Detecting and Localizing Objects on an Unmanned Aerial System (UAS) Integrated with a Mobile Device," 2021 International Conference on Computing, Networking and Communications (ICNC), San Diego, CA, USA, 2021, pp. 546-550.
5. M. Uijt de Haag, S. Huschbeck and J. Huff, "sUAS Swarm Navigation using Inertial, Range Radios and Partial GNSS," 2019 IEEE/AIAA 38th Digital Avionics Systems Conference (DASC), San Diego, CA, USA, 2019, pp. 1-8, doi: 10.1109/DASC43721.2019.9091029.
6. La and M. Matson, "ATAK Integration through ROS for Autonomous Air-ground Team," 2021 IEEE International Systems Conference (SysCon), Vancouver, BC, Canada, 2021, pp. 1-5, doi: 10.1109/SysCon48628.2021.9476676.

KEYWORDS: SUAS, ALT-PNT, Computer Vision on SUAS
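The Phase II and III requirements above call for integration with the Android Tactical Assault Kit (ATAK). ATAK consumes position reports as Cursor-on-Target (CoT) XML events, so one plausible integration path is for the vision-based navigator to emit CoT self-position messages. The sketch below is a hypothetical illustration of that message shape, not an interface defined by the solicitation; the event type code and field values are assumptions.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone

def make_cot(uid, lat, lon, hae_m, err_m, stale_s=60):
    """Build a minimal Cursor-on-Target (CoT) position event, the XML
    message format ATAK consumes for map objects."""
    now = datetime.now(timezone.utc)
    iso = lambda t: t.strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    ev = ET.Element("event", {
        "version": "2.0",
        "uid": uid,
        "type": "a-f-A-M-F-Q",   # illustrative type code (friendly UAS)
        "how": "m-g",            # machine-derived position source
        "time": iso(now),
        "start": iso(now),
        "stale": iso(now + timedelta(seconds=stale_s)),
    })
    ET.SubElement(ev, "point", {
        "lat": f"{lat:.7f}", "lon": f"{lon:.7f}",
        "hae": f"{hae_m:.1f}",            # height above ellipsoid, meters
        "ce": f"{err_m:.1f}",             # circular error, meters
        "le": f"{err_m:.1f}",             # linear error, meters
    })
    return ET.tostring(ev, encoding="unicode")
```

A navigator meeting the topic's five-meter goal would, under this sketch, report `ce` and `le` of 5.0 or better; the resulting string would be sent to an ATAK device or TAK server over its configured transport.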

Overview

Response Deadline
June 25, 2025 (Past Due)
Posted
May 12, 2025
Open
May 12, 2025
Set Aside
Small Business (SBA)
Place of Performance
Not Provided

Program
SBIR Phase I / II
Structure
Contract
Phase Detail
Phase I: Establish the technical merit, feasibility, and commercial potential of the proposed R/R&D efforts and determine the quality of performance of the small business awardee organization.
Phase II: Continue the R/R&D efforts initiated in Phase I. Funding is based on the results achieved in Phase I and on the scientific and technical merit and commercial potential of the project proposed in Phase II. Typically, only Phase I awardees are eligible for a Phase II award.
Duration
6 Months - 1 Year
Size Limit
500 Employees
On 5/12/25, the Department of the Air Force issued SBIR / STTR Topic AF254-D0815, Visual Position and Navigation Capability Using Computer Vision for SUAS in GPS-Denied Environments, with responses due 6/25/25.

Documents

Posted documents for SBIR / STTR Topic AF254-D0815


Contract Awards

Prime contracts awarded through SBIR / STTR Topic AF254-D0815

Incumbent or Similar Awards

Potential Bidders and Partners

Awardees that have won contracts similar to SBIR / STTR Topic AF254-D0815

Similar Active Opportunities

Open contract opportunities similar to SBIR / STTR Topic AF254-D0815