
Image Labeling for Sparse and Rare Targets in Satellite Imagery for Computer Vision Applications

ID: OSD252-D02 • Type: SBIR / STTR Topic

Description

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Trusted AI and Autonomy

OBJECTIVE: Refine an existing image labeling regime to accelerate the production of datasets containing rare and diverse targets.

DESCRIPTION: **All work will be conducted on classified data at the TS/SCI level.** Computer vision detector algorithms are developed using machine learning from massive quantities of labels corresponding to relevant objects in overhead satellite imagery. Machine-assisted image labeling can quickly create and supply an algorithm with largely annotated datasets; however, this method tends to strengthen highly represented samples at the expense of examples in the tail of the distribution. Furthermore, machine-assisted labeling methods struggle when applied to challenging objects of interest (OOIs) such as the following:

- Scarcity - The OOI may have only one example per image, or may have been observed in only a limited number of geographic locations, such that the sample size is too small to initiate a machine-assisted process.
- Lack of Distinctiveness - The OOI has few distinguishable features (e.g., shipping crates), or is easily confused with similar objects, such as derivatives of a more common object.
- Occlusion - The OOI may be only partially visible in the image due to occlusion or intentional concealment.
- Natural Obfuscation - The OOI may be hard to distinguish from naturally occurring earth features. Examples include disturbed earth, construction activity, trenching, and berm building.

In all of the rare and challenging OOI situations above, there are too few object representations, only partial representations, or inconsistent representations to initiate a detector- or feature-extractor-based method for machine-assisted approaches. This topic seeks to improve the efficiency and cost-effectiveness of labeling overhead imagery for rare or challenging OOIs.
Potential methods of interest include, but are not limited to, advances in machine-assisted labeling; improved tooling and human-labeling workflows; and automated search and alignment of additional imagery that may yield additional OOI examples.

PHASE I: Demonstrate a basic model-based labeling methodology to accelerate accurate label production on rare and otherwise constrained OOIs using the Microsoft COCO (JSON) metadata format. The combined human and tool interactions should be optimized for efficiency and production throughput with minimal human labor, and should employ innovative approaches to quality control for sparse datasets. Conduct verification of the integrated prototype for quality assurance. Prepare the software system for installation on classified computer systems.

PHASE II: Develop a labeling system that operates on rare and challenging OOIs in panchromatic satellite imagery to streamline the creation of labeled datasets in the NGA GeoCOCO metadata format, an extension of the Microsoft COCO format. The approach should significantly decrease the time required by conventional manual labeling methods without significantly increasing mislabeled or missed OOIs. Establish metrics that capture annotation time and quality. Collaborate with other performers to resolve unanticipated challenges and to deploy the system on a classified network for mission use. Once deployed, conduct verification and validation of the integrated product for quality assurance. Perform practical demonstrations of dataset creation for OOIs specified by NGA. Deliver two separate documents: one consisting of lessons learned, and the other detailing the technical architecture and functionality. Performers must possess TS/SCI clearances and conduct work at that level.

PHASE III DUAL USE APPLICATIONS: Enhance the methodology and tooling from Phase II to reduce human-in-the-loop interactions to the greatest extent possible.
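Phase I requires deliverables in the Microsoft COCO (JSON) metadata format. As a point of reference, a minimal COCO object-detection file pairs `images`, `categories`, and `annotations` arrays; the sketch below uses the public COCO field names, with a placeholder image tile, category name, and bounding box that are purely illustrative.

```python
import json

# Minimal COCO-format dataset: one image, one category, one box annotation.
# Field names follow the public COCO object-detection spec; the file name,
# category, and bbox values are illustrative placeholders, not NGA data.
dataset = {
    "images": [
        {"id": 1, "file_name": "tile_0001.png", "width": 1024, "height": 1024}
    ],
    "categories": [
        {"id": 1, "name": "rare_ooi", "supercategory": "object"}
    ],
    "annotations": [
        {
            "id": 1,
            "image_id": 1,
            "category_id": 1,
            "bbox": [412.0, 387.0, 56.0, 41.0],  # [x, y, width, height] in pixels
            "area": 56.0 * 41.0,
            "iscrowd": 0,
        }
    ],
}

# Serialize exactly as a labeling tool would write annotations.json.
coco_json = json.dumps(dataset, indent=2)
```

GeoCOCO extends this schema with geospatial metadata; the extension fields are not reproduced here.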
Collaborate with NGA mission partners to apply the labeling system to mission problem sets. Capture all metrics, lessons learned, and opportunities for improvement in a delivered report, including overall system performance and data quality metrics.

REFERENCES:
1. Sager, C., Janiesch, C., & Zschech, P. A survey of image labelling for computer vision applications. Journal of Business Analytics 4(2), 1-20; doi:10.1080/2573234X.2021.1908861 (2021).
2. Haider, T. & Michahelles, F. Human-machine collaboration on data annotation of images by semi-automatic labeling. In Proceedings of Mensch und Computer 2021 (MuC '21), 552-556; doi:10.1145/3473856.3473993 (2021).

KEYWORDS: human-machine teaming; human-computer interaction; image annotation; image labeling; semi-automated image labeling; computer vision.
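Phase II asks performers to establish metrics capturing annotation time and quality. One plausible instantiation, sketched below under assumptions not stated in the topic, scores produced boxes against a small adjudicated gold set using a greedy IoU >= 0.5 match (a common convention) and reports precision, recall, and labeling throughput.

```python
def iou(a, b):
    """Intersection-over-union of two [x, y, w, h] boxes."""
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(a[0], b[0]))
    ih = max(0.0, min(ay2, by2) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0

def annotation_metrics(produced, gold, hours, iou_thresh=0.5):
    """Precision/recall of produced boxes vs. a gold set, plus throughput.

    Greedy one-to-one matching at the given IoU threshold; the threshold
    and the labeler-hours accounting are illustrative assumptions.
    """
    matched, tp = set(), 0
    for box in produced:
        for i, g in enumerate(gold):
            if i not in matched and iou(box, g) >= iou_thresh:
                matched.add(i)
                tp += 1
                break
    return {
        "precision": tp / len(produced) if produced else 0.0,
        "recall": tp / len(gold) if gold else 0.0,
        "per_hour": len(produced) / hours,  # annotation throughput
    }

# Toy example: three produced boxes, two gold boxes, half an hour of labeling.
produced = [[10, 10, 20, 20], [100, 100, 30, 30], [300, 300, 10, 10]]
gold = [[12, 11, 20, 20], [100, 102, 30, 28]]
metrics = annotation_metrics(produced, gold, hours=0.5)
```

For sparse OOIs, recall against the gold set is usually the binding constraint, since a missed rare object cannot be recovered downstream.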

Overview

Response Deadline
May 21, 2025
Posted
April 3, 2025
Open
April 3, 2025
Set Aside
Small Business (SBA)
Place of Performance
Not Provided

Program
SBIR Phase I / II
Structure
Contract
Phase Detail
Phase I: Establish the technical merit, feasibility, and commercial potential of the proposed R/R&D efforts and determine the quality of performance of the small business awardee organization.
Phase II: Continue the R/R&D efforts initiated in Phase I. Funding is based on the results achieved in Phase I and the scientific and technical merit and commercial potential of the project proposed in Phase II. Typically, only Phase I awardees are eligible for a Phase II award.
Duration
6 Months - 1 Year
Size Limit
500 Employees
On 4/3/25, the Office of the Secretary of Defense issued SBIR / STTR Topic OSD252-D02, Image Labeling for Sparse and Rare Targets in Satellite Imagery for Computer Vision Applications, due 5/21/25.

Documents

Posted documents for SBIR / STTR Topic OSD252-D02

Question & Answer

Contract Awards

Prime contracts awarded through SBIR / STTR Topic OSD252-D02

Incumbent or Similar Awards

Potential Bidders and Partners

Awardees that have won contracts similar to SBIR / STTR Topic OSD252-D02

Similar Active Opportunities

Open contract opportunities similar to SBIR / STTR Topic OSD252-D02