
2335553

Project Grant

Overview

Grant Description
STTR Phase I: Weed Control via Terradynamically Robust Robots. This Small Business Technology Transfer (STTR) Phase I project develops a robotic platform that can provide automated weed control throughout crop development stages. In recent years, weed control costs have been growing due to the rise of herbicide-resistant weeds and increasing agricultural labor costs.

Additionally, increased demand for fruits and vegetables leaves specialty crop farmers struggling to find options that increase productivity while keeping expenses manageable. Several companies offer automated weed control in vegetables; however, these large platforms struggle to avoid damaging fruit in berry orchards.

This project aims to develop swarms of robotic devices that can operate underneath the plant canopy to provide mechanical weed control for berries throughout the year without impacting plant growth. This technological development will enable domestic fruit production to meet the growing consumer demand and allow for less chemical use in fruit production, reducing herbicide-associated health risks to farm workers and consumers.

Long term, these devices can augment weed control strategies in other crops and perform different agricultural tasks such as fungicide treatments and plant health monitoring, with the goal of automating agriculture to be more efficient and sustainable. This small business technology transfer project aims to develop rugged, low-to-the-ground, multi-legged robots that can locomote in various agricultural fields.

This technology builds on recent work demonstrating the effectiveness of centipedes and centipede-like robots traveling over diverse terrains. When properly coordinated, these mechanically redundant legged systems demonstrate robust locomotion in complex terrain without the need for sensory feedback. This project will leverage this platform and perform systematic robot experimentation and theoretical modeling to develop coordination schemes for various maneuvers in agricultural terrain analogues.
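As a rough illustration of the open-loop coordination idea described above (not the project's actual controller), the sketch below drives a hypothetical multi-legged robot with sinusoidal joint commands offset by a fixed phase lag between neighboring legs, producing a traveling body wave with no sensory feedback. The leg count, frequency, and amplitude are assumed values.

```python
# Hypothetical sketch of an open-loop metachronal-wave gait; the leg count,
# stepping frequency, and amplitude are placeholder assumptions.
import math

N_LEGS = 10                        # assumed number of leg pairs
STEP_FREQ_HZ = 1.5                 # assumed stepping frequency
PHASE_LAG = 2 * math.pi / N_LEGS   # phase offset between adjacent legs


def leg_commands(t: float, amplitude_rad: float = 0.4) -> list[float]:
    """Return swing-joint angle targets for all legs at time t (seconds).

    Neighboring legs are offset by PHASE_LAG, producing a traveling wave
    along the body; no sensor feedback enters the computation, mirroring
    the open-loop coordination the abstract describes.
    """
    omega = 2 * math.pi * STEP_FREQ_HZ
    return [amplitude_rad * math.sin(omega * t - i * PHASE_LAG) for i in range(N_LEGS)]


if __name__ == "__main__":
    # Print one snapshot of commanded joint angles at t = 0.25 s.
    print([round(a, 3) for a in leg_commands(0.25)])
```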

These strategies will then be implemented on a hardened robot to reliably locomote beneath the canopy in crop fields and identify weeds using an onboard camera and computer vision techniques. This device will make use of low-cost components and principles of mobility in complex environments to deliver guaranteed locomotion in these unpredictable terrains.
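The abstract does not specify the computer vision approach. As one low-cost possibility consistent with the emphasis on inexpensive components, the sketch below segments vegetation in a camera frame with a classical excess-green (ExG) index using OpenCV and NumPy; a real system would still need a learned classifier to separate crop from weed within the mask. The file name and threshold are placeholders.

```python
# Hypothetical sketch of a low-cost vegetation mask (ExG = 2g - r - b) as a
# stand-in for the onboard weed-identification step; not the awardee's pipeline.
import cv2
import numpy as np


def vegetation_mask(bgr: np.ndarray, thresh: float = 0.1) -> np.ndarray:
    """Return a binary mask of likely vegetation from a BGR camera frame."""
    img = bgr.astype(np.float32) / 255.0
    b, g, r = cv2.split(img)
    total = b + g + r + 1e-6                       # avoid division by zero
    exg = 2 * (g / total) - (r / total) - (b / total)
    return (exg > thresh).astype(np.uint8) * 255


if __name__ == "__main__":
    frame = cv2.imread("field_frame.jpg")          # placeholder image path
    if frame is not None:
        cv2.imwrite("vegetation_mask.png", vegetation_mask(frame))
```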

Eventually, swarms of these devices will be deployed on various crop fields to provide autonomous weed management throughout the growing season, lowering costs for both farmers and consumers. This project will result in a robust robotic platform that provides cheap, reliable, all-terrain locomotion, and such a device can extend beyond agriculture to other U.S. sectors such as search-and-rescue and defense. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

- Subawards are planned for this award.
Funding Goals
The goal of this funding opportunity, "NSF Small Business Innovation Research (SBIR)/Small Business Technology Transfer (STTR) Programs Phase I", is identified in the link: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf23515
Place of Performance
Atlanta, Georgia 30328-2784 United States
Geographic Scope
Single Zip Code
Analysis Notes
Amendment: Since the initial award, the end date has been extended from 01/31/25 to 09/30/25.
Ground Control Robotics was awarded Project Grant 2335553 worth $275,000 from NSF in February 2024, with work to be completed primarily in Atlanta, Georgia, United States. The grant has a duration of 1 year 7 months and was awarded through assistance program 47.084 NSF Technology, Innovation, and Partnerships. The Project Grant was awarded through grant opportunity NSF Small Business Innovation Research / Small Business Technology Transfer Phase I Programs.

SBIR Details

Research Type
STTR Phase I
Title
STTR Phase I: Semantically-Enabled Augmented Reality for Manufacturing
Abstract
This Small Business Technology Transfer (STTR) Phase I project facilitates safer and more efficient human-centered manufacturing tasks. The introduction of context-sensitive work guidance through immersive technologies will expedite workforce training, enhance users' spatial awareness, and outperform existing manufacturing work instruction systems, leading to heightened productivity across industries. This development embodies the emergence of cyber-human relationships and Digital Twin and Smart Factory applications, reinforcing U.S. manufacturing leadership, bolstering economic competitiveness, and fortifying national security. The anticipated commercial Platform-as-a-Service (PaaS) solution is poised to benefit approximately 10,000 U.S. manufacturing firms. Beyond its economic implications, the first-generation, open-specification Reality Modeling Language (RML) developed in this project is expected to gain widespread acceptance in the international standards community, improving spatial system automation across diverse industry verticals. Ultimately, this system will render the physical world more accessible, searchable, and comprehensively annotated with data, unlocking new frontiers in user support, safety, and efficiency.

This STTR Phase I project addresses mission-critical challenges for fully leveraging Augmented Reality (AR) tools in manufacturing environments. It draws upon ontologically structured data and a proprietary Artificial Intelligence (AI)-driven knowledge system to automate the generation and display of context-specific AR content in 3D space, eliminating the need for individually designed AR interactions. The solution enables training and work instruction systems to become spatially and contextually aware, adapting to dynamic conditions that affect worker safety and efficiency. The objective of this project is to demonstrate and quantify how automatically generated, spatially and semantically aware AR can provide work guidance, machine status data, and hazard warnings that increase worker capabilities compared with conventional guidance tools. The RML will be derived to logically describe and computationally encode the 3D spatial scene of a simulated factory floor; later, RML will be released as an open code library to the developer community. The system will sense the real world and its objects in real time, learn as input is received, and prioritize and render AR content communicating context-specific suggestions and warnings. This project will demonstrate integration between workers, their environment, and the tools they use to complete their tasks, so production personnel can act confidently, safely, and effectively. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
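The abstract does not publish the RML specification or the knowledge-system design. Purely as a hedged illustration of how ontologically tagged scene data could drive automatically generated AR content (guidance, machine status, hazard warnings) rather than hand-authored AR interactions, the toy sketch below derives annotations from object types and states; every class name, field, and threshold here is hypothetical.

```python
# Hypothetical toy model, not the RML specification: semantic scene objects
# whose types and states generate context-specific AR annotations.
from dataclasses import dataclass, field


@dataclass
class SceneObject:
    name: str
    ontology_type: str                 # e.g. "Machine", "Tool", "Worker"
    position_m: tuple                  # (x, y, z) in a shared spatial frame
    state: dict = field(default_factory=dict)


def generate_annotations(scene: list[SceneObject]) -> list[str]:
    """Derive AR annotations from object semantics instead of
    individually designed, per-object AR content."""
    notes = []
    for obj in scene:
        if obj.ontology_type == "Machine":
            if obj.state.get("temperature_c", 0) > 80:
                notes.append(f"HAZARD: {obj.name} is hot; keep clear.")
            notes.append(f"STATUS: {obj.name} is {obj.state.get('mode', 'unknown')}.")
        if obj.ontology_type == "Tool" and obj.state.get("required_for_step"):
            notes.append(f"GUIDE: pick up {obj.name} for step {obj.state['required_for_step']}.")
    return notes


if __name__ == "__main__":
    scene = [
        SceneObject("CNC-3", "Machine", (2.0, 0.5, 0.0), {"temperature_c": 92, "mode": "running"}),
        SceneObject("Torque wrench", "Tool", (1.1, 0.2, 0.9), {"required_for_step": 4}),
    ]
    for line in generate_annotations(scene):
        print(line)
```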
Topic Code
M
Solicitation Number
NSF 23-515

Status
(Ongoing)

Last Modified 6/20/25

Period of Performance
2/15/24
Start Date
9/30/25
End Date
93.0% Complete

Funding Split
$275.0K
Federal Obligation
$0.0
Non-Federal Obligation
$275.0K
Total Obligated
100.0% Federal Funding
0.0% Non-Federal Funding


Additional Detail

Award ID FAIN
2335553
SAI Number
None
Award ID URI
SAI EXEMPT
Awardee Classifications
Small Business
Awarding Office
491503 TRANSLATIONAL IMPACTS
Funding Office
491503 TRANSLATIONAL IMPACTS
Awardee UEI
YB8UD9JN3PG1
Awardee CAGE
9BQR3
Performance District
GA-07
Senators
Jon Ossoff
Raphael Warnock