
2242216

Cooperative Agreement

Overview

Grant Description
SBIR Phase II: Epipolar-Plane Imaging for Robot 3D Vision. The broader/commercial impact of this Small Business Innovation Research (SBIR) Phase II project is to improve robotic interactions with humans.

Currently, robots are involved in large sectors of society, including logistics, manufacturing, autonomous navigation, video communication, remote supervision of complex mechanical maintenance/repair tasks, support in battlefields and disasters, and interactions in various training, educational, and interventional scenarios, including telemedicine.

This technology may offer more effective automation in the workplace through higher-quality 3D sensing and more precise visualization, along with increased worker quality of life.

The technology addresses the precision and reliability of passive 3D scene measurements.

This Small Business Innovation Research (SBIR) Phase II project addresses the acquisition of reliable and precise three-dimensional representations of a scene from passively acquired image data for use in navigation, grasping, manipulation, and other operations of autonomous systems in unrestricted three-dimensional spaces.

This capability has been a long-standing challenge in the computer vision field, with many efforts providing adequate solutions under certain conditions but lacking applicability across a breadth of applications.

Other approaches typically deliver inaccurate results where there are, for example, repeated structures in the view, thin features, a large range in depth, or where structures align with aspects of the capture geometry.

Because they rely on matching features across images, current technologies fail when distinct features have similar appearance.
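
As a toy illustration of that ambiguity (a hypothetical sketch, not code from this project), consider matching a small patch along a scanline that contains two identical copies of the same texture: the matching cost is minimal at both locations, so appearance alone cannot decide the correspondence or the depth.

```python
# Hypothetical illustration: correspondence matching is ambiguous when a
# scene contains repeated, similar-looking features.
import numpy as np

pattern = np.array([0.0, 1.0, 0.0, -1.0, 0.0])
scanline = np.zeros(40)
scanline[5:10] = pattern      # first copy of the feature
scanline[25:30] = pattern     # second, identical copy

patch = pattern.copy()        # the same feature as seen in another view

# Sum-of-squared-differences cost at every candidate position.
costs = np.array([np.sum((scanline[i:i + len(patch)] - patch) ** 2)
                  for i in range(len(scanline) - len(patch) + 1)])

best = np.argsort(costs)[:2]
print("best matches:", best, "costs:", costs[best])
# Positions 5 and 25 both match perfectly (cost 0), so the disparity, and
# hence the depth, cannot be decided from appearance alone.
```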

This technology removes the uncertainty of that process through low-cost over-sampling, using a specific set of additional perspectives to replace the "matching" with deterministic linear filtering.
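
One way to see what "replacing matching with linear filtering" can mean in practice is epipolar-plane-image (EPI) analysis: with a dense set of views along one baseline, each scene point traces a straight line through the stack of corresponding image rows, and the line's slope is the point's disparity. The sketch below estimates that slope from smoothed gradient products, i.e. fixed linear filters with no correspondence search. It is a minimal illustration of the general approach under these assumptions, not the awardee's actual method.

```python
# Minimal sketch of depth-from-slope in an epipolar-plane image (EPI),
# assuming many closely spaced views along a single horizontal baseline.
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def epi_disparity(epi, sigma=1.5, eps=1e-8):
    """Per-pixel disparity from an EPI shaped (n_views, width)."""
    g_s = sobel(epi, axis=0, mode="nearest")   # derivative along the view axis
    g_u = sobel(epi, axis=1, mode="nearest")   # derivative along the image axis
    # Solving d*I_u + I_s = 0 in least squares over a local window gives
    # d = -<I_u*I_s> / <I_u*I_u>: pure linear filtering, no feature matching.
    j_us = gaussian_filter(g_u * g_s, sigma)
    j_uu = gaussian_filter(g_u * g_u, sigma)
    return -j_us / (j_uu + eps)

# Synthetic check: a texture shifting by 0.8 px per view should give d ~ 0.8.
views, width, true_d = 15, 200, 0.8
u = np.arange(width)
epi = np.stack([np.sin(0.3 * (u - true_d * s)) for s in range(views)])
d = epi_disparity(epi)
print(round(float(np.median(d[5:-5, 20:-20])), 1))   # ~ 0.8
```

Robust variants of this idea (for example, structure-tensor orientation estimates over the EPI) follow the same pattern of fixed local filters rather than search-based matching.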

Increasing the reliability and precision of 3D scene measurements will open new opportunities for robotic interactions with the world.

Success in this project will advance the underlying light-field technology to broader application areas where human-in-the-loop operations using augmented reality/virtual reality (AR/VR) or mixed reality (such as remote collaboration and distance interaction) depend on accurate and responsive visualization and scene modeling, reducing the influence of vestibular and proprioceptive mismatch that can cause disruptive effects such as nausea.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

- Subawards are not planned for this award.
Awardee
Epiimaging
Funding Goals
The goal of this funding opportunity, "NSF Small Business Innovation Research Phase II (SBIR)/Small Business Technology Transfer (STTR) Programs Phase II", is identified in the link: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf22552
Awarding / Funding Agency
National Science Foundation (NSF)
Place of Performance
Los Altos, California 94024-3827 United States
Geographic Scope
Single Zip Code
Related Opportunity
22-552
Analysis Notes
Amendment: Since the initial award, total obligations have increased 2%, from $999,407 to $1,015,407.
Epiimaging was awarded Cooperative Agreement 2242216, worth $1,015,407, from the National Science Foundation in September 2023, with work to be performed primarily in Los Altos, California, United States. The grant has a duration of 2 years and was awarded through assistance program 47.084, NSF Technology, Innovation, and Partnerships.

SBIR Details

Research Type
SBIR Phase II
Title
SBIR Phase II: Epipolar-Plane Imaging for Robot 3D Vision
Topic Code
MO
Solicitation Number
NSF 22-552

Status
(Complete)

Last Modified 4/4/25

Period of Performance
Start Date
9/15/23
End Date
8/31/25
100% Complete

Funding Split
Federal Obligation
$1.0M
Non-Federal Obligation
$0.0
Total Obligated
$1.0M
100.0% Federal Funding
0.0% Non-Federal Funding

Activity Timeline

Interactive chart of timeline of amendments to 2242216

Transaction History

Modifications to 2242216

Additional Detail

Award ID FAIN
2242216
SAI Number
None
Award ID URI
SAI EXEMPT
Awardee Classifications
Small Business
Awarding Office
491503 TRANSLATIONAL IMPACTS
Funding Office
491503 TRANSLATIONAL IMPACTS
Awardee UEI
FMCEN5SWM8L3
Awardee CAGE
7R2L9
Performance District
CA-16
Senators
Dianne Feinstein
Alejandro Padilla

Budget Funding

Federal Account
Research and Related Activities, National Science Foundation (049-0100)
Budget Subfunction
General science and basic research
Object Class
Grants, subsidies, and contributions (41.0)
Total
$999,407
Percentage
100%
Modified: 4/4/25