
2335511

Cooperative Agreement

Overview

Grant Description
SBIR Phase II: Low Latency and Ultra-High Quality 360 Video Streaming Platform for Highly Immersive VR Experiences. The broader/commercial impact of this SBIR Phase II project is the first video streaming solution providing ultra-high-quality video with low latency for virtual/augmented/extended reality (VR/AR/XR) applications. Such applications are revolutionizing innumerable fields, with a projected market volume of $10.3 billion in 2024 in the U.S.

Yet, in practice, the potential of VR/AR/XR remains untapped due to a key limitation of existing video streaming solutions: high-quality video entails latencies of 30-50 seconds, while low latency is only achieved with low-quality video. Both factors matter in VR/AR/XR, since high video quality is essential to generate a truly immersive experience for users, and low latency is essential to allow for live virtual interaction.
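For a rough sense of where such delays come from, the sketch below estimates glass-to-glass latency for conventional segment-based streaming (HLS/DASH-style). The segment durations, buffer depths, and encode/CDN delays are illustrative assumptions, not figures from the award.

```python
# Rough, illustrative estimate of glass-to-glass latency in conventional
# segment-based streaming. All numbers are assumptions chosen for
# illustration, not figures from the award text.

def glass_to_glass_latency(segment_s, buffered_segments, encode_s, cdn_s):
    """Approximate end-to-end latency in seconds."""
    # A segment can only be published once it is fully encoded, and players
    # typically buffer several segments before starting playback.
    return encode_s + cdn_s + segment_s + buffered_segments * segment_s

# Traditional configuration: 6 s segments, 3-segment player buffer.
print(glass_to_glass_latency(segment_s=6, buffered_segments=3, encode_s=2, cdn_s=1))      # -> 27.0 s
# Low-latency configuration: 1 s chunks, 1-chunk buffer.
print(glass_to_glass_latency(segment_s=1, buffered_segments=1, encode_s=0.5, cdn_s=0.5))  # -> 3.0 s
```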

The proposed technology promises to overcome this problem and unleash the full potential of VR/AR/XR, through advanced applications in education, business, healthcare, defense, leisure, and more. This proprietary technology is the result of years of R&D, making it difficult for competitors to match its superior performance. If successful, this technology will represent a key commercial milestone for the proposing company, with forecasted annual revenues of $74.5 million by year three of commercialization, through a business model based on pay-per-use fees.

This Small Business Innovation Research Phase II project seeks to develop a video streaming solution that can manage 360° videos with ultra-high quality (>8K at a minimum of 60 fps) at low latencies (<1 second), enabling seamless switching between multiple cameras and audio tracks. The trade-off between video quality and streaming latency presents the main barrier to truly immersive and interactive virtual/augmented/extended reality (VR/AR/XR) experiences.
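As a back-of-envelope illustration of why these targets are demanding, the sketch below estimates the raw and compressed throughput of a full 8K, 60 fps equirectangular stream. The resolution, bit depth, and compression ratio are assumptions chosen for illustration, not specifications from the award; the takeaway is that delivering the whole sphere at this quality is bandwidth-intensive, which is why viewport-adaptive or tiled delivery is commonly considered for this class of content.

```python
# Back-of-envelope throughput for a full 8K 360° sphere at 60 fps.
# Resolution, bit depth, and compression ratio below are illustrative
# assumptions, not specifications from the award.

width, height, fps = 7680, 3840, 60   # typical 8K equirectangular frame
bits_per_pixel = 12                   # 4:2:0 chroma subsampling, 8-bit

raw_bits_per_second = width * height * fps * bits_per_pixel
print(f"Uncompressed: {raw_bits_per_second / 1e9:.1f} Gbit/s")      # ~21.2 Gbit/s

# Even at an optimistic 300:1 compression ratio (HEVC/AV1-class),
# the whole sphere would still need on the order of 70 Mbit/s,
# hence the interest in spending bits only on the viewer's current viewport.
compressed = raw_bits_per_second / 300
print(f"Compressed (~300:1): {compressed / 1e6:.0f} Mbit/s")
```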

Thus, this project's research objectives are focused on: 1) extending the existing Android solution to iOS devices; 2) low-latency composed experiences, supporting synchronized content from multiple cameras; 3) supporting multiple audio tracks within one stream; 4) platform validation, benchmarking various cloud service instances; 5) documenting the workflow and establishing a procedure for third-party operators to deploy the end-to-end system; 6) determining graphics processing units (GPUs) that meet the system's performance requirements; and 7) extending the platform to new-generation XR devices. By the end of Phase II, the goal is to provide a platform that enables high-resolution 360° video streaming with latencies of less than two seconds, demonstrating sufficient maturity in each of its components, and to have clearly defined the hardware specifications required to provide optimal AR/VR/XR experiences.
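As a generic illustration of what objective 2 (synchronized content from multiple cameras) can involve, the sketch below groups frames from several cameras onto a shared clock within a small tolerance. This is a hypothetical example of a standard alignment technique, not a description of the awardee's actual system; the Frame type and tolerance value are assumed for illustration.

```python
# Minimal sketch: align frames from multiple cameras on a shared timeline
# (e.g., a wall clock distributed via NTP/PTP). Generic illustration only.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Frame:
    camera_id: str
    capture_ts: float   # capture time in seconds on the shared clock

def group_synchronized_frames(frames: List[Frame],
                              tolerance: float = 1 / 120) -> List[Dict[str, Frame]]:
    """Group frames whose capture timestamps fall within `tolerance` seconds.

    Each group can then be presented, or switched between, as a single
    synchronized instant across all cameras.
    """
    groups: List[Dict[str, Frame]] = []
    for frame in sorted(frames, key=lambda f: f.capture_ts):
        if groups:
            anchor = next(iter(groups[-1].values()))
            # Join the latest group if this camera is not already in it
            # and the timestamp is close enough to the group's first frame.
            if (frame.camera_id not in groups[-1]
                    and abs(frame.capture_ts - anchor.capture_ts) <= tolerance):
                groups[-1][frame.camera_id] = frame
                continue
        groups.append({frame.camera_id: frame})
    return groups

frames = [Frame("cam_a", 0.000), Frame("cam_b", 0.004), Frame("cam_a", 0.0167)]
print(group_synchronized_frames(frames))  # cam_a/cam_b grouped; next cam_a frame starts a new group
```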

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria. Subawards are not planned for this award.
Awardee
Yerba Buena VR
Funding Goals
The goal of this funding opportunity, "NSF Small Business Innovation Research Phase II (SBIR) / Small Business Technology Transfer (STTR) Programs Phase II," is identified at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf23516
Awarding / Funding Agency
National Science Foundation (NSF)
Place of Performance
San Jose, California 95121-1052 United States
Geographic Scope
Single Zip Code
Yerba Buena VR was awarded Cooperative Agreement 2335511, worth $998,251, from the National Science Foundation in May 2024, with work to be completed primarily in San Jose, California, United States. The grant has a duration of two years and was awarded through assistance program 47.084, NSF Technology, Innovation, and Partnerships. The Cooperative Agreement was awarded through the grant opportunity NSF Small Business Innovation Research / Small Business Technology Transfer Phase II Programs (SBIR/STTR Phase II).

SBIR Details

Research Type
SBIR Phase II
Title
SBIR Phase II: Low latency and ultra-high quality 360 video streaming platform for highly immersive VR experiences
Abstract
The broader/commercial impact of this SBIR Phase II project is the first video streaming solution providing ultra-high-quality video with low latency for virtual/augmented/extended reality (VR/AR/XR) applications. Such applications are revolutionizing innumerable fields, with a projected market volume of $10.3 billion in 2024 in the U.S.

Yet, in practice, the potential of VR/AR/XR remains untapped due to a key limitation of existing video streaming solutions: high-quality video entails latencies of 30-50 seconds, while low latency is only achieved with low-quality video. Both factors matter in VR/AR/XR, since high video quality is essential to generate a truly immersive experience for users, and low latency is essential to allow for live virtual interaction.

The proposed technology promises to overcome this problem and unleash the full potential of VR/AR/XR, through advanced applications in education, business, healthcare, defense, leisure, and more. This proprietary technology is the result of years of R&D, making it difficult for competitors to match its superior performance. If successful, this technology will represent a key commercial milestone for the proposing company, with forecasted annual revenues of $74.5 million by year three of commercialization, through a business model based on pay-per-use fees.

This Small Business Innovation Research Phase II project seeks to develop a video streaming solution that can manage 360° videos with ultra-high quality (>8K at a minimum of 60 fps) at low latencies (<1 second), enabling seamless switching between multiple cameras and audio tracks. The trade-off between video quality and streaming latency presents the main barrier to truly immersive and interactive VR/AR/XR experiences.

Thus, this project's research objectives are focused on: 1) extending the existing Android solution to iOS devices; 2) low-latency composed experiences, supporting synchronized content from multiple cameras; 3) supporting multiple audio tracks within one stream; 4) platform validation, benchmarking various cloud service instances; 5) documenting the workflow and establishing a procedure for third-party operators to deploy the end-to-end system; 6) determining graphics processing units (GPUs) that meet the system's performance requirements; and 7) extending the platform to new-generation XR devices. By the end of Phase II, the goal is to provide a platform that enables high-resolution 360° video streaming with latencies of less than two seconds, demonstrating sufficient maturity in each of its components, and to have clearly defined the hardware specifications required to provide optimal AR/VR/XR experiences.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
Topic Code
AI
Solicitation Number
NSF 23-516

Status
(Ongoing)

Last Modified 5/6/24

Period of Performance
Start Date
5/1/24
End Date
4/30/26
71.0% Complete

Funding Split
Federal Obligation
$998.3K
Non-Federal Obligation
$0.0
Total Obligated
$998.3K
100.0% Federal Funding
0.0% Non-Federal Funding

Additional Detail

Award ID (FAIN)
2335511
SAI Number
None
Award ID URI
SAI EXEMPT
Awardee Classifications
Small Business
Awarding Office
491503 TRANSLATIONAL IMPACTS
Funding Office
491503 TRANSLATIONAL IMPACTS
Awardee UEI
YLDSMNLQMR19
Awardee CAGE
93GK4
Performance District
CA-19
Senators
Dianne Feinstein
Alejandro Padilla