
Mixed Reality Point Cloud Manipulation

ID: N251-033 • Type: SBIR / STTR Topic

Description

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Sustainment

OBJECTIVE: Develop a capability to visualize and modify 3-D point cloud models generated by Light Detection and Ranging (LiDAR) and photogrammetry using mixed reality hardware, improving the ability of engineers and technicians to perform virtual ship checks in support of design, installation, and modernization, and to deliver ships on time at lower cost.

DESCRIPTION: Program Executive Offices (PEOs), shipyards, Original Equipment Manufacturers (OEMs), Alteration Installation Teams (AITs), Regional Maintenance Centers (RMCs), and others perform countless ship checks and inspections throughout a ship's lifecycle. Investments are currently being made in creating dimensional digital twins with LiDAR, photogrammetry, and other 3-D scanning technologies. These technologies have proven invaluable for generating 3-D models that aid various maintenance and sustainment functions throughout an asset's lifecycle, but the Navy does not have an effective environment for visualizing and collaboratively reviewing ship models. Today, 3-D model generators and consumers visit ships, submarines, or other physical objects of interest; 3-D scan the physical asset using LiDAR or photogrammetry; generate a 3-D data model with point cloud software; and then view that 3-D model in a 2-D environment (typically a computer monitor) to support future 3-D work such as installation and modernization. This approach limits user performance and fidelity relative to what fully 3-D models offer, reducing the effectiveness of the technology. Immersive, natively 3-D environments such as augmented reality (AR), virtual reality (VR), and holographic displays allow users to experience 3-D models in their native dimensions, exploring and visualizing structures and components in a familiar, lifelike environment.
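The scan-to-model pipeline described above ends with a point cloud file that downstream tools must ingest. As a concrete reference point, the widely used ASCII PLY format stores such a cloud as a short header followed by one vertex per line; a minimal reader might look like the sketch below (illustrative only, not any vendor's actual ingest path, and limited to files whose first three vertex properties are x, y, z):

```python
def read_ascii_ply(path):
    """Parse a minimal ASCII PLY point cloud into a list of (x, y, z) tuples.

    Handles only 'format ascii 1.0' files whose first three vertex
    properties are x, y, z -- enough to illustrate ingest, not a full parser.
    """
    points = []
    with open(path) as f:
        n_vertices = 0
        for line in f:  # scan the header up to end_header
            tokens = line.split()
            if tokens[:2] == ["element", "vertex"]:
                n_vertices = int(tokens[2])
            elif tokens[:1] == ["end_header"]:
                break
        for _ in range(n_vertices):  # then one vertex per line
            x, y, z = map(float, next(f).split()[:3])
            points.append((x, y, z))
    return points
```

Real scans from LiDAR or photogrammetry typically carry extra per-point attributes (color, intensity, normals) and may use binary encodings, which is where the government-provided data specifications mentioned under Phase I would come in.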
This will allow naval architects, engineers, technicians, logisticians, shipyard workers, and others across the NAVSEA enterprise to gain significantly more value from 3-D models, with the ability to collaborate in real time as if physically visiting the ship as a team. While specific use cases differ, the general improvements to visualization concern scale, proportions, spatial relationships, interferences, and overlays of technical data and annotations from previous inspection and work crews. All of these factors will be invaluable to maintenance planning and coordination. A direct return on investment will come from improved detection and resolution of physical interferences, design flaws or conflicts, physical damage to equipment or platforms, and other material-condition issues, relative to traditional 2-D renderings on computer screens. Finally, mixed reality will enable collaborative touring, viewing, diagnosis, and resolution of the aforementioned issues, helping diverse teams resolve challenges significantly faster; currently, however, these tools are not yet mature enough for wide adoption. To improve the application, execution, and use of 3-D scanning technologies for shipyard applications, NAVSEA would greatly benefit from research, development, and transition of software tools that allow the exploration of models in full 3-D views.

This concept of employment is directly applicable to two primary user communities for design purposes: A) Ship-level inspections, issue documentation, and tagging, which occur on the deck plates of ships and are reviewed by both local and distributed engineering teams. Teams inspect equipment for work and maintenance discrepancies (paint issues, corrosion, loose nuts, bolts, fittings, et al.), which must be annotated, documented, and reported via Navy IT systems.
In a 3-D environment, those annotations can be made directly in the model to better correlate issue status with the specific physical location and piece of equipment of concern, and models can then be shared across multiple teams to maintain a single operations and maintenance picture. B) Long-term (multi-year) and short-term (single-year) modernization planning and design work, which occurs at the shipyard, at contractor offices, or at distributed Navy engineering laboratories. Engineers, architects, and technicians take existing 3-D models and drawings, import CAD models for future installations and redesigns, look for interferences and poor condition of existing structures and materials, and annotate corrections to be performed by other teams. A collaborative environment in which these models can be viewed and toured by diverse teams to rapidly resolve issues is critical, as is the ability to compare as-designed drawings to as-built and current-condition models and to take measurements inside those models.

PHASE I: Provide detailed workflows for ingesting 3-D point clouds into vendor software and hardware. Demonstrate similar capability using contractor-provided data to assess feasibility. To support this, the government will provide detailed requirements for interaction functionality, along with data specifications and standards for government models (provided at contract award). The Phase I Option, if exercised, will include the initial design specifications, a capabilities description, a preliminary timetable, and a budget to build a scaled prototype solution in Phase II.

PHASE II: Demonstrate the ability to ingest, manipulate, and mark up 3-D models of Navy-representative ships generated by the government, with annotations that can be shared across teammates. Develop a full-scale prototype and complete a successful demonstration of the prototype's capabilities.
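The annotation workflow described in the two use cases above — tagging an issue at a specific 3-D location so local and distributed teams share one operations and maintenance picture — can be sketched with a minimal, hypothetical data model. The class and field names below are illustrative, not drawn from any Navy IT system or solicitation document:

```python
from dataclasses import dataclass, field
from math import dist  # Euclidean distance between two points (Python 3.8+)


@dataclass
class Annotation:
    """One issue tagged at a point in the scanned model (illustrative schema)."""
    position: tuple[float, float, float]  # meters, in the point cloud's frame
    category: str                         # e.g. "corrosion", "loose fitting"
    note: str
    author: str


@dataclass
class ShipCheckModel:
    """A shared container of annotations keyed to one point-cloud scan."""
    scan_id: str
    annotations: list[Annotation] = field(default_factory=list)

    def tag(self, position, category, note, author):
        self.annotations.append(Annotation(position, category, note, author))

    def near(self, point, radius):
        """Annotations within `radius` meters of `point` -- e.g. what a
        reviewer touring the model should see at their current location."""
        return [a for a in self.annotations if dist(a.position, point) <= radius]


# Two inspectors tag issues on the same scan; a remote reviewer queries them.
model = ShipCheckModel("demo-scan-001")
model.tag((12.0, 3.5, 2.1), "corrosion", "Pitting on bulkhead frame", "tech_a")
model.tag((12.4, 3.6, 2.0), "loose fitting", "Bolt missing lock washer", "tech_b")
print(len(model.near((12.2, 3.5, 2.0), radius=1.0)))  # prints 2: both issues nearby
```

Anchoring each record to a coordinate in the scan's own frame is what lets a status update made on the deck plates appear at the same physical location for a reviewer touring the model remotely.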
PHASE III DUAL USE APPLICATIONS: Assist the Navy in transitioning this technology, in the form of a fully operational system premised on the Phase II prototype, to government use, initially on DDG 51 class ships. The final product delivered at the end of Phase III will be an integrated hardware and software solution usable by any industry, academic, or government engineering or operations team that can benefit from collaboration in 3-D space. This includes operations planning, construction and construction management, surveying, and any other use case with similar requirements.

REFERENCES:
1. Wirth, Florian et al. "PointAtMe: Efficient 3D Point Cloud Labeling in Virtual Reality." 2019 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2019.
2. Alexiou, Evangelos; Yang, Nanyang and Ebrahimi, Touradj. "PointXR: A Toolbox for Visualization and Subjective Evaluation of Point Clouds in Virtual Reality." 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX). IEEE, 2020.
3. Garrido, Daniel et al. "Point Cloud Interaction and Manipulation in Virtual Reality." 2021 5th International Conference on Artificial Intelligence and Virtual Reality (AIVR). 2021.
4. Stets, Jonathan Dyssel et al. "Visualization and Labeling of Point Clouds in Virtual Reality." SIGGRAPH Asia 2017 Posters, Article No. 31, pp. 1-2.
5. Maloca, Peter M. et al. "High-Performance Virtual Reality Volume Rendering of Original Optical Coherence Tomography Point-Cloud Data Enhanced with Real-Time Ray Casting." Translational Vision Science & Technology, Vol. 7, No. 2, 2018.

KEYWORDS: LiDAR; Photogrammetry; Point-Cloud; Mixed-Reality; Annotation; Virtual Ship Check

Overview

Response Deadline
Feb. 5, 2025
Posted
Dec. 4, 2024
Open
Dec. 4, 2024
Set Aside
Small Business (SBA)
Place of Performance
Not Provided

Program
SBIR Phase I / II
Structure
Contract
Phase Detail
Phase I: Establish the technical merit, feasibility, and commercial potential of the proposed R/R&D efforts and determine the quality of performance of the small business awardee organization.
Phase II: Continue the R/R&D efforts initiated in Phase I. Funding is based on the results achieved in Phase I and the scientific and technical merit and commercial potential of the project proposed in Phase II. Typically, only Phase I awardees are eligible for a Phase II award.
Duration
6 Months - 1 Year
Size Limit
500 Employees
On 12/4/24, the Department of the Navy issued SBIR / STTR Topic N251-033 for Mixed Reality Point Cloud Manipulation, due 2/5/25.

Documents

Posted documents for SBIR / STTR Topic N251-033

Question & Answer


Contract Awards

Prime contracts awarded through SBIR / STTR Topic N251-033

Incumbent or Similar Awards

Potential Bidders and Partners

Awardees that have won contracts similar to SBIR / STTR Topic N251-033

Similar Active Opportunities

Open contract opportunities similar to SBIR / STTR Topic N251-033