Synthetic Vision System for Ground Forces

Type: SBIR • Topic: N171-091

Description

TECHNOLOGY AREA(S): Info Systems, Electronics, Human Systems
OBJECTIVE: This effort seeks to accelerate and enhance decision making for Marine Corps Ground Forces by developing a Synthetic Vision System (SVS) extension for head-mounted displays (HMDs) to provide training aids and situational awareness (SA) visualizations.
DESCRIPTION: As tactical decision making is pushed down to lower echelons, tools to support the development and operations of ground force small unit decision makers remain an open challenge [1]. For example, ground-based warfighters are required to make quick decisions about any number of situations encountered on the battlefield. To inform these decisions, warfighters must learn about these situations and the associated skills (e.g., call-for-fire training) and then access and process data during operations. User interfaces and data sources (e.g., tablets) that require taking eyes off training or operations limit the warfighter's ability to learn and respond to changing conditions. Head-mounted displays (HMDs), coupled with emerging Augmented Reality (AR) technologies [2], offer hands-free user interfaces that can provide training aids and situational awareness (SA) in contextual formats that could minimize cognitive load without losing sight of the battlefield.

An AR-based HMD for ground forces is conceptually similar to existing technology used by aviators. For example, Synthetic Vision Systems (SVS) have been shown to improve terrain awareness and to offer potential reductions in controlled-flight-into-terrain accidents over existing SA cockpit technologies. Notwithstanding those benefits, challenges remain with the use of synthetic vision displays in aviation, particularly in managing the allocation of attention [3,4]. Innovation is needed to take the lessons learned from aviation and apply them to the development of a ground-based AR HMD in a cost-effective (less than $1,500) and Infantry Marine-friendly configuration: unobtrusive and not frustrating for the end user to wear and operate. The focus of the proposed effort is on defining synthetic vision requirement specifications and functional prototypes for next-generation AR-based HMD technologies that provide operational training aids and SA decision support for ground-based forces. The effort seeks advancements in visualizations to mitigate attention and perception limitations (e.g., attention tunneling) that have potential adverse effects on cognitive load. Visualization designs and prototypes should address two types of display configurations: static and dynamic. Static information displays are persistent and change little regardless of context. Dynamic displays are non-persistent; the information displayed aligns with specific contexts and tasks. Resulting specifications and proofs of concept for more-advanced AR-based HMD technologies will contribute to improvements in SVS design guidelines and recommendations.
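The static/dynamic distinction described above can be sketched as a simple data model. The following Python sketch is purely illustrative (the type, field, and function names are assumptions, not part of the solicitation or any engine API): static overlay elements are always drawn, while dynamic elements appear only in the task contexts they are tagged with.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OverlayElement:
    """One piece of HMD overlay content."""
    name: str
    dynamic: bool = False              # static elements persist regardless of context
    contexts: frozenset = frozenset()  # contexts in which a dynamic element appears

def visible_elements(elements, active_context):
    """Static elements are always drawn; dynamic elements are drawn only
    when the wearer's current task context matches one they are tagged with."""
    return [e.name for e in elements
            if not e.dynamic or active_context in e.contexts]

# A hypothetical HUD mixing both configurations:
hud = [
    OverlayElement("compass_heading"),                      # static: always shown
    OverlayElement("range_ring", dynamic=True,
                   contexts=frozenset({"call_for_fire"})),  # dynamic: task-specific
]
```

Under this model, `visible_elements(hud, "patrol")` yields only the compass heading, while switching the context to `"call_for_fire"` adds the range ring, keeping task-specific clutter off the display the rest of the time.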

Proposals must describe how information visualizations will address psychological and cognitive principles [5] and provide AR examples regarding the representation of information [6]. Proposals need not develop a complete AR system [2], but must clearly describe how the proposed visualizations will be investigated and evaluated. All development and experiments should be done with simulation engines that have no or minimal licensing fees for development or run-time execution (e.g., Unity). Training and operational SA tools should focus on support for Marine Corps call-for-fire training and missions. Examples of information to be investigated and visualized include: user heading, bearing, range, target designation information (i.e., symbols, designation box, attack geometry, risk/area effect size such as range rings), airspace control measures (i.e., holding areas, battle positions, initial points), and fire support control measures (i.e., no-fire areas or restricted fire areas).
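As one illustration of the data underlying such visualizations, the range and bearing from the wearer to a designated target can be derived from geographic coordinates. The sketch below is an assumption for illustration only (the solicitation does not prescribe any formula or API); it uses the standard haversine formula for great-circle range and the forward-azimuth formula for initial bearing.

```python
import math

def range_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle range (meters) and initial bearing (degrees clockwise
    from true north) from an observer at (lat1, lon1) to a target at
    (lat2, lon2), with coordinates in decimal degrees."""
    R = 6371000.0  # mean Earth radius, meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)

    # Haversine formula for range
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    rng = 2 * R * math.asin(math.sqrt(a))

    # Forward azimuth (initial bearing), normalized to [0, 360)
    y = math.sin(dlmb) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb))
    brg = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return rng, brg
```

For example, one degree of longitude along the equator comes out to roughly 111 km due east (bearing 90°); values like these would feed the heading tape, target designation box, and range rings mentioned above.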
PHASE I: Define requirements and develop mock-ups and/or very early prototypes for advanced SVS information/data visualizations that enhance warfighter decision making and situational awareness as they relate to call-for-fire and close air support activities. Requirements definitions and mock-ups/prototypes must include: a description of the domain and tasks; a determination of the fundamental cognitive theories and principles that will be used to define the SVS visualizations; associated Augmented Reality (AR) approaches or properties (e.g., temporal, physical, and perceptual); a detailed discussion of design trade-offs as they relate to hardware and software capabilities (e.g., 2D vs. 3D visualization, egocentric vs. allocentric registration); and a description of the proposed methods, metrics, and analysis for designing and evaluating the visualizations. In addition, Phase II plans should be provided, to include a list of potential hardware and software that will be used to demonstrate proof-of-concept visualizations, critical technical milestones, and plans for testing and validating the proposed data visualizations. Finally, Phase I should also include the processing and submission of any human subjects research protocols necessary for Phase II research.
PHASE II: Develop, demonstrate, and evaluate proof of concept SVS information/data visualizations based on preliminary design requirements generated in Phase I. Appropriate engineering testing will be performed, along with a critical design review and finalizing the design of proposed visualizations. Phase II deliverables will include: working proof of concept visualizations, specifications for their development, and demonstration, validation, and report of results showing capability of visualizations to support warfighter decision making and situational awareness as they relate to call for fire and close air support.
PHASE III: The performer will be expected to support the Marine Corps in transitioning the requirements and associated software products to support the development of Synthetic Vision System (SVS) training aids and situational awareness (SA) visualizations. The software products are expected to integrate with and/or support Marine Corps training simulation systems (e.g., the Augmented Immersive Team Trainer), and transition will require certifying and qualifying the system for Marine Corps use, delivering a Marine Corps design manual for the product, and providing Marine Corps system specification materials. Private Sector Commercial Potential: From a commercial perspective, the resulting design methods, principles, and proof-of-concept visualizations will be applicable to high-risk/high-demand work domains with large integrated information demands, such as law enforcement, emergency response, healthcare, and manufacturing. The general findings of this effort are anticipated to contribute broadly to our understanding of AR information and data visualization design, with implications for AR interfaces outside the military.
REFERENCES:
1. Naval Research Advisory Committee (2009). Immersive Simulation for Marine Corps Small Unit Training. Retrieved 6 June 2016 from http://www.nrac.navy.mil/docs/2009_rpt_Immersive_Sim.pdf
2. Schaffer, R., Cullen, S., Cerritelli, L., Kumar, R., Samarasekera, S., Sizintsev, M., Oskiper, T., & Branzoi, V. (2015). Mobile Augmented Reality for Force-on-Force Training. In Proceedings of the Interservice/Industry Training, Simulation & Education Conference. Arlington, VA: National Training and Simulation Association.
3. Bailey, R. E. (2012). Awareness and detection of traffic and obstacles using synthetic and enhanced vision systems. Retrieved 6 June 2016 from http://ntrs.nasa.gov/search.jsp?R=20120001338
4. Wickens, C. D., & Alexander, A. L. (2009). Attentional tunneling and task management in synthetic vision displays. The International Journal of Aviation Psychology, 19(2), 182-199.
5. Bennett, K. B., & Flach, J. M. (1992). Graphical displays: Implications for divided attention, focused attention, and problem solving. Human Factors: The Journal of the Human Factors and Ergonomics Society, 34(5), 513-533.
6. Tönnis, M., Plecher, D. A., & Klinker, G. (2013). Representing information: Classifying the Augmented Reality presentation space. Computers & Graphics, 37(8), 997-1011.
KEYWORDS: Augmented Reality (AR); Heads-up Display (HUD); Helmet-mounted Display (HMD); Decision Making; Synthetic Vision System (SVS); Attention

Overview

The Department of the Navy announced the SBIR Phase I/II topic "Synthetic Vision System for Ground Forces" on 11/30/16. Applications for topic N171-091 (2017) open on 01/10/17 and close on 02/08/17.

Program Details

Est. Value
$50,000 - $250,000 (Phase I) or $750,000 (Phase II)
Duration
6 Months - 1 Year
Size Limit
500 Employees

Awards

Contract and grant awards for topic N171-091 2017