OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Space Technology

The technology within this topic is restricted under the International Traffic in Arms Regulations (ITAR), 22 CFR Parts 120-130, which control the export and import of defense-related material and services, including the export of sensitive technical data, or the Export Administration Regulations (EAR), 15 CFR Parts 730-774, which control dual-use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with the Announcement. Offerors are advised that foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: The objective of this SBIR topic is to build and demonstrate data fusion algorithms that enable joint tracking between traditional imaging sensors and neuromorphic sensors. The resulting research should deliver data fusion algorithms that enable and inform search-and-track concepts of operations while improving track fidelity in the temporal domain using the neuromorphic data and improving target discrimination using traditional imagers.

DESCRIPTION: Neuromorphic or event-based sensors (EBS) offer an alternative sensing paradigm in which pixels operate asynchronously and report only scene dynamics, in the form of (x, y, t, p) messages, where x and y are pixel indices, t is a timestamp with microsecond precision, and p is a polarity bit indicating an increasing or decreasing change [1]. The result is a sensor with high temporal resolution combined with significantly reduced power consumption and overall data volume compared to traditional high-speed video. The ability to measure fleeting temporal dynamics at reduced power and computational cost has prompted a number of studies leveraging this sensing technology for applications such as moving target engagement. This in turn requires methods by which neuromorphic sensing data streams can be integrated with and leveraged alongside traditional sensing modalities.

This topic seeks solutions that demonstrate the data fusion of neuromorphic and framing data such that tracking information from framing imagers can be temporally upsampled through the use of event-based data streams. Ultimately, this should be implemented and deployed in a processing architecture well suited for onboard processing hardware, which is naturally constrained by size, weight, and power. Although fused tracking data should be the end product of the delivered software, solutions should also include concept-of-operations considerations that enable geo-registration, clutter suppression, sensor tip and cue, and track handoff. Finally, the study should include a trade space analysis to inform the ideal measurement cadence between the sensors and the ideal homography required to map between these modalities.

PHASE I: Phase I research should consist of the development and demonstration of candidate tracking algorithms that can be applied independently to frame-based imaging data and event-based sensing data, respectively. Awardee(s) should also identify a viable route by which data fusion will occur in the Phase II research. Proposed candidate algorithms should be computationally lightweight, have a viable path toward the intended data fusion pipeline, and be executable within onboard processing hardware.
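The (x, y, t, p) event stream and the framing imager live on different pixel grids, so any fusion pipeline first needs a registration step. The following is a minimal sketch of the homography mapping mentioned above, assuming events arrive as a NumPy structured array with fields x, y, t, and p and that a 3x3 homography H between the two focal planes has already been estimated; the function and field names are illustrative, not prescribed by this topic.

```python
import numpy as np

def map_events_to_frame(events, H):
    """Project event pixel coordinates into the framing imager's pixel grid.

    events: structured array with fields x, y, t, p (the (x, y, t, p)
            messages described above).
    H:      3x3 homography between the event-sensor and framing-imager
            focal planes (how it is estimated is left open; it is
            assumed given here).
    """
    # Homogeneous coordinates, one column per event.
    pts = np.stack([events["x"].astype(float),
                    events["y"].astype(float),
                    np.ones(len(events))], axis=0)
    mapped = H @ pts
    mapped /= mapped[2]            # perspective normalization
    return mapped[0], mapped[1]    # (u, v) in the framing imager's plane
```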
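One candidate route for the fusion itself, again illustrative rather than prescribed, is a constant-velocity Kalman filter that updates at the framing imager's cadence with frame detections and, between frames, with centroids of short event-burst windows mapped through the homography above. This is one simple way to realize the temporal upsampling this topic describes; all noise values below are placeholders.

```python
import numpy as np

class FusedTrack:
    """Constant-velocity Kalman filter over state [u, v, du, dv].

    Frame detections arrive at the framing imager's cadence; event-window
    centroids arrive in between and temporally upsample the track.
    """
    def __init__(self, u, v, t):
        self.x = np.array([u, v, 0.0, 0.0])    # position and velocity
        self.P = np.eye(4) * 10.0              # state covariance (placeholder)
        self.Q = np.eye(4) * 1e-2              # process noise (placeholder)
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])  # position is observed directly
        self.t = t

    def predict(self, t):
        dt = t - self.t
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt                 # position integrates velocity
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q * dt
        self.t = t

    def update(self, z, r):
        # r: measurement variance; notionally larger for noisy event-burst
        # centroids than for frame detections.
        R = np.eye(2) * r
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

In such a scheme, the event-driven updates would arrive at whatever cadence the trade space analysis favors, while each frame-rate detection anchors the track with the discrimination-quality intensity measurement.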
PHASE II: Phase II will take the candidate algorithms identified in Phase I and prove out the data fusion using real sensor data representative of overhead sensing concepts of operations. This should involve working with co-sighted sensors in which the framing sensor field of view is much larger than the event-based sensor field of view. Success of this data fusion will be measured by the ability to provide a continuous, high-fidelity track by temporally upsampling with the event-based sensor data. Algorithms should also demonstrate the capability for improved target discrimination using intensity data from the framing imager. Finally, the processing pipeline used to detect, track, and fuse will be integrated and run as a laboratory prototype identified as well suited for onboard processing. Performers will report tracking performance using annotated data to determine probability of detection and false alarm, sensitivity limits of the final prototype, temporal performance/real-time capability, and power requirements. Prototypes should utilize surrogate data to demonstrate real-time tracking, data fusion, and track products. Laboratory prototypes and associated software products will be delivered to the government technical point of contact and validated for potential field deployment using real-time scene simulation and test sensors.

PHASE III DUAL USE APPLICATIONS: Phase III will optimize and repackage the laboratory prototype for integration into an overhead field demonstration. The field demonstration will include ground truth targets and emulate a tasking, collection, processing, exploitation, and dissemination pipeline to assess the accuracy, latency, and value added for warfighter applications. This will validate the ability of the solution to be deployed and transitioned to a space platform.

REFERENCES:
1. G. Gallego et al., "Event-Based Vision: A Survey," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 44, no. 1, pp. 154-180, Jan. 2022, doi: 10.1109/TPAMI.2020.3008413.
2. D. Gehrig, H. Rebecq, G. Gallego, and D. Scaramuzza, "Asynchronous, Photometric Feature Tracking Using Events and Frames," in Computer Vision - ECCV 2018, Lecture Notes in Computer Science, vol. 11216, Springer, Cham, 2018. https://doi.org/10.1007/978-3-030-01258-8_46.
3. S. Tulyakov, A. Bochicchio, D. Gehrig, S. Georgoulis, Y. Li, and D. Scaramuzza, "Time Lens++: Event-based Frame Interpolation with Parametric Non-linear Flow and Multi-scale Fusion," 2022. https://doi.org/10.48550/arXiv.2203.17191.
4. T. Stoffregen, G. Gallego, T. Drummond, L. Kleeman, and D. Scaramuzza, "Event-Based Motion Segmentation by Motion Compensation," 2019 IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Korea (South), 2019, pp. 7243-7252, doi: 10.1109/ICCV.2019.00734.

KEYWORDS: Neuromorphic; Event-Based Sensing; Computer Vision; Computational Imaging; Asynchronous Sensor; Real-Time Processing; Remote Sensing; Tracking; On-Board Processing; Data Fusion; Algorithm Development
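As an illustration of the Phase II tracking-performance reporting described above, the sketch below scores annotated data for probability of detection and false alarms; the data layout, helper name, and gating radius are assumptions for illustration, not requirements of this topic.

```python
import numpy as np

def score_tracks(truth, tracks, gate_px=3.0):
    """Probability of detection and false alarm count from annotated data.

    truth:  dict mapping timestamp -> (u, v) ground-truth target position
    tracks: dict mapping timestamp -> list of (u, v) reported track positions
    A report within gate_px pixels of truth counts as a detection; all other
    reports count as false alarms. The gate size is notional.
    """
    detections, false_alarms = 0, 0
    for t, (u, v) in truth.items():
        reports = tracks.get(t, [])
        dists = [np.hypot(ru - u, rv - v) for ru, rv in reports]
        hit = any(d <= gate_px for d in dists)
        detections += hit
        false_alarms += len(reports) - (1 if hit else 0)
    pd = detections / max(len(truth), 1)
    return pd, false_alarms
```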