Image-Net: Discriminatory Imaging and Network Advancement for Missiles, Aviation, and Space: 3D Vision Team

Grants and Contracts Details

Description

The over-arching objective of Image-Net is to demonstrate elevated war-fighter preparedness through enhanced battle management imaging technology, real-time mission preparation, and advanced mission training and rehearsal methods based on fused sensor data, special-purpose data transport, improved systems engineering techniques, and immersive three-dimensional scene generation (3DSG) environments. This proposal conducts research, development, tests, and demonstrations that enhance the technology and systems required for: the collection of spatial data over wide areas from distributed, heterogeneous sensor packages carried as payloads on mobile platforms; the fusion of 2D, 3D, real-time full motion video (FMV), and archival data into higher-level models; the construction of information to support applications such as training and mission rehearsal, with stringent requirements for success; and the requirements process for the development of complex systems to support time-constrained decision-making. This initiative, conducted by the University of Kentucky (UK) Team consisting of the UK College of Engineering’s Center for Visualization and Virtual Environments and Radiance Technologies, will address major technological areas related by the central theme of 3DSG, including: (1) two- and three-dimensional information gathering, modeling, and scene generation; (2) immersive display environments; (3) mobile sensor platforms, air (UAV) and ground; (4) software requirements generation, analysis, and tool-based decision-making; (5) network protocols, distributed service-oriented architectures (SOA), and system simulations under performance constraints; and (6) application of research prototypes to military scenarios. The scope of work outlines tasks and related subtasks for a base period and four options, each approximately 12 months.
The base year establishes an advanced test bed environment that will integrate TRL-4 level technologies with existing data management, networking, and Full Motion Video tools such as the U.S. Army Space and Missile Defense Command (USASMDC) Future Warfare Center’s (FWC) Advanced Warfare Environment (AWarE). This technology advancement provides a virtual and interactive training environment that can be tested and exercised by the Joint Training Counter Improvised Explosive Device (IED) Operations Integration Center (JTCOIC) as an independent government evaluator.

Option One builds on the base year archived-data training prototype by expanding the ability to introduce near-real-time and networked data into the immersive 3D environment, enabling realistic and time-relevant mission rehearsal for battle staffs and deployed forces. Option Two advances and matures Image-Net research technology, prototypes, and test bed functionality for sensor fusion, real-time data capture and control, and enhanced network performance. Option Three develops and tests a predictive-analysis and responsive course-of-action development toolkit for imagery and scene generation. Option Four extends Image-Net research results to advance visualization technologies via intelligent decision aids, imagery predictive analysis, and ad hoc global network management.

Dr. Brent Seales, Director of UK’s Visualization Center, is the Image-Net Principal Investigator (PI) with the responsibility and commensurate authority for all facets of this project. Working directly for the Dean of the College of Engineering, Dr. Seales will appoint task leads with the appropriate knowledge, experience, and resources; he will perform management and fiscal control through day-to-day interactions, weekly reports of accomplishments and issues, formal monthly meetings, and detailed quarterly in-process reviews.
Status: Finished
Effective start/end date: 9/26/11 – 9/30/12

Funding

  • Army Space and Missile Defense Command
