Grants and Contracts Details
This proposal presents a collaborative research and education effort in Visual and Experiential Computing to provide the user with superhuman visual enhancement by means of a hand-held spectral imager. The research is organized around two principal goals.

The first goal of the research program is to develop novel techniques in compressive sensing that facilitate the integration of Coded Aperture Snapshot Spectral Imaging (CASSI) in the visible and near-IR range with a commodity time-of-flight (TOF) image sensor. If successful, our proposed methods will represent a significant improvement over existing RGBD cameras, which integrate two separate image sensors (one for RGB and another for depth): the new design will use only a single sensor, expand the observable wavelength range from the visible into the near-IR, and increase the number of color primaries to as many as 64 independent channels. It will also increase the resolution of the depth sensor to match that of the DMD array, which can be anywhere from 4x to 54x the resolution of the TOF sensor. To mitigate the computational complexity of compressive-sensing inverse image reconstruction algorithms, which often precludes their use in real-time implementations, we intend to build on our previous work on coded aperture schemes that give rise to divide-and-conquer image reconstruction algorithms implementable on GPUs and other multi-core processors. This is a major component of the proposal.

The second goal of the research program is to design and assemble a proof-of-concept prototype camera that integrates a DMD array with a CMOS TOF image sensor in a package comparable to a pico-projector light engine. Such a system, derived from the modification of a commercially available intra-oral dental scanner, could be mass produced and interfaced with a commodity smartphone, allowing for its use in both novel and mundane ways.
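The inverse reconstruction problem underlying coded-aperture compressive sensing can be illustrated with a minimal sketch. The sensing matrix, problem sizes, and solver below are illustrative stand-ins, not the proposal's CASSI coded-aperture model or its divide-and-conquer GPU algorithm; this just shows the generic sparse-recovery step (ISTA, iterative soft-thresholding) that such reconstructions build on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem sizes (assumed for illustration): a length-n sparse signal
# observed through m < n random linear measurements.
n, m, k = 256, 96, 8
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # stand-in sensing matrix
y = Phi @ x_true                                 # compressed measurements

def ista(Phi, y, lam=0.01, iters=2000):
    """Recover a sparse x from y = Phi @ x via iterative soft-thresholding."""
    L = np.linalg.norm(Phi, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = x - (Phi.T @ (Phi @ x - y)) / L      # gradient step on 0.5*||Phi x - y||^2
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold
    return x

x_hat = ista(Phi, y)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

In a real CASSI pipeline, `Phi` encodes the coded aperture and spectral dispersion rather than random entries, and the divide-and-conquer structure the proposal describes would partition this solve into independent subproblems suitable for GPU execution.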
Effective start/end date: 9/1/15 → 8/31/19
- National Science Foundation: $484,952.00