
Neural 360° Structured Light with Learned Metasurfaces

Abstract

Structured light has proven instrumental in 3D imaging, LiDAR, and holographic light projection. Metasurfaces, composed of sub-wavelength nanostructures, enable 180° field-of-view (FoV) structured light, circumventing the restricted FoV inherent in traditional optics such as diffractive optical elements. However, existing metasurface-based structured light exhibits sub-optimal performance in downstream tasks because it relies on heuristic pattern designs, such as periodic dots, that do not consider the objectives of the end application. In this paper, we present neural 360° structured light driven by learned metasurfaces. We propose a differentiable framework that encompasses a computationally efficient 180° wave propagation model and a task-specific reconstructor, and that exploits both the transmission and reflection channels of the metasurface. Using a first-order optimizer within this differentiable framework, we optimize the metasurface design, thereby realizing neural 360° structured light. We apply neural 360° structured light to holographic light projection and 3D imaging. Specifically, we demonstrate the first 360° light projection of complex patterns, enabled by our propagation model, which can be evaluated 50,000× faster than Rayleigh-Sommerfeld propagation. For 3D imaging, we improve depth-estimation accuracy by 5.09× in RMSE compared to heuristically designed structured light. Neural 360° structured light promises robust 360° imaging and display for robotics, extended-reality systems, and human-computer interaction.
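As a rough illustration of the differentiable-framework idea described above (not the paper's actual model or code), the sketch below optimizes a metasurface phase profile with a first-order optimizer against a target projection pattern. The propagation step is a toy Fraunhofer (Fourier) far-field model standing in for the paper's 180° propagation model, and all names, sizes, and the learning rate are hypothetical choices made for illustration.

import jax
import jax.numpy as jnp

N = 256                                           # hypothetical metasurface resolution
key = jax.random.PRNGKey(0)
phase = jax.random.uniform(key, (N, N), minval=0.0, maxval=2 * jnp.pi)

# Hypothetical target intensity pattern to project (a centered square), normalized.
target = jnp.zeros((N, N)).at[96:160, 96:160].set(1.0)
target = target / target.sum()

def propagate(phase):
    # Placeholder far-field model: Fraunhofer (Fourier) propagation of a
    # unit-amplitude field modulated by the metasurface phase. The paper's
    # 180-degree propagation model would replace this step.
    field = jnp.exp(1j * phase)
    far_field = jnp.fft.fftshift(jnp.fft.fft2(field))
    intensity = jnp.abs(far_field) ** 2
    return intensity / intensity.sum()

def loss_fn(phase):
    # Task loss: mean squared error between the projected and target patterns.
    return jnp.mean((propagate(phase) - target) ** 2)

@jax.jit
def step(phase, lr=1e2):
    # Plain first-order gradient descent on the phase profile.
    loss, grad = jax.value_and_grad(loss_fn)(phase)
    return phase - lr * grad, loss

for i in range(200):
    phase, loss = step(phase)

Plain gradient descent is used here for clarity; in practice a first-order optimizer such as Adam and the paper's task-specific reconstructor loss would take the place of the simple update and MSE objective shown above.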
