ResearchTrend.AI
Can NeRFs See without Cameras?

28 May 2025
Chaitanya Amballa
Sattwik Basu
Yu-Lin Wei
Zhijian Yang
Mehmet Ergezer
Romit Roy Choudhury
Abstract

Neural Radiance Fields (NeRFs) have been remarkably successful at synthesizing novel views of 3D scenes by optimizing a volumetric scene function. This scene function models how optical rays bring color information from a 3D object to the camera pixels. Radio frequency (RF) or audio signals can also be viewed as a vehicle for delivering information about the environment to a sensor. However, unlike camera pixels, an RF/audio sensor receives a mixture of signals that contain many environmental reflections (also called "multipath"). Is it still possible to infer the environment from such multipath signals? We show that, with a redesign, NeRFs can be taught to learn from multipath signals and thereby "see" the environment. As a grounding application, we aim to infer the indoor floorplan of a home from sparse WiFi measurements made at multiple locations inside the home. Although this is a difficult inverse problem, our implicitly learnt floorplans look promising and enable forward applications such as indoor signal prediction and basic ray tracing.
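The abstract refers to NeRF's volumetric scene function, which composites color along a camera ray from per-sample densities. As a point of reference for what the paper adapts to RF signals, here is a minimal sketch of the standard NeRF compositing step (the classic formulation, not this paper's multipath variant; all names are illustrative):

```python
import numpy as np

def volume_render(sigmas, colors, deltas):
    """Standard NeRF alpha compositing along a single ray.

    sigmas: (N,) volume densities at N samples along the ray
    colors: (N, 3) RGB predicted at each sample
    deltas: (N,) distances between consecutive samples
    Returns the composited (3,) pixel color.
    """
    # Opacity contributed by each sample: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance: probability the ray reaches sample i unoccluded
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    # Per-sample blending weights, then weighted sum of colors
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)
```

The paper's core difficulty is that an RF sensor does not observe one such ray per "pixel" but a superposition of many reflected paths, so this per-ray model must be redesigned before it applies.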

@article{amballa2025_2505.22441,
  title={Can NeRFs See without Cameras?},
  author={Chaitanya Amballa and Sattwik Basu and Yu-Lin Wei and Zhijian Yang and Mehmet Ergezer and Romit Roy Choudhury},
  journal={arXiv preprint arXiv:2505.22441},
  year={2025}
}