
HandOcc: NeRF-based Hand Rendering with Occupancy Networks

Abstract

We propose HandOcc, a novel framework for occupancy-based hand rendering. Popular rendering methods such as NeRF are often combined with parametric meshes to provide deformable hand models. However, such approaches present a trade-off between the fidelity of the mesh and the complexity and dimensionality of the parametric model. The simplicity of parametric mesh structures is appealing, but it binds methods to mesh initialization, making them unable to generalize to objects for which no parametric model exists. It also ties estimation to mesh resolution and the accuracy of mesh fitting. This paper presents a pipeline for meshless 3D rendering, which we apply to hands. Given only a 3D skeleton, the desired appearance is extracted via a convolutional model. We achieve this by exploiting a NeRF renderer conditioned on an occupancy-based representation. The approach uses hand occupancy to resolve hand-to-hand interactions, further improving results, and enables fast rendering and excellent hand-appearance transfer. We achieve state-of-the-art results on the benchmark InterHand2.6M dataset.

@article{ivashechkin2025_2505.02079,
  title={HandOcc: NeRF-based Hand Rendering with Occupancy Networks},
  author={Maksym Ivashechkin and Oscar Mendez and Richard Bowden},
  journal={arXiv preprint arXiv:2505.02079},
  year={2025}
}