Towards Lossless Implicit Neural Representation via Bit Plane Decomposition

28 February 2025
Woo Kyoung Han
Byeonghun Lee
Hyunmin Cho
Sunghoon Im
Kyong Hwan Jin
Abstract

We quantify the upper bound on the size of an implicit neural representation (INR) model from a digital perspective. This upper bound grows exponentially with the required bit precision. Motivated by this, we present a bit-plane decomposition method that makes an INR predict bit planes, producing the same effect as reducing the upper bound on the model size. We validate our hypothesis that reducing this upper bound leads to faster convergence at constant model size. Our method achieves lossless representation in 2D image and audio fitting, even for high bit-depth signals such as 16-bit, which was previously unachievable. We are the first to identify a bit bias, whereby an INR prioritizes the most significant bit (MSB). We expand the application of INR to bit-depth expansion, lossless image compression, and extreme network quantization. Our source code is available at this https URL
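The core operation the abstract describes — splitting a high bit-depth signal into binary bit planes so that the INR fits each plane rather than the full-precision value — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names are illustrative.

```python
def to_bit_planes(x, bits=16):
    """Decompose an unsigned integer into its bit planes, MSB first.

    Each returned element is 0 or 1; an INR can then fit these binary
    planes instead of the full bits-deep value.
    """
    return [(x >> b) & 1 for b in reversed(range(bits))]

def from_bit_planes(planes):
    """Exactly reconstruct the integer from its bit planes (MSB first)."""
    x = 0
    for bit in planes:
        x = (x << 1) | bit
    return x

# Round-trip a 16-bit sample value losslessly.
sample = 40000
planes = to_bit_planes(sample, bits=16)
assert from_bit_planes(planes) == sample
```

Because the decomposition is exact, reconstruction from perfectly predicted planes is lossless by construction, which is what enables lossless fitting of 16-bit signals.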

@article{han2025_2502.21001,
  title={Towards Lossless Implicit Neural Representation via Bit Plane Decomposition},
  author={Woo Kyoung Han and Byeonghun Lee and Hyunmin Cho and Sunghoon Im and Kyong Hwan Jin},
  journal={arXiv preprint arXiv:2502.21001},
  year={2025}
}