arXiv:2206.13280
Expressive power of binary and ternary neural networks

27 June 2022
A. Beknazaryan
Abstract

We show that deep sparse ReLU networks with ternary weights and deep ReLU networks with binary weights can approximate $\beta$-Hölder functions on $[0,1]^d$. Also, continuous functions on $[0,1]^d$ can be approximated by networks of depth $2$ with binary activation function $\mathds{1}_{[0,1)}$.
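To illustrate the depth-2 result in one dimension ($d=1$): with activation $\mathds{1}_{[0,1)}$, a hidden unit computing $\mathds{1}_{[0,1)}(nx-k)$ fires exactly when $x\in[k/n,(k+1)/n)$, so an output layer with weights $f(k/n)$ yields a piecewise-constant approximant. The sketch below implements this standard construction; it is an illustration of the idea, not necessarily the paper's exact proof, and the function names (`binary_act`, `depth2_net`) are hypothetical.

```python
import math

def binary_act(z):
    """Binary activation 1_{[0,1)}: 1.0 if 0 <= z < 1, else 0.0."""
    return 1.0 if 0.0 <= z < 1.0 else 0.0

def depth2_net(f, n):
    """Depth-2 network with n hidden units approximating f on [0,1].

    Hidden unit k computes binary_act(n*x - k), which fires exactly
    when x lies in [k/n, (k+1)/n); the output layer weights it by f(k/n),
    giving a piecewise-constant approximation with step width 1/n.
    """
    out_weights = [f(k / n) for k in range(n)]

    def net(x):
        return sum(w * binary_act(n * x - k)
                   for k, w in enumerate(out_weights))

    return net

# For a 1-Lipschitz function such as sin, the sup error on [0,1) is
# at most the step width 1/n.
net = depth2_net(math.sin, 1000)
err = max(abs(net(x) - math.sin(x)) for x in [i / 997 for i in range(997)])
```

Since `sin` is 1-Lipschitz, `err` is bounded by the step width $1/n = 0.001$, matching the rate one expects for piecewise-constant approximation of continuous functions.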
