NeuraLUT: Hiding Neural Network Density in Boolean Synthesizable Functions

Marta Andronic, George A. Constantinides
29 February 2024

Papers citing "NeuraLUT: Hiding Neural Network Density in Boolean Synthesizable Functions"

TreeLUT: An Efficient Alternative to Deep Neural Networks for Inference Acceleration Using Gradient Boosted Decision Trees
Alireza Khataei, Kia Bazargan
Symposium on Field Programmable Gate Arrays (FPGA), 2025
02 Jan 2025
Differentiable Weightless Neural Networks
Alan T. L. Bacellar, Zachary Susskind, Mauricio Breternitz Jr., E. John, L. John, P. Lima, F. M. G. França
International Conference on Machine Learning (ICML), 2024
14 Oct 2024