ResearchTrend.AI

Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations
arXiv:2202.06499
14 February 2022
G. Shamir, Dong Lin
HAI, OffRL

Papers citing "Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations"

4 / 4 papers shown
Anti-Distillation: Improving reproducibility of deep networks
G. Shamir, Lorenzo Coviello
19 Oct 2020

TanhExp: A Smooth Activation Function with High Convergence Speed for Lightweight Neural Networks
Xinyu Liu, Xiaoguang Di
22 Mar 2020

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML
09 Apr 2018

Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles
Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell
UQCV, BDL
05 Dec 2016