Efficient Sampled Softmax for Tensorflow

Abstract

This short paper presents an efficient implementation of \emph{sampled softmax loss} for TensorFlow. The speedup over the default implementation comes from simplifying the graph for the forward and backward passes.
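For context, sampled softmax approximates the full softmax cross-entropy by scoring only the target class and a small set of sampled negatives, subtracting the log of each class's sampling probability to keep the estimate unbiased. The following is a minimal NumPy sketch of that idea (an illustration of the general technique, not the paper's TensorFlow implementation; all names here are illustrative):

```python
import numpy as np

def sampled_softmax_loss(weights, biases, hidden, target, sampled, q):
    """Sampled softmax loss for a single example (illustrative sketch).

    weights: [num_classes, dim] output embedding matrix
    biases:  [num_classes] output biases
    hidden:  [dim] hidden activation for this example
    target:  index of the true class
    sampled: indices of sampled negative classes
    q:       [num_classes] sampling probability of each class
    """
    # Score only the target plus the sampled negatives.
    classes = np.array([target] + list(sampled))
    logits = weights[classes] @ hidden + biases[classes]
    # Log-Q correction: account for the sampling distribution.
    logits -= np.log(q[classes])
    # Numerically stable softmax cross-entropy; target sits at index 0.
    logits -= logits.max()
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])
```

With a uniform sampling distribution the log-Q correction is a constant and cancels inside the softmax, so sampling every negative class reproduces the exact full-softmax loss; in practice one samples far fewer classes, trading a little bias and variance for a much smaller graph.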
