
Neural Networks Use Distance Metrics

Main: 8 pages · Bibliography: 2 pages · Appendix: 2 pages · 4 figures · 4 tables
Abstract

We present empirical evidence that neural networks with ReLU and Absolute Value activations learn distance-based representations. We independently manipulate both distance and intensity properties of internal activations in trained models, finding that both architectures are highly sensitive to small distance-based perturbations while maintaining robust performance under large intensity-based perturbations. These findings challenge the prevailing intensity-based interpretation of neural network activations and offer new insights into their learning and decision-making processes.
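To make the two perturbation types concrete, the sketch below illustrates one plausible reading of the distinction for a single ReLU layer: an intensity-style perturbation rescales activation magnitudes without changing which side of the ReLU hinge each unit sits on, while a distance-style perturbation shifts pre-activations relative to the hinge at zero, changing each unit's distance to its boundary. This is a hypothetical NumPy illustration, not the paper's actual experimental code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ReLU layer: pre-activations z = W @ x + b
W = rng.normal(size=(4, 3))
b = rng.normal(size=4)
x = rng.normal(size=3)
z = W @ x + b
a = np.maximum(z, 0.0)  # ReLU activations

# Intensity-style perturbation: scale activation magnitudes.
# The sign pattern (which units are active) is unchanged, so any
# distance-to-boundary structure is preserved up to scale.
a_intensity = a * 2.0

# Distance-style perturbation: shift pre-activations relative to the
# hinge at z = 0, altering each unit's distance to its own boundary
# and potentially flipping units that sit near zero.
eps = 0.1
a_distance = np.maximum(z - eps, 0.0)
```

Under this reading, the paper's finding is that even a small `eps` (distance shift) degrades performance sharply, while a large scale factor (intensity change) does not.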
