
Topology-Aware Activation Functions in Neural Networks

Pavel Snopov
Oleg R. Musin
Main: 5 pages · 2 figures · 1 table · Bibliography: 1 page
Abstract

This study explores novel activation functions that enhance the ability of neural networks to manipulate data topology during training. Building on the limitations of traditional activation functions like ReLU, we propose SmoothSplit and ParametricSplit, which introduce topology "cutting" capabilities. These functions enable networks to transform complex data manifolds effectively, improving performance in scenarios with low-dimensional layers. Through experiments on synthetic and real-world datasets, we demonstrate that ParametricSplit outperforms traditional activations in low-dimensional settings while maintaining competitive performance in higher-dimensional ones. Our findings highlight the potential of topology-aware activation functions in advancing neural network architectures. The code is available via this https URL.
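The abstract does not give the formulas for SmoothSplit or ParametricSplit, but the "cutting" idea can be illustrated with a hypothetical split-style activation: a function with a jump discontinuity at zero pushes points on either side of the hyperplane apart, disconnecting a data manifold that a continuous map like ReLU could only stretch or collapse. The function names, the `gap` and `sharpness` parameters, and the exact forms below are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def split_activation(x, gap=1.0):
    # Hypothetical hard "split": shift positive and negative inputs
    # apart by `gap`, creating a discontinuity at 0 that can "cut"
    # a connected manifold into separate components.
    return np.where(x >= 0, x + gap, x - gap)

def smooth_split(x, gap=1.0, sharpness=10.0):
    # Hypothetical smooth variant: approximate the jump with tanh,
    # keeping the function differentiable for gradient-based training.
    return x + gap * np.tanh(sharpness * x)

# Points just left and right of 0 end up far apart after the split.
print(split_activation(np.array([-0.1, 0.1])))   # roughly [-1.1, 1.1]
```

A smooth approximation of this kind is what makes such a "cut" trainable by backpropagation, at the cost of the cut being only approximate for finite sharpness.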
