On the Information Capacity of Nearest Neighbor Representations

International Symposium on Information Theory (ISIT), 2023
Abstract

The von Neumann computer architecture draws a distinction between computation and memory. In contrast, the brain has an integrated architecture in which computation and memory are indistinguishable. Motivated by the architecture of the brain, we propose a model of associative computation where memory is defined by a set of vectors in R^n (which we call anchors), computation is performed by convergence from an input vector to a nearest neighbor anchor, and the output is a label associated with an anchor. Specifically, in this paper we study the representation of Boolean functions in the associative computation model, where the inputs are binary vectors and the corresponding outputs are the labels (0 or 1) of the nearest neighbor anchors. The information capacity of a Boolean function in this model is associated with two quantities: (i) the number of anchors (called Nearest Neighbor (NN) Complexity) and (ii) the maximal number of bits representing the entries of anchors (called Resolution). We study symmetric Boolean functions and present constructions that have optimal NN complexity and resolution.
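As an illustration of the model (not the paper's optimal construction), a symmetric Boolean function depends only on the Hamming weight of its input, so one natural anchor placement puts one anchor per weight k on the main diagonal of the cube, at the constant point (k/n, ..., k/n), labeled by the function's value on weight-k inputs. For an input of weight w, the squared distance to the k-th such anchor is (k - w)^2/n plus a term independent of k, so the nearest anchor is always k = w. The sketch below verifies this for parity on n = 4 bits; the anchor placement is an assumption chosen for the example.

```python
from itertools import product

def nn_label(x, anchors):
    """Return the label of the anchor nearest to x in squared Euclidean distance."""
    return min(anchors, key=lambda a: sum((xi - ai) ** 2 for xi, ai in zip(x, a[0])))[1]

n = 4
parity = lambda x: sum(x) % 2

# One anchor per Hamming weight k = 0..n, at the constant point (k/n, ..., k/n),
# labeled by the symmetric function's value on inputs of weight k.
anchors = [((k / n,) * n, k % 2) for k in range(n + 1)]

# The nearest anchor to any input of weight w is the one with k = w,
# so the n + 1 anchors represent parity exactly on all 2^n inputs.
assert all(nn_label(x, anchors) == parity(x) for x in product((0, 1), repeat=n))
```

This uses n + 1 anchors; the paper's contribution is constructions for symmetric functions that are simultaneously optimal in the number of anchors and in the resolution (bits per anchor entry), which this toy placement does not attempt to optimize.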
