
Synthetic Categorical Restructuring large Or How AIs Gradually Extract Efficient Regularities from Their Experience of the World

Abstract

How do language models segment their internal experience of the world of words so as to progressively learn to interact with it more efficiently? This study in the neuropsychology of artificial intelligence investigates the phenomenon of synthetic categorical restructuring, a process through which each successive perceptron layer abstracts and combines relevant categorical sub-dimensions from the thought categories of its preceding layer. This process shapes new, even more efficient categories for analyzing and processing the synthetic system's own experience of the linguistic external world to which it is exposed. Our genetic neuron viewer, associated with this study, allows visualization of the synthetic categorical restructuring phenomenon that occurs in the transition from perceptron layer 0 to layer 1 of GPT2-XL.
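
To make the layer-to-layer restructuring idea concrete, below is a minimal, illustrative sketch, not the paper's actual genetic neuron viewer or methodology, of one way to trace which layer-0 MLP neurons of GPT2-XL contribute most strongly, through the inter-layer weight path alone, to a given layer-1 MLP neuron. The use of Hugging Face transformers, the top_parents helper, the example neuron index, and the neglect of attention, layer normalization, and activation functions are all simplifying assumptions.

import torch
from transformers import GPT2LMHeadModel

# Load GPT-2 XL (substitute "gpt2" for a quicker, lighter-weight run).
model = GPT2LMHeadModel.from_pretrained("gpt2-xl")

# GPT-2 stores MLP weights as Conv1D modules with shape [in_features, out_features].
W_out_0 = model.transformer.h[0].mlp.c_proj.weight  # layer-0 MLP output projection: [4*d_model, d_model]
W_in_1 = model.transformer.h[1].mlp.c_fc.weight     # layer-1 MLP input projection:  [d_model, 4*d_model]

# Composite weight path from layer-0 MLP neurons to layer-1 MLP pre-activations,
# ignoring attention, layer norm, and nonlinearities (a deliberate simplification).
parenthood = W_out_0 @ W_in_1  # [4*d_model, 4*d_model]

def top_parents(child_neuron: int, k: int = 10):
    """Return the k layer-0 neurons whose output weights align most with a layer-1 neuron."""
    scores = parenthood[:, child_neuron]
    values, indices = torch.topk(scores.abs(), k)
    return list(zip(indices.tolist(), values.tolist()))

# Example: candidate "parent" neurons for an arbitrary layer-1 neuron.
print(top_parents(child_neuron=123))

Under these simplifying assumptions, large entries of parenthood point to layer-0 dimensions that a layer-1 neuron recombines, which is the kind of inter-layer relationship the viewer associated with the study is meant to visualize.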

@article{pichat2025_2503.10643,
  title={Synthetic Categorical Restructuring large Or How AIs Gradually Extract Efficient Regularities from Their Experience of the World},
  author={Michael Pichat and William Pogrund and Paloma Pichat and Armanouche Gasparian and Samuel Demarchi and Martin Corbet and Alois Georgeon and Theo Dasilva and Michael Veillet-Guillem},
  journal={arXiv preprint arXiv:2503.10643},
  year={2025}
}