ResearchTrend.AI

The cell as a token: high-dimensional geometry in language models and cell embeddings

26 March 2025
William Gilpin
Abstract

Single-cell sequencing technology maps cells to a high-dimensional space encoding their internal activity. This process mirrors parallel developments in machine learning, where large language models ingest unstructured text by converting words into discrete tokens embedded within a high-dimensional vector space. This perspective explores how advances in understanding the structure of language embeddings can inform ongoing efforts to analyze and visualize single-cell datasets. We discuss how the context of tokens influences the geometry of embedding space, and the role of low-dimensional manifolds in shaping this space's robustness and interpretability. We highlight new developments in language modeling, such as interpretability probes and in-context reasoning, that can inform future efforts to construct and consolidate cell atlases.
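The abstract's central analogy — cells as tokens, each mapped to a point in a high-dimensional embedding space where geometric proximity reflects similarity — can be illustrated with a minimal sketch. The token names, embedding dimension, and random vectors below are purely hypothetical; the sketch only shows the generic mechanism of a token-to-vector lookup and a nearest-neighbor query by cosine similarity, not the paper's actual method or any single-cell pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary: words for a language model, or cells/cell types
# for a single-cell atlas. Each token gets a vector in a high-dim space.
vocab = ["cell_A", "cell_B", "cell_C", "cell_D"]
dim = 64  # illustrative embedding dimension
embeddings = {tok: rng.normal(size=dim) for tok in vocab}

def cosine_similarity(u, v):
    """Standard cosine similarity: closeness of direction in embedding space."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Nearest neighbor of cell_A in embedding space, ranked by similarity --
# the basic query underlying visualization and atlas-consolidation tools.
query = embeddings["cell_A"]
neighbors = sorted(
    (tok for tok in vocab if tok != "cell_A"),
    key=lambda tok: cosine_similarity(query, embeddings[tok]),
    reverse=True,
)
print("nearest neighbor of cell_A:", neighbors[0])
```

In both language models and cell atlases, the interesting structure lies in how these vectors are learned and arranged (context effects, low-dimensional manifolds), which is what the perspective discusses.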

View on arXiv
@article{gilpin2025_2503.20278,
  title={The cell as a token: high-dimensional geometry in language models and cell embeddings},
  author={William Gilpin},
  journal={arXiv preprint arXiv:2503.20278},
  year={2025}
}