What Should Embeddings Embed? Autoregressive Models Represent Latent Generating Distributions

6 June 2024
Liyi Zhang, Michael Y. Li, Thomas L. Griffiths

Papers citing "What Should Embeddings Embed? Autoregressive Models Represent Latent Generating Distributions"

3 papers shown
How Do Transformers Learn Topic Structure: Towards a Mechanistic Understanding
Yuchen Li, Yuan-Fang Li, Andrej Risteski
07 Mar 2023
Probing Classifiers: Promises, Shortcomings, and Advances
Yonatan Belinkov
24 Feb 2021
What you can cram into a single vector: Probing sentence embeddings for linguistic properties
Alexis Conneau, Germán Kruszewski, Guillaume Lample, Loïc Barrault, Marco Baroni
03 May 2018