Token Prepending: A Training-Free Approach for Eliciting Better Sentence Embeddings from LLMs

16 December 2024

Authors: Yuchen Fu, Zifeng Cheng, Zhiwei Jiang, Zhonghui Wang, Yafeng Yin, Zhengliang Li, Qing Gu
Tags: LLMAG

Links: ArXiv · PDF · HTML

Papers citing "Token Prepending: A Training-Free Approach for Eliciting Better Sentence Embeddings from LLMs"

No citing papers found.