CLaRa: Bridging Retrieval and Generation with Continuous Latent Reasoning

24 November 2025
Jie He
Richard He Bai
Sinead Williamson
Jeff Z. Pan
Navdeep Jaitly
Yizhe Zhang
RALM · VLM · LRM
ArXiv (abs) · PDF · HTML · HuggingFace (5 upvotes) · GitHub (9★)
Main: 11 pages · Appendix: 26 pages · Bibliography: 4 pages · 14 figures · 13 tables
Abstract

Retrieval-augmented generation (RAG) enhances large language models (LLMs) with external knowledge, but it still suffers from long input contexts and the disjoint optimization of retrieval and generation. In this work, we propose CLaRa (Continuous Latent Reasoning), a unified framework that performs embedding-based compression and joint optimization in a shared continuous space. To obtain semantically rich and retrievable compressed vectors, we introduce SCP, a key-preserving data synthesis framework that uses question-answering (QA) and paraphrase supervision. CLaRa then trains the reranker and generator end-to-end under a single language-modeling loss, with gradients flowing through both modules via a differentiable top-k estimator. Theoretically, this unified optimization aligns retrieval relevance with answer quality. Experiments across multiple QA benchmarks show that CLaRa achieves state-of-the-art compression and reranking performance, often surpassing text-based fine-tuned baselines.
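The abstract mentions a differentiable top-k estimator that lets the language-modeling loss backpropagate through the reranker's discrete selection, but it does not say which estimator CLaRa uses. The sketch below shows one common construction (a straight-through relaxation) in PyTorch; all names, shapes, and the toy loss are hypothetical illustrations, not the paper's implementation.

import torch

def straight_through_topk(scores: torch.Tensor, k: int, tau: float = 1.0) -> torch.Tensor:
    """One possible differentiable top-k: a hard {0,1} mask over the top-k
    scores in the forward pass, with gradients routed through a softmax
    relaxation in the backward pass (CLaRa may use a different estimator).
    """
    soft = torch.softmax(scores / tau, dim=-1)        # relaxed weights, differentiable
    topk_idx = scores.topk(k, dim=-1).indices         # hard, non-differentiable selection
    hard = torch.zeros_like(scores).scatter(-1, topk_idx, 1.0)
    # Straight-through trick: hard values forward, soft gradients backward.
    return hard + (soft - soft.detach())

# Hypothetical usage: gate 8 compressed document vectors with the mask and
# let a stand-in loss (in CLaRa, the generator's LM loss) update the scores.
doc_vecs = torch.randn(8, 16)                     # candidate latent document vectors
scores = torch.randn(8, requires_grad=True)       # reranker scores
mask = straight_through_topk(scores, k=2)
selected = (mask.unsqueeze(-1) * doc_vecs).sum(dim=0)
loss = selected.pow(2).mean()                     # placeholder for the LM loss
loss.backward()
print(scores.grad)                                # nonzero: the reranker receives training signal

Because the forward pass applies the exact hard mask, the generator only ever conditions on the k selected vectors, while the soft backward path is what lets a single loss align retrieval relevance with answer quality.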
