
TRACE: TRansformer-based Attribution using Contrastive Embeddings in LLMs (arXiv:2407.04981)

6 July 2024
Cheng Wang
Xinyang Lu
S. Ng
Bryan Kian Hsiang Low

Papers citing "TRACE: TRansformer-based Attribution using Contrastive Embeddings in LLMs"

4 papers shown.
WASA: WAtermark-based Source Attribution for Large Language Model-Generated Data
Jingtan Wang, Xinyang Lu, Zitong Zhao, Zhongxiang Dai, Chuan-Sheng Foo, See-Kiong Ng, K. H. Low
01 Oct 2023
Training language models to follow instructions with human feedback
Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
04 Mar 2022
COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Yu Meng, Chenyan Xiong, Payal Bajaj, Saurabh Tiwary, Paul N. Bennett, Jiawei Han, Xia Song
16 Feb 2021
Efficient Estimation of Word Representations in Vector Space
Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean
16 Jan 2013