ResearchTrend.AI
ContraCLM: Contrastive Learning For Causal Language Model
arXiv:2210.01185 · 3 October 2022
Nihal Jain, Dejiao Zhang, Wasi Uddin Ahmad, Zijian Wang, Feng Nan, Xiaopeng Li, Ming Tan, Ramesh Nallapati, Baishakhi Ray, Parminder Bhatia, Xiaofei Ma, Bing Xiang
Papers citing "ContraCLM: Contrastive Learning For Causal Language Model"

11 / 11 papers shown
Layer Swapping for Zero-Shot Cross-Lingual Transfer in Large Language Models
Lucas Bandarkar, Benjamin Muller, Pritish Yuvraj, Rui Hou, Nayan Singhal, Hongjiang Lv, Bing-Quan Liu
KELM, LRM, MoMe · 35 · 3 · 0 · 02 Oct 2024

Predicting the Target Word of Game-playing Conversations using a Low-Rank Dialect Adapter for Decoder Models
Dipankar Srirag, Aditya Joshi, Jacob Eisenstein
44 · 1 · 0 · 31 Aug 2024

Heterogeneous Contrastive Learning for Foundation Models and Beyond
Lecheng Zheng, Baoyu Jing, Zihao Li, Hanghang Tong, Jingrui He
VLM · 26 · 19 · 0 · 30 Mar 2024

CoNT: Contrastive Neural Text Generation
Chen An, Jiangtao Feng, Kai Lv, Lingpeng Kong, Xipeng Qiu, Xuanjing Huang
68 · 23 · 0 · 29 May 2022

Virtual Augmentation Supported Contrastive Learning of Sentence Representations
Dejiao Zhang, Wei Xiao, Henghui Zhu, Xiaofei Ma, Andrew O. Arnold
SSL · 44 · 29 · 0 · 16 Oct 2021

ESimCSE: Enhanced Sample Building Method for Contrastive Learning of Unsupervised Sentence Embedding
Xing Wu, Chaochen Gao, Liangjun Zang, Jizhong Han, Zhongyuan Wang, Songlin Hu
SSL, AILaw · 31 · 129 · 0 · 09 Sep 2021

CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation
Yue Wang, Weishi Wang, Shafiq R. Joty, S. Hoi
210 · 1,489 · 0 · 02 Sep 2021

Deduplicating Training Data Makes Language Models Better
Katherine Lee, Daphne Ippolito, A. Nystrom, Chiyuan Zhang, Douglas Eck, Chris Callison-Burch, Nicholas Carlini
SyDa · 237 · 590 · 0 · 14 Jul 2021

WIT: Wikipedia-based Image Text Dataset for Multimodal Multilingual Machine Learning
Krishna Srinivasan, K. Raman, Jiecao Chen, Michael Bendersky, Marc Najork
VLM · 197 · 310 · 0 · 02 Mar 2021

COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Yu Meng, Chenyan Xiong, Payal Bajaj, Saurabh Tiwary, Paul N. Bennett, Jiawei Han, Xia Song
119 · 202 · 0 · 16 Feb 2021

CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation
Shuai Lu, Daya Guo, Shuo Ren, Junjie Huang, Alexey Svyatkovskiy, ..., Nan Duan, Neel Sundaresan, Shao Kun Deng, Shengyu Fu, Shujie Liu
ELM · 196 · 853 · 0 · 09 Feb 2021