Eliciting Knowledge from Pretrained Language Models for Prototypical Prompt Verbalizer

14 January 2022
Yinyi Wei, Tong Mo, Yong-jia Jiang, Weiping Li, Wen Zhao
VLM

Papers citing "Eliciting Knowledge from Pretrained Language Models for Prototypical Prompt Verbalizer"

4 / 4 papers shown
Rethinking Visual Prompt Learning as Masked Visual Token Modeling
Ning Liao, Bowen Shi, Xiaopeng Zhang, Min Cao, Junchi Yan, Qi Tian
VLM · 8 · 7 · 0 · 09 Mar 2023

WARP: Word-level Adversarial ReProgramming
Karen Hambardzumyan, Hrant Khachatrian, Jonathan May
AAML · 243 · 340 · 0 · 01 Jan 2021

Exploiting Cloze Questions for Few Shot Text Classification and Natural Language Inference
Timo Schick, Hinrich Schütze
248 · 1,382 · 0 · 21 Jan 2020

Efficient Estimation of Word Representations in Vector Space
Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean
3DV · 228 · 29,632 · 0 · 16 Jan 2013