Contextual BERT: Conditioning the Language Model Using a Global State

Timo I. Denk, Ana Peleteiro Ramallo
arXiv:2010.15778, 29 October 2020
Papers citing "Contextual BERT: Conditioning the Language Model Using a Global State"

3 / 3 papers shown
  1. CUE Vectors: Modular Training of Language Models Conditioned on Diverse Contextual Signals
     Scott Novotney, Sreeparna Mukherjee, Zeeshan Ahmed, A. Stolcke
     16 Mar 2022

  2. KARL-Trans-NER: Knowledge Aware Representation Learning for Named Entity Recognition using Transformers
     Avi Chawla, Nidhi Mulay, Vikas Bishnoi, Gaurav Dhama
     30 Nov 2021

  3. Big Bird: Transformers for Longer Sequences
     Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Joshua Ainslie, Chris Alberti, ..., Philip Pham, Anirudh Ravula, Qifan Wang, Li Yang, Amr Ahmed
     28 Jul 2020