RobBERTje: a Distilled Dutch BERT Model

28 April 2022
Pieter Delobelle, Thomas Winters, Bettina Berendt
Papers citing "RobBERTje: a Distilled Dutch BERT Model"

3 of 3 papers shown

FairDistillation: Mitigating Stereotyping in Language Models
Pieter Delobelle, Bettina Berendt
10 Jul 2022

Optimal Subarchitecture Extraction For BERT
Adrian de Wynter, Daniel J. Perry
20 Oct 2020

What the [MASK]? Making Sense of Language-Specific BERT Models
Debora Nozza, Federico Bianchi, Dirk Hovy
05 Mar 2020