Pre-training Polish Transformer-based Language Models at Scale
Sławomir Dadas, Michał Perełkiewicz, Rafał Poświata
7 June 2020 · arXiv: 2006.04229
Papers citing "Pre-training Polish Transformer-based Language Models at Scale" (5 papers):
Punctuation Prediction for Polish Texts using Transformers
Jakub Pokrywka · 06 Oct 2024
Machine Generated Text: A Comprehensive Survey of Threat Models and Detection Methods
Evan Crothers, Nathalie Japkowicz, H. Viktor · 13 Oct 2022
SlovakBERT: Slovak Masked Language Model
Matúš Pikuliak, Štefan Grivalský, Martin Konôpka, Miroslav Blšták, Martin Tamajka, Viktor Bachratý, Marián Šimko, Pavol Balážik, Michal Trnka, Filip Uhlárik · 30 Sep 2021
HerBERT: Efficiently Pretrained Transformer-based Language Model for Polish
Robert Mroczkowski, Piotr Rybak, Alina Wróblewska, Ireneusz Gawlik · 04 May 2021
PhoBERT: Pre-trained language models for Vietnamese
Dat Quoc Nguyen, A. Nguyen · 02 Mar 2020