Sequence to sequence pretraining for a less-resourced Slovenian language
Matej Ulčar, Marko Robnik-Šikonja
arXiv: 2207.13988 · 28 July 2022

Papers citing "Sequence to sequence pretraining for a less-resourced Slovenian language" (6 papers)

Revisiting non-English Text Simplification: A Unified Multilingual Benchmark
Michael Joseph Ryan, Tarek Naous, Wei Xu
25 May 2023

Unified Question Answering in Slovene
Katja Logar, Marko Robnik-Šikonja
16 Nov 2022

IT5: Text-to-text Pretraining for Italian Language Understanding and Generation
Gabriele Sarti, Malvina Nissim
07 Mar 2022

Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese
Zhuosheng Zhang, Hanqing Zhang, Keming Chen, Yuhang Guo, Jingyun Hua, Yulong Wang, Ming Zhou
13 Oct 2021

AraT5: Text-to-Text Transformers for Arabic Language Generation
El Moatez Billah Nagoudi, AbdelRahim Elmadany, Muhammad Abdul-Mageed
31 Aug 2021

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018