arXiv:2205.02340
Knowledge Distillation of Russian Language Models with Reduction of Vocabulary
4 May 2022
A. Kolesnikova, Yuri Kuratov, Vasily Konovalov, Mikhail Burtsev
Papers citing "Knowledge Distillation of Russian Language Models with Reduction of Vocabulary" (7 papers)
Knowledge Distillation of Domain-adapted LLMs for Question-Answering in Telecom
Rishika Sen, Sujoy Roychowdhury, Sumit Soman, H. G. Ranjani, Srikhetra Mohanty
28 Apr 2025
Minimizing PLM-Based Few-Shot Intent Detectors
Haode Zhang, Xiao-Ming Wu, Albert Y. S. Lam
13 Jul 2024
Stolen Subwords: Importance of Vocabularies for Machine Translation Model Stealing
Vilém Zouhar
29 Jan 2024
A Family of Pretrained Transformer Language Models for Russian
Dmitry Zmitrovich, Alexander Abramov, Andrey Kalmykov, Maria Tikhonova, Ekaterina Taktasheva, ..., Vitalii Kadulin, Sergey Markov, Tatiana Shavrina, Vladislav Mikhailov, Alena Fenogenova
19 Sep 2023
HAlf-MAsked Model for Named Entity Sentiment analysis
A. Kabaev, P. Podberezko, A. Kaznacheev, Sabina Abdullayeva
30 Aug 2023
Monolingual and Cross-Lingual Knowledge Transfer for Topic Classification
D. Karpov, Mikhail Burtsev
13 Jun 2023
SberQuAD -- Russian Reading Comprehension Dataset: Description and Analysis
Pavel Efimov, Andrey Chertok, Leonid Boytsov, Pavel Braslavski
20 Dec 2019