
Is BERT a Cross-Disciplinary Knowledge Learner? A Surprising Finding of Pre-trained Models' Transferability

Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
12 March 2021
Wei-Tsung Kao
Hung-yi Lee

Papers citing "Is BERT a Cross-Disciplinary Knowledge Learner? A Surprising Finding of Pre-trained Models' Transferability"

5 papers shown
Pre-trained Language Model and Knowledge Distillation for Lightweight Sequential Recommendation
Li Li
Mingyue Cheng
Zhiding Liu
Hao Zhang
Qi Liu
Enhong Chen
23 Sep 2024
Learning Transferable Time Series Classifier with Cross-Domain Pre-training from Language Model
Web Search and Data Mining (WSDM), 2024
Mingyue Cheng
Xiaoyu Tao
Qi Liu
Hao Zhang
Yiheng Chen
Chenyi Lei
19 Mar 2024
T5lephone: Bridging Speech and Text Self-supervised Models for Spoken Language Understanding via Phoneme level T5
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2022
Chan-Jan Hsu
Ho-Lam Chung
Hung-yi Lee
Yu Tsao
01 Nov 2022
Linguistically inspired roadmap for building biologically reliable protein language models
Nature Machine Intelligence (Nat. Mach. Intell.), 2022
Mai Ha Vu
Rahmad Akbar
Philippe A. Robert
B. Swiatczak
Victor Greiff
G. K. Sandve
Dag Trygve Tryslew Haug
03 Jul 2022
DUAL: Discrete Spoken Unit Adaptive Learning for Textless Spoken Question Answering
Interspeech, 2022
Guan-Ting Lin
Yung-Sung Chuang
Ho-Lam Chung
Shu-Wen Yang
Hsuan-Jui Chen
Shuyan Dong
Shang-Wen Li
Abdel-rahman Mohamed
Hung-yi Lee
Lin-Shan Lee
09 Mar 2022