ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.


Can We Use Probing to Better Understand Fine-tuning and Knowledge Distillation of the BERT NLU?

27 January 2023
Jakub Hościlowicz
Marcin Sowanski
Piotr Czubowski
Artur Janicki

Papers citing "Can We Use Probing to Better Understand Fine-tuning and Knowledge Distillation of the BERT NLU?"

2 / 2 papers shown
Non-Linear Inference Time Intervention: Improving LLM Truthfulness
Jakub Hoscilowicz, Adam Wiacek, Jan Chojnacki, Adam Cieślak, Leszek Michon, Vitalii Urbanevych, Artur Janicki
27 Mar 2024
The Rediscovery Hypothesis: Language Models Need to Meet Linguistics
Vassilina Nikoulina, Maxat Tezekbayev, Nuradil Kozhakhmet, Madina Babazhanova, Matthias Gallé, Z. Assylbekov
02 Mar 2021