ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

AttentionMix: Data augmentation method that relies on BERT attention mechanism

20 September 2023
Dominik Lewy
Jacek Mańdziuk
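The abstract is not reproduced on this page, but the title describes a mixup-style data augmentation guided by BERT's attention. A minimal, hypothetical sketch of attention-guided mixing of two examples (all function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def attention_guided_mix(emb_a, emb_b, attn_a, attn_b, y_a, y_b):
    """Hypothetical sketch: mix two sequences' token embeddings,
    weighting each token position by its normalized attention score,
    then mix the labels by the average contribution of sequence A."""
    # Per-token mixing weight in [0, 1], derived from attention scores.
    w = attn_a / (attn_a + attn_b + 1e-9)               # shape: (seq_len,)
    mixed = w[:, None] * emb_a + (1.0 - w[:, None]) * emb_b
    lam = float(w.mean())                               # label mixing ratio
    y_mixed = lam * y_a + (1.0 - lam) * y_b
    return mixed, y_mixed

# Toy usage with random embeddings and attention scores.
rng = np.random.default_rng(0)
emb_a = rng.normal(size=(8, 16))
emb_b = rng.normal(size=(8, 16))
attn_a = rng.random(8)
attn_b = rng.random(8)
y_a = np.array([1.0, 0.0])   # one-hot label of example A
y_b = np.array([0.0, 1.0])   # one-hot label of example B
mixed, y = attention_guided_mix(emb_a, emb_b, attn_a, attn_b, y_a, y_b)
```

The key idea this sketch illustrates is that the mixing ratio is not sampled uniformly (as in vanilla mixup) but driven by attention, so tokens the model attends to more strongly dominate the mixed example.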

Papers citing "AttentionMix: Data augmentation method that relies on BERT attention mechanism"

3 papers:
Enhancing elusive clues in knowledge learning by contrasting attention of language models
Jian Gao, Xiao Zhang, Ji Wu, Miao Li
26 Sep 2024
Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism
Chenqi Guo, Shiwei Zhong, Xiaofeng Liu, Qianli Feng, Yinglong Ma
30 Apr 2024
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018