ResearchTrend.AI

Attention is Not Always What You Need: Towards Efficient Classification of Domain-Specific Text

arXiv:2303.17786 · 31 March 2023
Yasmen Wahba, N. Madhavji, John Steinbacher

Papers citing "Attention is Not Always What You Need: Towards Efficient Classification of Domain-Specific Text"

2 / 2 papers shown

1. Let's Play Mono-Poly: BERT Can Reveal Words' Polysemy Level and Partitionability into Senses
   Aina Garí Soler, Marianna Apidianaki
   MILM · 193 · 67 · 0 · 29 Apr 2021

2. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   ELM · 294 · 6,927 · 0 · 20 Apr 2018