ResearchTrend.AI

arXiv: 2010.08983 · Cited By
Towards Interpreting BERT for Reading Comprehension Based QA

18 October 2020
Sahana Ramnath, Preksha Nema, Deep Sahni, Mitesh M. Khapra

Papers citing "Towards Interpreting BERT for Reading Comprehension Based QA"

2 / 2 papers shown
MoEfication: Transformer Feed-forward Layers are Mixtures of Experts
Zhengyan Zhang, Yankai Lin, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
MoE · 13 · 115 · 0 · 05 Oct 2021

What all do audio transformer models hear? Probing Acoustic Representations for Language Delivery and its Structure
Jui Shah, Yaman Kumar Singla, Changyou Chen, R. Shah
17 · 81 · 0 · 02 Jan 2021