
Arithmetic-Based Pretraining -- Improving Numeracy of Pretrained Language Models

arXiv:2205.06733 · 13 May 2022
Authors: Dominic Petrak, N. Moosavi, Iryna Gurevych
Community: AIMat

Papers citing "Arithmetic-Based Pretraining -- Improving Numeracy of Pretrained Language Models"

2 / 2 papers shown
Title: MathBERT: A Pre-Trained Model for Mathematical Formula Understanding
Authors: Shuai Peng, Ke Yuan, Liangcai Gao, Zhi Tang
Community: AIMat
Published: 02 May 2021

Title: Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Authors: Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
Community: AIMat
Published: 26 Sep 2016