Strengthening Structural Inductive Biases by Pre-training to Perform Syntactic Transformations

5 July 2024
Matthias Lindemann, Alexander Koller, Ivan Titov
Communities: AI4CE, NAI
ArXiv · PDF · HTML

Papers citing "Strengthening Structural Inductive Biases by Pre-training to Perform Syntactic Transformations"

5 / 5 papers shown
Title | Authors | Community | Metrics | Date
How Does Code Pretraining Affect Language Model Task Performance? | Jackson Petty, Sjoerd van Steenkiste, Tal Linzen | - | 60 · 8 · 0 | 06 Sep 2024
SLOG: A Structural Generalization Benchmark for Semantic Parsing | Bingzhi Li, L. Donatelli, Alexander Koller, Tal Linzen, Yuekun Yao, Najoung Kim | - | 27 · 14 · 0 | 23 Oct 2023
Learning Algebraic Recombination for Compositional Generalization | Chenyao Liu, Shengnan An, Zeqi Lin, Qian Liu, Bei Chen, Jian-Guang Lou, Lijie Wen, Nanning Zheng, Dongmei Zhang | CoGe | 186 · 36 · 0 | 14 Jul 2021
Trankit: A Light-Weight Transformer-based Toolkit for Multilingual Natural Language Processing | Minh Nguyen, Viet Dac Lai, Amir Pouran Ben Veyseh, Thien Huu Nguyen | - | 44 · 132 · 0 | 09 Jan 2021
Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks | Chelsea Finn, Pieter Abbeel, Sergey Levine | OOD | 260 · 11,677 · 0 | 09 Mar 2017