Learning Neural Sequence-to-Sequence Models from Weak Feedback with Bipolar Ramp Loss

6 July 2019 · arXiv:1907.03748
Laura Jehl, Carolin (Haas) Lawrence, Stefan Riezler

Papers citing "Learning Neural Sequence-to-Sequence Models from Weak Feedback with Bipolar Ramp Loss"

6 / 6 papers shown

Coarse-Tuning Models of Code with Reinforcement Learning Feedback
Abhinav C. P. Jain, Chima Adiole, Swarat Chaudhuri, Thomas W. Reps, Chris Jermaine (Rice University)
25 May 2023 · ALM

Modelling Latent Translations for Cross-Lingual Transfer
E. Ponti, Julia Kreutzer, Ivan Vulić, Siva Reddy
23 Jul 2021

Summary Level Training of Sentence Rewriting for Abstractive Summarization
Sanghwan Bae, Taeuk Kim, Jihoon Kim, Sang-goo Lee
19 Sep 2019

Improving a Neural Semantic Parser by Counterfactual Learning from Human Bandit Feedback
Carolin (Haas) Lawrence, Stefan Riezler
03 May 2018 · OffRL

Classical Structured Prediction Losses for Sequence to Sequence Learning
Sergey Edunov, Myle Ott, Michael Auli, David Grangier, Marc'Aurelio Ranzato
14 Nov 2017 · AIMat

Neural versus Phrase-Based Machine Translation Quality: a Case Study
L. Bentivogli, Arianna Bisazza, Mauro Cettolo, Marcello Federico
16 Aug 2016