ResearchTrend.AI
arXiv: 1606.01700
Gated Word-Character Recurrent Language Model

Conference on Empirical Methods in Natural Language Processing (EMNLP), 2016
6 June 2016
Yasumasa Miyamoto
Kyunghyun Cho

Papers citing "Gated Word-Character Recurrent Language Model"

37 / 37 papers shown
One Small Step for Generative AI, One Giant Leap for AGI: A Complete Survey on ChatGPT in AIGC Era
Chaoning Zhang
Chenshuang Zhang
Chenghao Li
Yu Qiao
Sheng Zheng
...
Sung-Ho Bae
Lik-Hang Lee
Pan Hui
In So Kweon
Choong Seon Hong
04 Apr 2023

An Overview on Language Models: Recent Developments and Outlook
APSIPA Transactions on Signal and Information Processing (TASIP), 2023
Chengwei Wei
Yun Cheng Wang
Bin Wang
C.-C. Jay Kuo
10 Mar 2023

LiteLSTM Architecture Based on Weights Sharing for Recurrent Neural Networks
International Journal of Computer Applications (IJCA), 2023
Nelly Elsayed
Zag ElSayed
Anthony Maida
12 Jan 2023

LiteLSTM Architecture for Deep Recurrent Neural Networks
International Symposium on Circuits and Systems (ISCAS), 2022
Nelly Elsayed
Zag ElSayed
Anthony Maida
27 Jan 2022

Recurrent Neural Network from Adder's Perspective: Carry-lookahead RNN
Neural Networks (NN), 2021
Haowei Jiang
Fei-wei Qin
Jin Cao
Yong Peng
Yanli Shao
22 Jun 2021

Zero-Shot Clinical Acronym Expansion via Latent Meaning Cells
Griffin Adams
Mert Ketenci
Shreyas Bhave
A. Perotte
Noémie Elhadad
29 Sep 2020

Restoring ancient text using deep learning: a case study on Greek epigraphy
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2019
Yannis Assael
Thea Sommerschield
J. Prag
14 Oct 2019

Subword Language Model for Query Auto-Completion
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2019
Gyuwan Kim
02 Sep 2019

An Unsupervised Character-Aware Neural Approach to Word and Context Representation Learning
International Conference on Artificial Neural Networks (ICANN), 2018
G. Marra
Andrea Zugarini
S. Melacci
Marco Maggini
19 Jul 2019

A Survey on Neural Network Language Models
Kun Jing
Jungang Xu
09 Jun 2019

Gating Mechanisms for Combining Character and Word-level Word Representations: An Empirical Study
Jorge A. Balazs
Y. Matsuo
11 Apr 2019

COCO_TS Dataset: Pixel-level Annotations Based on Weak Supervision for Scene Text Segmentation
Andrea Zugarini
S. Melacci
Monica Bianchini
Marco Maggini
01 Apr 2019

Effective Subword Segmentation for Text Comprehension
IEEE/ACM Transactions on Audio, Speech, and Language Processing (TASLP), 2018
Zhuosheng Zhang
Antonio Lieto
Kangwei Ling
Jiangtong Li
Z. Li
Shexia He
Guohong Fu
06 Nov 2018

Trellis Networks for Sequence Modeling
Shaojie Bai
J. Zico Kolter
V. Koltun
15 Oct 2018

Character-Aware Decoder for Translation into Morphologically Rich Languages
Adithya Renduchintala
Pamela Shapiro
Kevin Duh
Philipp Koehn
06 Sep 2018

Effective Character-augmented Word Embedding for Machine Reading Comprehension
Zhuosheng Zhang
Yafang Huang
Peng Fei Zhu
Hai Zhao
07 Aug 2018

Question-Aware Sentence Gating Networks for Question and Answering
Minjeong Kim
D. Park
Hyungjong Noh
Yeonsoo Lee
Jaegul Choo
20 Jul 2018

Subword-augmented Embedding for Cloze Reading Comprehension
Zhuosheng Zhang
Yafang Huang
Zhao Hai
24 Jun 2018

Character-based Neural Networks for Sentence Pair Modeling
Wuwei Lan
Wei Xu
21 May 2018

Numeracy for Language Models: Evaluating and Improving their Ability to Predict Numbers
Georgios P. Spithourakis
Sebastian Riedel
21 May 2018

Dynamic Meta-Embeddings for Improved Sentence Representations
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2018
Douwe Kiela
Changhan Wang
Dong Wang
21 Apr 2018

An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
Shaojie Bai
J. Zico Kolter
V. Koltun
04 Mar 2018

Slim Embedding Layers for Recurrent Neural Language Models
Zhongliang Li
Raymond Kulhanek
Shaojun Wang
Yunxin Zhao
Shuang Wu
27 Nov 2017

Syllable-level Neural Language Model for Agglutinative Language
Seunghak Yu
Nilesh Kulkarni
Haejun Lee
J. Kim
18 Aug 2017

Strawman: an Ensemble of Deep Bag-of-Ngrams for Sentiment Analysis
Dong Wang
26 Jul 2017

Do Convolutional Networks need to be Deep for Text Classification?
Hoa T. Le
Christophe Cerisara
Alexandre Denis
13 Jul 2017

An Embedded Deep Learning based Word Prediction
Seunghak Yu
Nilesh Kulkarni
Haejun Lee
J. Kim
06 Jul 2017

From Characters to Words to in Between: Do We Capture Morphology?
Clara Vania
Adam Lopez
26 Apr 2017

Character-Word LSTM Language Models
Lyan Verwimp
J. Pelemans
Hugo Van hamme
P. Wambacq
10 Apr 2017

SyntaxNet Models for the CoNLL 2017 Shared Task
Chris Alberti
D. Andor
Ivan Bogatyy
Michael Collins
D. Gillick
...
Mark Omernick
Slav Petrov
C. Thanapirom
Zora Tung
David J. Weiss
15 Mar 2017

Attending to Characters in Neural Sequence Labeling Models
Marek Rei
Gamal K. O. Crichton
S. Pyysalo
14 Nov 2016

Words or Characters? Fine-grained Gating for Reading Comprehension
Zhilin Yang
Bhuwan Dhingra
Ye Yuan
Junjie Hu
William W. Cohen
Ruslan Salakhutdinov
06 Nov 2016

A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks
Kazuma Hashimoto
Caiming Xiong
Yoshimasa Tsuruoka
R. Socher
05 Nov 2016

Nonsymbolic Text Representation
Hinrich Schütze
Heike Adel
Ehsaneddin Asgari
03 Oct 2016

Character-Level Language Modeling with Hierarchical Recurrent Neural Networks
Kyuyeon Hwang
Wonyong Sung
13 Sep 2016

Using the Output Embedding to Improve Language Models
Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2016
Ofir Press
Lior Wolf
20 Aug 2016

Higher Order Recurrent Neural Networks
Rohollah Soltani
Hui Jiang
30 Apr 2016