arXiv: 2002.01145
Syntactically Look-Ahead Attention Network for Sentence Compression
AAAI Conference on Artificial Intelligence (AAAI), 2020
4 February 2020
Hidetaka Kamigaito, Manabu Okumura
Papers citing "Syntactically Look-Ahead Attention Network for Sentence Compression" (6 papers)
Considering Length Diversity in Retrieval-Augmented Summarization
North American Chapter of the Association for Computational Linguistics (NAACL), 2025
Juseon-Do, Jaesung Hwang, Jingun Kwon, Hidetaka Kamigaito, Manabu Okumura
12 Mar 2025
InstructCMP: Length Control in Sentence Compression through Instruction-based Large Language Models
Juseon-Do, Jingun Kwon, Hidetaka Kamigaito, Manabu Okumura
16 Jun 2024
From Lengthy to Lucid: A Systematic Literature Review on NLP Techniques for Taming Long Sentences
Tatiana Passali, Efstathios Chatzikyriakidis, Stelios Andreadis, Thanos G. Stavropoulos, Anastasia Matonaki, A. Fachantidis, Grigorios Tsoumakas
08 Dec 2023
Revision for Concision: A Constrained Paraphrase Generation Task
Wenchuan Mu, Kwanin Lim
25 Oct 2022
Contextualized Semantic Distance between Highly Overlapped Texts
Letian Peng, Z. Li, Hai Zhao
04 Oct 2021
Evaluation Discrepancy Discovery: A Sentence Compression Case-study
Yevgeniy Puzikov
22 Jan 2021