ResearchTrend.AI

Understanding the Capabilities of Large Language Models for Automated Planning
arXiv: 2305.16151
25 May 2023
Vishal Pallagani, Bharath Muppasani, K. Murugesan, F. Rossi, Biplav Srivastava, L. Horesh, F. Fabiano, Andrea Loreggia
Topics: LLMAG, ELM

Papers citing "Understanding the Capabilities of Large Language Models for Automated Planning" (6 of 6 shown)

1. Neurosymbolic AI for Enhancing Instructability in Generative AI
   Amit P. Sheth, Vishal Pallagani, Kaushik Roy
   LLMAG · 21 / 2 / 0 · 26 Jul 2024

2. Unlocking Large Language Model's Planning Capabilities with Maximum Diversity Fine-tuning
   Wenjun Li, Changyu Chen, Pradeep Varakantham
   40 / 2 / 0 · 15 Jun 2024

3. Open-Endedness is Essential for Artificial Superhuman Intelligence
   Edward Hughes, Michael Dennis, Jack Parker-Holder, Feryal M. P. Behbahani, Aditi Mavalankar, Yuge Shi, Tom Schaul, Tim Rocktaschel
   LRM · 32 / 18 / 0 · 06 Jun 2024

4. ProgPrompt: Generating Situated Robot Task Plans using Large Language Models
   Ishika Singh, Valts Blukis, Arsalan Mousavian, Ankit Goyal, Danfei Xu, Jonathan Tremblay, D. Fox, Jesse Thomason, Animesh Garg
   LM&Ro, LLMAG · 112 / 619 / 0 · 22 Sep 2022

5. Training language models to follow instructions with human feedback
   Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
   OSLM, ALM · 303 / 11,881 / 0 · 04 Mar 2022

6. CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation
   Yue Wang, Weishi Wang, Shafiq R. Joty, S. Hoi
   210 / 1,485 / 0 · 02 Sep 2021