Reasoning Grasping via Multimodal Large Language Model

9 February 2024
Shiyu Jin, Jinxuan Xu, Yutian Lei, Liangjun Zhang
LRM

Papers citing "Reasoning Grasping via Multimodal Large Language Model"

5 / 5 papers shown

AffordGrasp: In-Context Affordance Reasoning for Open-Vocabulary Task-Oriented Grasping in Clutter
Yingbo Tang, S. Zhang, Xiaoshuai Hao, Pengwei Wang, Jianlong Wu, Z. Wang, Shanghang Zhang
02 Mar 2025

A Parameter-Efficient Tuning Framework for Language-guided Object Grounding and Robot Grasping
Houjian Yu, Mingen Li, Alireza Rezazadeh, Yang Yang, Changhyun Choi
28 Sep 2024

HiFi-CS: Towards Open Vocabulary Visual Grounding For Robotic Grasping Using Vision-Language Models
V. Bhat, P. Krishnamurthy, Ramesh Karri, Farshad Khorrami
16 Sep 2024

Human-in-the-loop Robotic Grasping using BERT Scene Representation
Yaoxian Song, Penglei Sun, Pengfei Fang, Linyi Yang, Yanghua Xiao, Yue Zhang
28 Sep 2022

ProgPrompt: Generating Situated Robot Task Plans using Large Language Models
Ishika Singh, Valts Blukis, Arsalan Mousavian, Ankit Goyal, Danfei Xu, Jonathan Tremblay, D. Fox, Jesse Thomason, Animesh Garg
LM&Ro, LLMAG
22 Sep 2022