EgoLM: Multi-Modal Language Model of Egocentric Motions

26 September 2024 · arXiv:2409.18127
Fangzhou Hong, Vladimir Guzov, Hyo Jin Kim, Yuting Ye, Richard A. Newcombe, Ziwei Liu, Lingni Ma

Papers citing "EgoLM: Multi-Modal Language Model of Egocentric Motions" (1 of 1 shown)

Ego4o: Egocentric Human Motion Capture and Understanding from Multi-Modal Input
Jian Wang, Rishabh Dabral, D. Luvizon, Zhe Cao, Lingjie Liu, Thabo Beeler, Christian Theobalt
11 Apr 2025