LiteVLM: A Low-Latency Vision-Language Model Inference Pipeline for Resource-Constrained Environments

9 June 2025
Jin Huang, Yuchao Jin, Le An, Josh Park
    VLM
arXiv: 2506.07416 (abs · PDF · HTML)

Papers citing "LiteVLM: A Low-Latency Vision-Language Model Inference Pipeline for Resource-Constrained Environments"

2 / 2 papers shown
Efficient Onboard Vision-Language Inference in UAV-Enabled Low-Altitude Economy Networks via LLM-Enhanced Optimization
Yang Li, Ruichen Zhang, Yinqiu Liu, Guangyuan Liu, Zhu Han, Abbas Jamalipour, Xianbin Wang, Dong In Kim
11 Oct 2025
Nav-EE: Navigation-Guided Early Exiting for Efficient Vision-Language Models in Autonomous Driving
Haibo Hu, Lianming Huang, X. Wang, Yufei Cui, Shangyu Wu, Nan Guan, Chun Jason Xue
VLM
02 Oct 2025