EgoEdit: Dataset, Real-Time Streaming Model, and Benchmark for Egocentric Video Editing

arXiv:2512.06065 · 5 December 2025
Runjia Li
Moayed Haji-Ali
Ashkan Mirzaei
Chaoyang Wang
Arpit Sahni
Ivan Skorokhodov
Aliaksandr Siarohin
Tomas Jakab
Junlin Han
Sergey Tulyakov
Philip Torr
Willi Menapace
Main: 8 pages · Bibliography: 4 pages · Appendix: 10 pages · 13 figures · 7 tables
Abstract

We study instruction-guided editing of egocentric videos for interactive AR applications. While recent AI video editors perform well on third-person footage, egocentric views present unique challenges, including rapid egomotion and frequent hand-object interactions, that create a significant domain gap. Moreover, existing offline editing pipelines suffer from high latency, limiting real-time interaction. To address these issues, we present a complete ecosystem for egocentric video editing. First, we construct EgoEditData, a manually curated dataset designed specifically for egocentric editing scenarios, featuring rich hand-object interactions while explicitly preserving hands. Second, we develop EgoEdit, an instruction-following egocentric video editor that supports real-time streaming inference on a single GPU. Finally, we introduce EgoEditBench, an evaluation suite targeting instruction faithfulness, hand and interaction preservation, and temporal stability under egomotion. Across both egocentric and general editing tasks, EgoEdit produces temporally stable, instruction-faithful results with interactive latency. It achieves clear gains on egocentric editing benchmarks, where existing methods struggle, while maintaining performance comparable to the strongest baselines on general editing tasks. EgoEditData and EgoEditBench will be made public for the research community. See our website at this https URL.
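To picture what "real-time streaming inference" means in this setting, the sketch below shows a generic causal, chunk-wise editing loop. It is purely illustrative, not the authors' implementation: `edit_chunk`, `CHUNK_SIZE`, and `CONTEXT_FRAMES` are assumed placeholder names. Incoming frames are buffered into short chunks, and each chunk is edited conditioned only on already-edited past frames, which is what bounds per-chunk latency and enables interactive use.

```python
# Illustrative sketch only: edit_chunk stands in for a streaming video
# editor like the one the paper describes; it is NOT the authors' API.
from collections import deque
import time

CHUNK_SIZE = 4       # hypothetical: frames edited together per step
CONTEXT_FRAMES = 16  # hypothetical: past edited frames kept as causal context

def edit_chunk(chunk, context, instruction):
    """Placeholder for a causal editing model: returns edited frames
    conditioned on the text instruction and past (already-edited) context."""
    return [f"edited({frame})" for frame in chunk]

def stream_edit(frame_source, instruction):
    """Consume frames as they arrive and yield edited frames chunk by chunk."""
    context = deque(maxlen=CONTEXT_FRAMES)
    buffer = []
    for frame in frame_source:
        buffer.append(frame)
        if len(buffer) == CHUNK_SIZE:
            start = time.perf_counter()
            edited = edit_chunk(buffer, list(context), instruction)
            latency_ms = (time.perf_counter() - start) * 1000
            context.extend(edited)  # causal: only past frames are reused
            buffer.clear()
            for out in edited:
                yield out, latency_ms

if __name__ == "__main__":
    frames = (f"frame_{i}" for i in range(12))
    for out, ms in stream_edit(frames, "turn the mug into glass"):
        print(f"{out}  ({ms:.2f} ms/chunk)")
```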
