Matrix-Game 2.0: An Open-Source, Real-Time, and Streaming Interactive World Model

18 August 2025
Xianglong He
Chunli Peng
Zexiang Liu
Boyang Wang
Yifan Zhang
Qi Cui
Fei Kang
Biao Jiang
Mengyin An
Y. Ren
Baixin Xu
Hao Guo
Kaixiong Gong
Cyrus Wu
Wei Li
Xuchen Song
Teli Ma
Eric Li
Yahui Zhou
Main: 15 pages · 19 figures · 4 tables · Bibliography: 4 pages
Abstract

Recent advances in interactive video generation have demonstrated diffusion models' potential as world models by capturing complex physical dynamics and interactive behaviors. However, existing interactive world models depend on bidirectional attention and lengthy inference steps, severely limiting real-time performance. Consequently, they struggle to simulate real-world dynamics, where outcomes must update instantaneously based on historical context and current actions. To address this, we present Matrix-Game 2.0, an interactive world model that generates long videos on the fly via few-step autoregressive diffusion. Our framework consists of three key components: (1) a scalable data production pipeline for Unreal Engine and GTA5 environments that efficiently produces massive amounts (about 1200 hours) of video data with diverse interaction annotations; (2) an action injection module that enables frame-level mouse and keyboard inputs as interactive conditions; (3) few-step distillation based on the causal architecture for real-time, streaming video generation. Matrix-Game 2.0 can generate high-quality minute-level videos across diverse scenes at an ultra-fast speed of 25 FPS. We open-source our model weights and codebase to advance research in interactive world modeling.
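The action injection idea lends itself to a short illustration. The sketch below is hypothetical, not the released Matrix-Game 2.0 implementation: the module name `ActionInjection`, the multi-hot keyboard encoding, the 2-D mouse-delta input, and the additive fusion are all assumptions made for illustration of how frame-level mouse and keyboard inputs could be embedded and injected as per-frame conditions.

```python
# Minimal sketch of frame-level action conditioning. NOT the authors'
# implementation: module/tensor names and the additive fusion are assumptions.
import torch
import torch.nn as nn

class ActionInjection(nn.Module):
    """Embeds per-frame keyboard/mouse actions and adds them to frame latents."""
    def __init__(self, num_keys: int, hidden_dim: int):
        super().__init__()
        self.key_proj = nn.Linear(num_keys, hidden_dim)  # multi-hot key state
        self.mouse_proj = nn.Linear(2, hidden_dim)       # (dx, dy) mouse delta
        self.fuse = nn.Sequential(nn.SiLU(), nn.Linear(hidden_dim, hidden_dim))

    def forward(self, latents, keys, mouse):
        # latents: (B, T, D) per-frame latent features (pooled for simplicity)
        # keys:    (B, T, num_keys) multi-hot keyboard state per frame
        # mouse:   (B, T, 2) mouse movement per frame
        cond = self.fuse(self.key_proj(keys) + self.mouse_proj(mouse))
        return latents + cond  # additive injection, one condition per frame

# Usage: condition 16 frames of latents on synthetic actions.
B, T, D, K = 1, 16, 256, 8
inject = ActionInjection(num_keys=K, hidden_dim=D)
out = inject(torch.randn(B, T, D),
             torch.randint(0, 2, (B, T, K)).float(),
             torch.randn(B, T, 2))
print(out.shape)  # torch.Size([1, 16, 256])
```

Per-frame additive conditioning keeps the injection causal in spirit: each frame's latent sees only its own action, which matches the streaming, frame-level control described in the abstract.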
