Phys4DGen: Physics-Compliant 4D Generation with Multi-Material Composition Perception

4D content generation aims to create dynamically evolving 3D content conditioned on inputs such as images or 3D representations. Current approaches typically incorporate physical priors to animate 3D representations, but they suffer from significant limitations: they require users without physics expertise to manually specify material properties, and they struggle to handle objects composed of multiple materials. To address these challenges, we propose Phys4DGen, a novel 4D generation framework that integrates multi-material composition perception with physical simulation. The framework achieves automated, physically plausible 4D generation through three innovative modules: first, the 3D Material Grouping module partitions heterogeneous material regions on 3D representation surfaces via semantic segmentation; second, the Internal Physical Structure Discovery module constructs the mechanical structure of object interiors; finally, we distill physical prior knowledge from multimodal large language models to enable rapid, automatic identification of material properties for both objects' surfaces and interiors. Experiments on both synthetic and real-world datasets demonstrate that Phys4DGen generates high-fidelity, physically realistic 4D content in open-world scenarios, significantly outperforming state-of-the-art methods.
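The three modules above can be read as a sequential pipeline: segment surface materials, infer interior structure, then fill in physical properties from a distilled prior. The following is a minimal illustrative sketch of that flow; all class, function, and parameter names are assumptions for exposition, not the authors' actual API, and the MLLM-distilled prior is stood in for by a simple lookup table.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the three-stage pipeline described in the abstract.
# Names and data structures are illustrative assumptions, not Phys4DGen's code.

@dataclass
class MaterialGroup:
    name: str                                        # semantic label for a surface region
    properties: dict = field(default_factory=dict)   # e.g. Young's modulus, density

def group_surface_materials(semantic_labels):
    """Stage 1 (3D Material Grouping): partition surface regions into
    heterogeneous material groups from semantic segmentation labels."""
    return [MaterialGroup(name=lbl) for lbl in sorted(set(semantic_labels))]

def discover_internal_structure(groups):
    """Stage 2 (Internal Physical Structure Discovery): assign a placeholder
    interior mechanical structure per surface group (stand-in for the real module)."""
    return {g.name: "solid" for g in groups}

def identify_properties(groups, prior):
    """Stage 3: fill material properties from a prior table, standing in for
    knowledge distilled from a multimodal large language model."""
    for g in groups:
        g.properties.update(prior.get(g.name, {}))
    return groups

# Toy usage: two distinct materials detected on an object's surface.
labels = ["cloth", "metal", "cloth"]
groups = group_surface_materials(labels)
interior = discover_internal_structure(groups)
groups = identify_properties(
    groups,
    prior={"cloth": {"youngs_modulus_pa": 1e5},
           "metal": {"youngs_modulus_pa": 2e11}},
)
```

The key design point the abstract emphasizes is that stage 3 is automatic: the user never types in material parameters, because the prior supplies them per detected material group.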
@article{lin2025_2411.16800,
  title={Phys4DGen: Physics-Compliant 4D Generation with Multi-Material Composition Perception},
  author={Jiajing Lin and Zhenzhong Wang and Dejun Xu and Shu Jiang and YunPeng Gong and Min Jiang},
  journal={arXiv preprint arXiv:2411.16800},
  year={2025}
}