Digital Twin Catalog: A Large-Scale Photorealistic 3D Object Digital Twin Dataset

We introduce Digital Twin Catalog (DTC), a new large-scale photorealistic 3D object digital twin dataset. A digital twin of a 3D object is a highly detailed, virtually indistinguishable representation of a physical object, accurately capturing its shape, appearance, physical properties, and other attributes. Recent advances in neural-based 3D reconstruction and inverse rendering have significantly improved the quality of 3D object reconstruction. Despite these advancements, there remains a lack of a large-scale, digital-twin-quality, real-world dataset and benchmark that can quantitatively assess and compare the performance of different reconstruction methods, as well as improve reconstruction quality through training or fine-tuning. Moreover, to democratize 3D digital twin creation, it is essential to integrate creation techniques with next-generation egocentric computing platforms, such as AR glasses. Currently, there is no dataset available to evaluate 3D object reconstruction using egocentrically captured images. To address these gaps, the DTC dataset features 2,000 scanned digital-twin-quality 3D objects, along with image sequences captured under different lighting conditions using DSLR cameras and egocentric AR glasses. This dataset establishes the first comprehensive real-world evaluation benchmark for 3D digital twin creation tasks, offering a robust foundation for comparing and improving existing reconstruction methods. The DTC dataset is already released at this https URL, and we will also make the baseline evaluations open-source.
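As a rough illustration of the kind of quantitative geometric evaluation such a benchmark enables, the sketch below computes a symmetric Chamfer distance between a ground-truth scan and a reconstruction, both represented as sampled point clouds. This is a generic metric commonly used for comparing reconstructed geometry, not the DTC evaluation protocol; the point-cloud names and synthetic data are purely illustrative.

```python
# Minimal sketch: symmetric Chamfer distance between two point clouds.
# Assumes both meshes have already been sampled to (N, 3) point arrays;
# the data below is synthetic and stands in for DTC ground truth / reconstructions.
import numpy as np
from scipy.spatial import cKDTree


def chamfer_distance(points_a: np.ndarray, points_b: np.ndarray) -> float:
    """Symmetric Chamfer distance between two (N, 3) point clouds."""
    tree_a = cKDTree(points_a)
    tree_b = cKDTree(points_b)
    # Nearest-neighbor distances in both directions.
    d_ab, _ = tree_b.query(points_a, k=1)
    d_ba, _ = tree_a.query(points_b, k=1)
    return float(np.mean(d_ab ** 2) + np.mean(d_ba ** 2))


if __name__ == "__main__":
    # Hypothetical stand-ins: a "ground-truth" cloud and a slightly perturbed
    # "reconstruction" of it.
    rng = np.random.default_rng(0)
    gt_points = rng.uniform(-1.0, 1.0, size=(10_000, 3))
    recon_points = gt_points + rng.normal(scale=0.01, size=gt_points.shape)
    print(f"Chamfer distance: {chamfer_distance(gt_points, recon_points):.6f}")
```

In practice, benchmarks of this kind typically pair a geometric metric like the above with image-space metrics (e.g., PSNR or SSIM on held-out views) to also assess appearance; the exact metrics used for DTC are defined by the paper, not by this sketch.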
@article{dong2025_2504.08541,
  title   = {Digital Twin Catalog: A Large-Scale Photorealistic 3D Object Digital Twin Dataset},
  author  = {Zhao Dong and Ka Chen and Zhaoyang Lv and Hong-Xing Yu and Yunzhi Zhang and Cheng Zhang and Yufeng Zhu and Stephen Tian and Zhengqin Li and Geordie Moffatt and Sean Christofferson and James Fort and Xiaqing Pan and Mingfei Yan and Jiajun Wu and Carl Yuheng Ren and Richard Newcombe},
  journal = {arXiv preprint arXiv:2504.08541},
  year    = {2025}
}