NeurIPS 2025 Spotlight | PhysX-3D: A 3D Asset Generation Paradigm for the Real Physical World
机器之心· 2025-10-11 08:06
Core Insights
- The article presents PhysXNet, the first systematically annotated 3D dataset based on physical properties, addressing the gap between virtual 3D assets and real-world physics [6][9][27]
- It introduces PhysXGen, a novel framework for generating 3D assets with physical attributes, improving the realism and applicability of 3D models across application domains [9][18][27]

Dataset Overview
- PhysXNet includes over 26,000 annotated 3D objects with detailed physical properties, while the extended version, PhysXNet-XL, contains over 6 million programmatically generated 3D objects [9][10][16]
- The dataset covers five core dimensions: physical scale, materials, affordance, kinematic information, and textual descriptions, providing a comprehensive resource for 3D modeling (a schematic record layout is sketched after this summary) [6][9][27]

Annotation Process
- A human-in-the-loop annotation framework was developed to efficiently collect and label physical information while ensuring data quality [11][13]
- The annotation process involves two main stages, initial data collection and determination of kinematic parameters, and employs vision-language models such as GPT-4o [11][13]

Generation Methodology
- PhysXGen integrates physical attributes with geometric structure and appearance, optimizing a dual objective to generate physically plausible 3D assets [18][27]
- The framework shows substantial improvements in generating physical properties compared to existing methods, with relative gains across all evaluated dimensions [23][24]

Experimental Results
- The evaluation of PhysXGen shows gains in both geometric quality and physical-property accuracy, outperforming baseline methods on multiple metrics [20][21][23]
- The results indicate relative improvements of 24% in physical scale, 64% in materials, 28% in kinematic parameters, and 72% in affordance over the baseline approach [23][24]

Conclusion
- The article emphasizes the importance of bridging the gap between 3D assets and real-world physics, highlighting the potential impact of PhysXNet and PhysXGen on fields such as embodied AI, robotics, and 3D vision [27]
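To make the five annotation dimensions above more concrete, the sketch below shows one plausible per-object record for a PhysXNet-style dataset. The field names, types, and units are illustrative assumptions, not the dataset's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PartKinematics:
    """Hypothetical kinematic annotation for one articulated part."""
    joint_type: str            # e.g. "revolute", "prismatic", "fixed"
    axis: List[float]          # joint axis direction in the object frame
    origin: List[float]        # pivot point in the object frame
    motion_range: List[float]  # [min, max] in radians or meters

@dataclass
class PhysXRecord:
    """Illustrative per-object record covering the five annotated dimensions."""
    object_id: str
    scale_m: List[float]                 # absolute physical size (x, y, z) in meters
    materials: List[str]                 # per-part material labels, e.g. "steel", "plastic"
    affordances: List[str]               # e.g. "graspable", "openable", "sittable"
    kinematics: List[PartKinematics] = field(default_factory=list)
    description: Optional[str] = None    # free-text functional description

# Example usage with made-up values
record = PhysXRecord(
    object_id="cabinet_0031",
    scale_m=[0.6, 0.45, 1.2],
    materials=["wood", "metal"],
    affordances=["openable", "storage"],
    kinematics=[PartKinematics("revolute", [0, 0, 1], [0.3, 0.0, 0.0], [0.0, 1.57])],
    description="A two-door wooden cabinet with metal hinges.",
)
```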
3D Generation Fills Its Physics Gap: The First Systematically Annotated Physical 3D Dataset Is Released, Along with an End-to-End Framework
量子位· 2025-07-23 04:10
Core Viewpoint
- The article discusses the introduction of PhysXNet, the first systematically annotated physical-property 3D dataset, which aims to bridge the gap between virtual 3D generation and physical realism [1][3].

Group 1: Introduction of PhysXNet
- PhysXNet contains over 26,000 richly annotated 3D objects, covering five core dimensions: physical scale, materials, affordance, kinematic information, and textual descriptions [3][11].
- An extended version, PhysXNet-XL, includes over 6 million programmatically generated 3D objects with physical annotations [12].

Group 2: Current Research Landscape
- Existing 3D generation methods focus primarily on geometric structure and texture, largely neglecting modeling grounded in physical properties [2][8].
- Demand for physical modeling, understanding, and reasoning in 3D space is growing, which calls for a comprehensive physics-based 3D object modeling system [8][9].

Group 3: Data Annotation Process
- The team designed a human-in-the-loop annotation process to efficiently collect and annotate physical information (a schematic two-phase flow is sketched after this summary) [16][19].
- The annotation framework consists of two main phases: initial data collection and determination of kinematic parameters [19].

Group 4: Generation Methodology
- PhysXGen is introduced as a novel framework for generating 3D assets with physical properties, leveraging pre-trained 3D priors to achieve efficient training and good generalization [13][26].
- The method integrates basic physical properties synchronously during generation, optimizing the structural branches for a dual objective (a minimal joint-loss sketch follows below) [29][30].

Group 5: Experimental Evaluation
- The team conducted qualitative and quantitative evaluations, comparing against a baseline that uses a separate structure to predict physical properties [33][34].
- PhysXGen delivered significant improvements in generating physical attributes, with relative gains of 24% (physical scale), 64% (materials), 28% (kinematic parameters), and 72% (affordance) [38].

Group 6: Future Directions
- The article emphasizes the importance of addressing the key open challenges in physical 3D generation and outlines future research directions [43].
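As a rough illustration of the two-phase, human-in-the-loop annotation flow described in Group 3, the sketch below shows how a vision-language model draft, a kinematic-fitting step, and a human verification pass could be chained. All function names are hypothetical stand-ins (the drafting step is stubbed rather than calling a real GPT-4o API), not the authors' actual tooling.

```python
# Minimal, hypothetical sketch of a two-phase human-in-the-loop annotation loop.

def propose_properties(asset_renders):
    # Phase 1: a vision-language model (e.g. GPT-4o) drafts coarse properties
    # from rendered views. A fixed stub stands in for the real API call here.
    return {
        "scale_m": [0.6, 0.45, 1.2],
        "materials": ["wood", "metal"],
        "affordances": ["openable", "storage"],
        "description": "A two-door cabinet.",
    }

def fit_kinematics(mesh, draft):
    # Phase 2: estimate joint type, axis, and motion range for each movable part.
    # A real pipeline would fit these from part geometry; this is a stub.
    return [{"joint_type": "revolute", "axis": [0, 0, 1], "range": [0.0, 1.57]}]

def human_review(draft):
    # Annotators inspect and correct the drafted record before it is accepted.
    draft["verified"] = True
    return draft

def annotate(asset):
    draft = propose_properties(asset["renders"])               # phase 1: coarse properties
    draft["kinematics"] = fit_kinematics(asset["mesh"], draft) # phase 2: articulation
    return human_review(draft)                                 # human check closes the loop
```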
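Group 4's "dual objective" can be read as jointly supervising the usual structure/appearance reconstruction and the predicted physical properties from a shared latent. The PyTorch-style sketch below illustrates one way such a combined loss could be wired up; the module layout, dimensions, and loss weighting are assumptions for illustration, not PhysXGen's released architecture.

```python
import torch
import torch.nn as nn

class JointPhysicalHead(nn.Module):
    """Illustrative sketch: a shared latent feeds both a geometry/appearance
    decoder and a physical-property head, trained with a weighted joint loss.
    Assumed layout for illustration, not the PhysXGen implementation."""

    def __init__(self, latent_dim=512, phys_dim=16):
        super().__init__()
        self.geo_decoder = nn.Sequential(nn.Linear(latent_dim, latent_dim), nn.ReLU(),
                                         nn.Linear(latent_dim, latent_dim))
        self.phys_head = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                       nn.Linear(128, phys_dim))

    def forward(self, latent):
        return self.geo_decoder(latent), self.phys_head(latent)

def joint_loss(geo_pred, geo_target, phys_pred, phys_target, lam=0.5):
    # Dual objective: reconstruct structure/appearance and regress physical
    # properties (scale, material indices, kinematic parameters, ...).
    loss_geo = nn.functional.mse_loss(geo_pred, geo_target)
    loss_phys = nn.functional.mse_loss(phys_pred, phys_target)
    return loss_geo + lam * loss_phys

# Toy usage with random tensors
model = JointPhysicalHead()
latent = torch.randn(4, 512)
geo_pred, phys_pred = model(latent)
loss = joint_loss(geo_pred, torch.randn(4, 512), phys_pred, torch.randn(4, 16))
loss.backward()
```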