Core Insights
- Fysics AI has launched OmniFysics, billed as the world's first all-modal physical AI foundation model that understands physical laws, achieving a breakthrough in physical perception and reasoning with only 3 billion parameters [1][2]
- The model addresses the "physical hallucinations" common in traditional AI, providing a core technological foundation for advances in embodied intelligence and humanoid robots [1]

Technical Innovations
- OmniFysics integrates high-quality physical knowledge into its architecture, allowing it to surpass existing open-source models and even outperform some mainstream 8-billion-parameter models on key metrics such as physical prediction and logical reasoning [2]
- The model features a distinctive "dual hub" data ecosystem: a static hub (FysicsAny) builds a comprehensive physical labeling system for diverse objects, while a dynamic hub (FysicsOmniCap) captures physical causal relationships precisely through joint training on video and audio [2]

Evaluation and Benchmarking
- Fysics AI has introduced FysicsEval, the first all-dimensional benchmark for embodied physical perception and logical reasoning, which, together with the previously launched FysicsWorld platform, forms a comprehensive physical-intelligence evaluation system [12]
- The new framework aims to close the gap between current physical AI assessments and real-world physics, giving AI research teams accurate and efficient model diagnostic tools [12]

Future Directions
- The company plans to keep advancing the field of physical AI, focusing on practical industry needs and promoting deep integration of technology and applications [12]
- The successful launch of OmniFysics marks a shift from mere "semantic understanding" toward rigorous "physical reality," laying a solid foundation for future embodied intelligent agents that can truly understand and interact with the physical world [12]
Fysics AI Releases OmniFysics, the World's First True Physical AI Foundation Model
Huanqiu Wang News · 2026-02-09 11:09