021 Scientific Foundation Model
Science Has Answers, Innovation Knows No Bounds (Academicians on Science Popularization · Year-End Special Report): 2025 in the Eyes of Eight Academicians
Ren Min Ri Bao· 2025-12-26 22:15
Group 1: Technological Innovation and Achievements
- In 2025, China made significant advancements in scientific research, particularly in quantum technology and lunar studies, achieving original breakthroughs [1]
- The global innovation index ranked China 10th, reinforcing the foundation for high-level technological self-reliance [1]
- Major breakthroughs in lunar science were achieved with the Chang'e 6 mission, revealing new lunar oxidation reaction mechanisms [2][3]

Group 2: Quantum Computing Developments
- Quantum computing is identified as a key future technology, with China focusing on achieving full autonomy in its development [4]
- The third-generation superconducting quantum computer "Benyuan Wukong" has been launched, participating in numerous application collaborations across various sectors [4][6]
- The development of quantum computing is likened to building a rocket, emphasizing the importance of both hardware and software integration [4]

Group 3: Brain-Computer Interface Technology
- Brain-computer interface technology is evolving to connect biological intelligence with machine intelligence, with significant clinical advancements reported in 2025 [7][9]
- The technology is moving from unidirectional reading to bidirectional interaction, enhancing its potential applications [7]

Group 4: Digital Intelligence and Industrial Transformation
- Digital intelligence technologies, including cloud computing and AI, are seen as essential tools for enhancing innovation capabilities across industries [10][11]
- The integration of digital technologies into industrial processes is expected to transform manufacturing paradigms and improve efficiency [10]

Group 5: Hydrogen Energy and Renewable Resources
- Hydrogen energy is recognized as a crucial component of the renewable energy revolution, complementing electricity and promoting low-carbon transitions [12][13]
- China has made progress in mastering hydrogen fuel cell technologies and establishing related industrial chains [12][13]

Group 6: Research Talent Development
- The cultivation of research talent is emphasized as critical for supporting high-level technological self-reliance, with a focus on aligning educational programs with national strategic needs [14][15]
- Young researchers are increasingly contributing to scientific advancements, with a significant proportion of key research personnel being under 45 years old [15]

Group 7: Support for Basic Research
- Basic research in China is receiving increased attention and support, although challenges remain in funding and stability [16][17][18]
- The government is implementing reforms to enhance the support for basic research, including funding mechanisms and cultural improvements [16][17][18]

Group 8: Science Popularization and Innovation
- The relationship between scientific innovation and public science education is highlighted, with efforts to engage the public in scientific knowledge [19][20]
- Enhancing public understanding of science is seen as foundational for fostering future research talent and innovation [19][20]
Xue Guirong of Zhijiang Lab: When AI Starts Doing Research, I See the Ceiling of Large Language Models | GAIR 2025
Lei Feng Wang· 2025-12-24 00:22
Core Viewpoint
- The GAIR conference highlights the evolution of AI technology and its transition from laboratory research to industrial applications, emphasizing the importance of scientific foundation models in overcoming the limitations of large language models at understanding complex scientific data [2][4]

Group 1: Limitations of Large Language Models
- Large language models are constrained by "language boundaries," making it difficult for them to comprehend high-dimensional, multi-modal scientific data and to independently achieve verifiable scientific discoveries [4][22]
- In a challenging HLE test covering over 100 disciplines, the best-performing model achieved only 25.4% accuracy, indicating significant limitations in addressing scientific problems [4][18]
- The primary difference between large language models and scientific foundation models lies in data representation: the latter uses cross-disciplinary, multi-type scientific data as tokens, rather than text alone [4][26]

Group 2: Scientific Foundation Models
- The 021 scientific foundation model developed by Zhijiang Laboratory aims to break through language limitations and unify scientific data for enhanced reasoning and discovery across disciplines [4][5]
- Tokenizing scientific data effectively is crucial for establishing connections between different types of data, enabling comprehensive analysis of scientific problems across various fields [5][28]
- The model supports applications in 19 key disciplines, covering 174 areas of scientific knowledge, and aims to streamline processes that traditionally require extensive time and resources [31][36]

Group 3: Collaborative Efforts and Future Directions
- The initiative involves collaboration with national laboratories, universities, and enterprises to co-create and enhance the model, fostering a deeper understanding of key scientific data and challenges [36][38]
- An open research platform, zero2x, is being developed to facilitate access to data and models, encouraging broader participation in scientific discovery and innovation [38]
- The goal is to transform scientific research paradigms and accelerate the integration of AI into scientific endeavors, ultimately leading to significant advancements in the field [38]
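The idea of representing cross-disciplinary scientific data as tokens, rather than text alone, can be illustrated with a toy sketch. This is purely a conceptual example, not Zhijiang Laboratory's actual implementation: the tokenizer names, modality tags, and binning scheme below are all invented for illustration. The point is that each modality (text, a numeric series such as a spectrum, a molecular SMILES string) gets its own tokenizer, and modality-tagged tokens let a single sequence model attend across all of them in one stream.

```python
# Conceptual sketch (hypothetical, not the 021 model's design): heterogeneous
# scientific data flattened into one shared token stream.

def tokenize_text(text):
    """Toy word-level tokenizer for natural-language text."""
    return [f"txt:{w}" for w in text.lower().split()]

def tokenize_series(values, n_bins=4, lo=0.0, hi=1.0):
    """Quantize a numeric series (e.g. a spectrum) into discrete bin tokens."""
    width = (hi - lo) / n_bins
    return [f"num:bin{min(int((v - lo) / width), n_bins - 1)}" for v in values]

def tokenize_smiles(smiles):
    """Character-level tokens for a molecular SMILES string."""
    return [f"mol:{ch}" for ch in smiles]

def build_sequence(segments):
    """Interleave modality segments into one stream with boundary markers."""
    seq = ["<bos>"]
    for kind, payload in segments:
        seq.append(f"<{kind}>")
        if kind == "text":
            seq.extend(tokenize_text(payload))
        elif kind == "series":
            seq.extend(tokenize_series(payload))
        elif kind == "smiles":
            seq.extend(tokenize_smiles(payload))
        seq.append(f"</{kind}>")
    seq.append("<eos>")
    return seq

# Example: one observation mixing prose, a measured spectrum, and a molecule.
sequence = build_sequence([
    ("text", "absorption peak observed"),
    ("series", [0.1, 0.9]),
    ("smiles", "CCO"),
])
```

A real system would learn these discretizations (e.g. vector-quantized embeddings) rather than hard-code bins, but the structural idea is the same: once everything is a token, cross-modal connections fall out of ordinary sequence modeling.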
Zhijiang Lab's 021 Scientific Foundation Model Makes Its Debut, Breaking Through Language Limitations
Zhong Guo Xin Wen Wang· 2025-12-18 23:44
Core Insights
- The 021 scientific foundation model was unveiled by Zhijiang Laboratory in Zhejiang, showcasing advancements in interdisciplinary knowledge, cross-domain reasoning, and multilingual understanding, covering 204 languages [1][2]
- The model aims to overcome the limitations of language in expressing complex scientific concepts, integrating scientific data across multiple dimensions such as time, space, and energy [1]
- The development process involved nearly 10,000 experiments, resulting in a training framework that includes pre-training, post-training, and reinforcement learning, culminating in a model with 236 billion parameters [1]

Group 1
- The 021 model serves various fields including Earth sciences, astronomy, life sciences, and materials science, acting as a "research partner" that breaks down disciplinary boundaries and stimulates innovative thinking [2]