Artificial Intelligence + Science
Zhijiang Laboratory's Xue Guirong: When AI Starts Doing Research, I See the Ceiling of Large Language Models | GAIR 2025
雷峰网· 2025-12-24 00:22
Core Viewpoint
- The GAIR conference highlights the evolution of AI technology and its transition from laboratory research to industrial applications, emphasizing the importance of scientific foundational models to overcome the limitations of large language models in understanding complex scientific data [2][4].

Group 1: Limitations of Large Language Models
- Large language models are constrained by "language boundaries," making it difficult for them to comprehend high-dimensional, multi-modal scientific data and to independently achieve verifiable scientific discoveries [4][22].
- In a challenging HLE test covering over 100 disciplines, the best-performing model achieved only a 25.4% accuracy rate, indicating significant limitations in addressing scientific problems [4][18].
- The primary difference between large language models and scientific foundational models lies in their data representation; the latter uses cross-disciplinary, multi-type scientific data as tokens, rather than text alone [4][26].

Group 2: Scientific Foundational Models
- The 021 scientific foundational model developed by Zhijiang Laboratory aims to break through language limitations and unify scientific data for enhanced reasoning and discovery across disciplines [4][5].
- Tokenizing scientific data effectively is crucial for establishing connections between different types of data, enabling comprehensive analysis of scientific problems across various fields [5][28].
- The model supports applications in 19 key disciplines, covering 174 areas of scientific knowledge, and aims to streamline processes that traditionally require extensive time and resources [31][36].

Group 3: Collaborative Efforts and Future Directions
- The initiative involves collaboration with national laboratories, universities, and enterprises to co-create and enhance the model, fostering a deeper understanding of key scientific data and challenges [36][38].
- An open research platform, zero2x, is being developed to facilitate access to data and models, encouraging broader participation in scientific discovery and innovation [38].
- The goal is to transform scientific research paradigms and accelerate the integration of AI into scientific endeavors, ultimately leading to significant advancements in the field [38].
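The summary above centers on tokenizing heterogeneous scientific data into a shared vocabulary, so that one model can attend over text and measurements alike. A minimal Python sketch of that general idea follows; every name in it (`Token`, `tokenize_spectrum`, the 16-bin discretization, the reserved id offset) is a hypothetical illustration, not the 021 model's actual scheme:

```python
# Illustrative only: map text and a continuous spectrum into one shared
# token-id space, so both modalities can feed the same sequence model.
from dataclasses import dataclass

@dataclass
class Token:
    modality: str   # e.g. "text" or "spectrum"
    token_id: int   # index into the shared vocabulary

def tokenize_text(text, vocab):
    # Plain word-level tokenization; new words get the next free id.
    return [Token("text", vocab.setdefault(w, len(vocab))) for w in text.split()]

def tokenize_spectrum(intensities, n_bins=16, base=10_000):
    # Discretize continuous intensities into n_bins buckets, offset into
    # a region of the id space reserved for the "spectrum" modality.
    lo, hi = min(intensities), max(intensities)
    span = (hi - lo) or 1.0
    bins = [min(int((x - lo) / span * n_bins), n_bins - 1) for x in intensities]
    return [Token("spectrum", base + b) for b in bins]

vocab = {}
tokens = tokenize_text("benzene absorption spectrum", vocab)
tokens += tokenize_spectrum([0.1, 0.8, 0.3, 0.95])
```

The design point is simply that continuous measurements become discrete ids living in the same space as text ids, which is what lets a single transformer-style model mix modalities in one sequence.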
Making Good Use of AI as a Super-Assistant for Scientific Research
Jing Ji Ri Bao· 2025-10-22 22:09
Group 1
- The core viewpoint of the article emphasizes the importance of implementing the "Artificial Intelligence +" action plan to seize opportunities in the new wave of AI-driven scientific research [1]
- The article highlights significant advances in AI-driven scientific research, exemplified by AlphaFold2's prediction of the structures of approximately 200 million proteins, solving a long-standing challenge in the field [1]
- It points out existing challenges in AI-driven scientific research, such as the lack of high-quality scientific data and insufficient algorithm interpretability, which hinder deeper development [1]

Group 2
- The article discusses the urgent need to establish a national-level data platform and computing network to address issues of data quality, reliance on foreign databases, and the lack of unified data standards [2]
- It emphasizes the necessity of cultivating interdisciplinary talent to address the shortage of professionals in AI and related fields [2]
- It calls for collaboration across fields and departments to strengthen AI's role as a powerful assistant for scientists, thereby improving research efficiency and innovation potential [2]
[China News Service] Opening Unlimited Possibilities for Scientific Research: Chinese Team Releases the "Panshi Scientific Foundation Model"
Zhong Guo Xin Wen Wang· 2025-07-28 03:04
Core Insights
- The "Panshi Scientific Foundation Model" was officially launched on July 26, 2025, aiming to provide robust intelligent support for technological innovation across various fields, leveraging AI to reshape scientific research paradigms [4][11]
- The model addresses challenges in the current "AI + Science" research landscape, such as data silos and insufficient reasoning capabilities, by promoting a platform-based and systematic transformation [5][11]

Group 1: Model Capabilities
- The "Panshi Scientific Foundation Model" is trained on specialized scientific knowledge and data, enabling deep understanding of scientific modalities such as waves, spectra, and fields [4][7]
- It features a heterogeneous mixture-of-experts architecture that integrates proprietary models tailored to common scientific data modalities, and has achieved top performance on international datasets in mathematics, physics, chemistry, materials, and biology [7][9]

Group 2: Applications and Efficiency
- The model has been applied in multiple disciplines, significantly accelerating research processes; in the life sciences, for example, it delivered a more than tenfold speed-up in drug target discovery [9][10]
- It supports the automation of particle physics research tasks and improves the efficiency of high-speed train model calculations in fluid environments [9][10]

Group 3: Tools and Ecosystem
- The "Panshi Literature Compass" assists researchers with literature review and evaluation, drawing on 170 million scientific documents and reducing survey time from days to minutes [8][11]
- The "Panshi Tool Scheduling Platform" autonomously plans and invokes more than 300 scientific computing tools, improving research workflow efficiency [8][11]

Group 4: Collaborative Initiatives
- The Chinese Academy of Sciences has launched the "Scientific Foundation Model Ecological Alliance" plan, collaborating with over 40 research institutions, universities, and enterprises to foster a new ecosystem for "AI + Science" [11]
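The Panshi summary describes a heterogeneous mixture-of-experts architecture that routes scientific inputs to modality-specialized sub-models. As a generic, illustrative sketch of how MoE gating blends expert outputs (toy expert functions and scores; not the Panshi implementation):

```python
# Illustrative MoE step: a router scores each expert, scores become
# softmax gate weights, and the output is the gate-weighted blend.
import math

def softmax(scores):
    m = max(scores)                       # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy "experts", each standing in for a modality-specialized sub-model.
experts = {
    "spectrum": lambda x: sum(x) / len(x),   # e.g. averaged spectral signal
    "sequence": lambda x: float(len(x)),     # e.g. sequence-length feature
    "field":    lambda x: max(x) - min(x),   # e.g. field dynamic range
}

def moe_forward(x, router_scores):
    # router_scores: one raw score per expert, in dict insertion order.
    gates = softmax(router_scores)
    outputs = [fn(x) for fn in experts.values()]
    return sum(g * o for g, o in zip(gates, outputs))

# The router strongly favors the "spectrum" expert for this input.
y = moe_forward([1.0, 3.0, 2.0], router_scores=[2.0, 0.1, 0.1])
```

In a real heterogeneous MoE, the experts are trained networks with different architectures per modality and the router is itself learned; the gating arithmetic, however, is exactly this blend.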