Code Large Models

Chinese Academy of Social Sciences Unveils Artificial Intelligence Research Promotion Center
Ke Ji Ri Bao· 2025-08-13 00:07
On August 12, the Artificial Intelligence Research Promotion Center of the Chinese Academy of Social Sciences (CASS) was unveiled in Beijing. At the ceremony, Zhao Zhimin, Secretary-General of CASS and chair of the center's board, said that scholars in philosophy and the social sciences must answer the fundamental question of "where the intelligence revolution is heading," tackle worldwide challenges such as "technological alienation" and "algorithmic hegemony," and provide systematic, forward-looking, and actionable "Chinese solutions" and "Chinese answers" for the country's participation in the reform and construction of the global AI governance system.

Zhao added that AI development and governance are not only technological questions but also involve the economy, society, culture, law, ethics, diplomacy, and international politics. The center will focus on major theoretical and practical issues at the intersection of AI and the philosophy and social sciences, aiming to build a frontline research platform for AI-related philosophy and social science that "embodies the national will, fulfills the national mission, and represents the national level," and to make AI research a new paradigm of integrated inquiry across the natural and social sciences.

"Artificial intelligence has become a core force driving industrial transformation and enhancing national competitiveness. Their excellent general-purpose task-handling capabilities have quickly made large models a foundational supporting technology across application domains," said Hu Shimin, academician of the Chinese Academy of Sciences and professor in the Department of Computer Science and Technology at Tsinghua University, noting that large models are reshaping research methods in the social sciences. In recent years, large language models have gradually been incorporated into the methodological toolkit of social science research. With their ability to simulate natural language and human behavior, these tools not only reduce research costs but also expand ...
Code Large Model Lands at a State-Owned Bank: aiXcoder Boosts Development Efficiency by 30%
Feng Huang Wang· 2025-07-11 13:06
Core Insights
- aiXcoder's intelligent software development solution has been recognized as an "Outstanding Case in Software R&D" at the TiD 2025 Quality Competitiveness Conference for its successful application in a state-owned bank, resulting in a 30% increase in overall development efficiency [1]

Group 1: Technology Implementation
- The solution includes three key technological implementations (a hedged sketch of the deployment pattern follows this summary):
  1. Deployment of a large model trained specifically for code characteristics, enhancing performance in software development scenarios through context-aware code generation, completion, defect fixing, and unit test generation [1]
  2. Personalized training for banking-specific code, using the bank's private code and documentation to create a bespoke code large model that aligns with the bank's business logic and coding style while maintaining the core model's performance [1]
  3. Private deployment that meets strict security requirements, operating entirely within the internal network to ensure data security, optimizing hardware resource usage, and supporting high-concurrency scenarios [2]

Group 2: Performance Metrics
- The proportion of AI-generated code in development has increased from 10% before training to 35% after implementation, with specific scenarios allowing up to 60% of coding tasks to be assisted by AI [2]
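The article describes the deployment only at a business level and does not document aiXcoder's actual interface. Purely to make the "privately deployed, context-aware code completion inside the bank's internal network" pattern concrete, the following minimal Python sketch shows how a client might call such a service; the endpoint URL, request fields, and response schema are all hypothetical assumptions, not the product's real API.

# Illustrative only: the endpoint, payload fields, and response schema below are
# hypothetical stand-ins for whatever interface an on-premises code model
# actually exposes; they are not aiXcoder's documented API.
import requests

# Hypothetical completion service running inside the bank's internal network.
INTERNAL_ENDPOINT = "http://codemodel.internal.example/v1/complete"

def complete_code(prefix: str, file_path: str, max_tokens: int = 128) -> str:
    """Request a context-aware completion for the code before the cursor."""
    payload = {
        "prefix": prefix,        # code before the cursor
        "file_path": file_path,  # lets the server gather cross-file context
        "max_tokens": max_tokens,
    }
    resp = requests.post(INTERNAL_ENDPOINT, json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()["completion"]  # assumed response field

if __name__ == "__main__":
    snippet = "def transfer(account_from, account_to, amount):\n    "
    print(complete_code(snippet, "banking/payments/transfer.py"))

Keeping the whole request-response loop on the internal network, as in this sketch, is what lets the private deployment satisfy the strict data-security requirements mentioned above.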
Nvidia Open-Sources Multiple Code Large Models Built on Alibaba's Tongyi Qianwen
news flash· 2025-05-09 07:42
Core Viewpoint
- Nvidia has recently open-sourced its code reasoning models, released in three parameter sizes: 32B, 14B, and 7B [1]

Group 1
- The open-sourced models are based on Alibaba's Tongyi Qianwen Qwen2.5 series, specifically Qwen2.5-32B, Qwen2.5-14B, and Qwen2.5-7B [1]
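The news flash does not name the exact checkpoints or where they are published. Assuming the models are distributed in the usual way for Qwen2.5-based causal language models on Hugging Face, a minimal Python sketch for loading and prompting one of them might look like the following; the repository ID is a hypothetical placeholder, not a confirmed name.

# Sketch only: "nvidia/<code-reasoning-model-32B>" is a placeholder, not a
# confirmed repository name. Any Qwen2.5-based causal LM loads the same way.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nvidia/<code-reasoning-model-32B>"  # hypothetical placeholder ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Write a Python function that checks whether a string is a palindrome."
messages = [{"role": "user", "content": prompt}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))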