First Time in Six Years! OpenAI Releases Open-Weight Models; Altman Calls Them the "World's Best Open Models"
Hua Er Jie Jian Wen·2025-08-05 20:05

Core Insights
- OpenAI has released two open-weight language models, gpt-oss-120b and gpt-oss-20b, its first open-weight release since 2019, in response to competition from Meta, Mistral AI, and DeepSeek [1][2][12]

Model Specifications
- Both models are positioned as low-cost options: gpt-oss-20b can run on a laptop with 16GB of RAM, while gpt-oss-120b requires roughly 80GB [2][5]
- gpt-oss-120b has 117 billion total parameters and activates 5.1 billion per token; gpt-oss-20b has 21 billion total parameters and activates 3.6 billion per token [5][6]

Performance Evaluation
- gpt-oss-120b performs comparably to OpenAI's o4-mini on core reasoning benchmarks, while gpt-oss-20b matches or exceeds o3-mini [7][8]
- Both models use advanced pre-training and post-training techniques, with an emphasis on efficiency and practical deployment across environments [5][11]

Security Measures
- OpenAI has implemented extensive safeguards against malicious use, filtering harmful data during pre-training and conducting specialized fine-tuning for security assessments [11]
- The company works with independent expert groups to evaluate the models' potential security risks [11]

Market Impact
- The release marks a strategic shift for OpenAI, which had previously focused on proprietary API services and is now responding to competitive pressure in the open-weight space [12][15]
- OpenAI has partnered with major cloud providers such as Amazon to offer these models, improving accessibility for developers and researchers [3][11]
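The parameter figures above imply heavy mixture-of-experts sparsity: only a small fraction of each model's weights is active per token. A minimal back-of-envelope sketch of that sparsity, plus a rough weight-memory estimate, is below. The total/active parameter counts come from the article; the 4-bit weight width used for the memory estimate is our assumption, not a figure from the article.

```python
# Back-of-envelope check of the MoE figures cited above.
# Parameter counts are from the article; the 4-bit quantization
# width is an assumption for illustration only.

def active_fraction(total_params: float, active_params: float) -> float:
    """Share of parameters activated per token in a mixture-of-experts model."""
    return active_params / total_params

def weight_memory_gb(total_params: float, bits_per_weight: int = 4) -> float:
    """Approximate weight storage in GB at a given quantization width."""
    return total_params * bits_per_weight / 8 / 1e9

# gpt-oss-120b: 117B total parameters, 5.1B active per token
print(f"gpt-oss-120b active share: {active_fraction(117e9, 5.1e9):.1%}")
# gpt-oss-20b: 21B total parameters, 3.6B active per token
print(f"gpt-oss-20b  active share: {active_fraction(21e9, 3.6e9):.1%}")
# Rough weight footprint of gpt-oss-20b at the assumed 4-bit width
print(f"gpt-oss-20b  weights: ~{weight_memory_gb(21e9):.1f} GB")
```

Under the 4-bit assumption, the 21B-parameter model's weights occupy roughly 10.5 GB, which is consistent with the article's claim that it fits on a 16GB laptop with room left for activations and the KV cache.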