Core Viewpoint
- Elon Musk's xAI has officially open-sourced Grok 2.5, with Grok 3 expected to be released in about six months, generating significant attention in the AI community [1][2]
Group 1: Open Source Release
- Grok 2.5 is now available for download on HuggingFace, consisting of 42 files totaling approximately 500GB [3][4]
- The model requires a minimum of 8 GPUs, each with more than 40GB of memory, to run effectively (see the download sketch after this summary) [4][10]
Group 2: Model Performance
- Grok 2 has surpassed Claude and GPT-4 in overall Elo score on the LMSYS leaderboard, indicating competitive performance [4]
- In academic benchmark tests, Grok 2 has shown strong results in areas such as graduate-level scientific knowledge (GPQA), general knowledge (MMLU, MMLU-Pro), and mathematics (MATH) [7][8]
Group 3: Community Feedback
- While the open-sourcing of Grok has been positively received, there are criticisms of the lack of clarity on model parameters and the open-source licensing terms [9]
- The community speculates that the open-sourced model has roughly 269 billion parameters in a MoE (Mixture of Experts) configuration, but xAI has not explicitly confirmed this [9]
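Group 1 notes that the checkpoint is distributed as 42 files totaling roughly 500GB on HuggingFace and that running it needs at least 8 GPUs with more than 40GB of memory each. The sketch below shows one possible way to pull such a checkpoint with the huggingface_hub client; the repository id and local paths are assumptions for illustration only, since the digest does not name the exact repo.

```python
# Minimal sketch: fetching the open-sourced Grok 2.5 weights from HuggingFace.
# The repository name "xai-org/grok-2" is an assumption for illustration;
# check the repo id actually announced by xAI before running.
import shutil
from huggingface_hub import snapshot_download

REPO_ID = "xai-org/grok-2"          # assumed repo id, not confirmed by the article
LOCAL_DIR = "./grok-2.5-weights"    # destination for the ~500GB checkpoint

# The checkpoint is reported as 42 files totaling roughly 500GB, so verify
# free disk space before starting the download.
free_gb = shutil.disk_usage(".").free / 1e9
if free_gb < 550:
    raise SystemExit(f"Only {free_gb:.0f}GB free; need ~550GB for the weights.")

# snapshot_download resumes interrupted transfers, which matters at this size.
path = snapshot_download(
    repo_id=REPO_ID,
    local_dir=LOCAL_DIR,
    max_workers=8,  # parallel file downloads
)
print(f"Weights downloaded to {path}")
```

Downloading the weights is only the first step: serving the model across the reported 8 GPUs would additionally require a multi-GPU inference engine with tensor or expert parallelism, which is beyond the scope of this sketch.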
Just now, Musk open-sourced Grok 2.5: Chinese companies are xAI's biggest rivals
Sou Hu Cai Jing·2025-08-24 01:29