$100 and 8,000 Lines of Code to Hand-Build ChatGPT: Karpathy's Latest Open-Source Project Goes Viral, Nearing 5k Stars Overnight
36Kr · 2025-10-14 02:25

Core Insights

- Andrej Karpathy has released a new open-source project, nanochat, which lets users build a ChatGPT-like model from scratch for approximately $100 [2][5]
- The project consists of around 8,000 lines of code and was quickly adopted by the community, gaining over 4,500 stars on GitHub within 12 hours of release [2][5]
- nanochat provides a complete training and inference pipeline for large language models (LLMs), unlike Karpathy's previous project, nanoGPT, which covered only the pre-training phase [2][5]

Project Details

- Users can train their own LLM by running a script on a cloud GPU machine, obtaining a functional model in about 4 hours; a launch sketch follows at the end of this summary [2][3]
- The project includes a new Rust-based tokenizer, a high-efficiency inference engine, and automatic generation of a Markdown scorecard summarizing the training run [3][5]
- Karpathy estimates that with a budget of $1,000 and 41.6 hours of training, users can achieve significant improvements in model coherence and performance across a range of tasks; a cost sketch at the end of this summary checks the implied rates [4][5]

Performance Metrics

- The model recorded an initial CORE score of 0.2219, with further improvements over the subsequent training phases [7]
- On individual benchmarks, the model reaches scores such as 40+ on MMLU and 70+ on ARC-Easy after sufficient training [4][7]

Community and Future Development

- Karpathy envisions nanochat evolving into a research platform or standard benchmark, much as nanoGPT did, and encourages community collaboration on further improvements [5][8]
- Despite its capabilities, Karpathy cautions that nanochat is not suited to personalized applications without significant additional work and data preparation [9][10]
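
The "Project Details" bullets above describe launching the whole pipeline by running one script on a cloud GPU machine. Below is a minimal Python sketch of what that launch can look like; the repository URL and the speedrun.sh entry point come from the nanochat README rather than from this summary, and the log filename and use of subprocess are purely illustrative.

```python
"""Minimal sketch: start a nanochat end-to-end run on a rented GPU node.

Assumptions (not from the summary above): the entry point is the repo's
speedrun.sh script as described in the nanochat README; the log filename
is illustrative. The ~4-hour, ~$100 figure is quoted against a multi-GPU
cloud node.
"""
import subprocess

REPO = "https://github.com/karpathy/nanochat.git"

# Fetch the ~8,000-line codebase onto the GPU machine.
subprocess.run(["git", "clone", REPO], check=True)

# Run the full pipeline (tokenizer training, pretraining, mid-training,
# supervised fine-tuning, evaluation) and keep the console output in a log.
with open("speedrun.log", "w", encoding="utf-8") as log:
    subprocess.run(
        ["bash", "speedrun.sh"],
        cwd="nanochat",
        stdout=log,
        stderr=subprocess.STDOUT,
        check=True,
    )
```

Once the run finishes, the repository's ChatGPT-style web UI can be used to talk to the resulting model, and the Markdown scorecard mentioned above summarizes the run.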
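The budget and duration figures quoted above ($100 for roughly 4 hours, $1,000 for about 41.6 hours) imply a roughly constant price per GPU-node hour. The short sketch below only checks that arithmetic; the tier labels are mine, and the ~$24-25/hour node rate is derived from those numbers rather than stated in the article.

```python
"""Back-of-the-envelope check of the budget figures quoted above."""

# (budget in USD, training time in hours) -- both pairs come from the summary.
TIERS = {
    "speedrun tier (~$100)": (100.0, 4.0),
    "larger run (~$1,000)": (1000.0, 41.6),
}

for name, (budget_usd, hours) in TIERS.items():
    implied_rate = budget_usd / hours  # implied USD per GPU-node hour
    print(f"{name}: {hours:.1f} h at ~${implied_rate:.0f}/hour")

# Expected output:
# speedrun tier (~$100): 4.0 h at ~$25/hour
# larger run (~$1,000): 41.6 h at ~$24/hour
```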