Group 1: Anthropic's Financial Projections
- Anthropic has lowered its gross margin projection for 2025 to 40%, a decrease of 10 percentage points from earlier expectations but still an improvement on the previous year [3]
- If inference costs for non-paying users of the Claude chatbot are included, the gross margin would be approximately 38% [3] (a back-of-the-envelope sketch of this arithmetic follows after Group 5)
- Anthropic projects gross margins above 70% by 2027, while OpenAI anticipates similar margins by 2029, indicating a trend towards profitability in the AI sector despite high training costs [3]

Group 2: AI Model Training Costs
- Anthropic's costs for training AI models in 2025 are projected at around $4.1 billion, a 5% increase from previous estimates [4]
- OpenAI's training costs for AI models were approximately $9.4 billion last year, highlighting the significant financial investment required in AI development [4]

Group 3: ChatGPT's Business Model and Growth
- ChatGPT's revenue has grown roughly 3x year over year, reaching $20 billion+ in 2025, up from $2 billion in 2023, indicating unprecedented growth in the AI sector [5]
- The compute capacity used by ChatGPT has also increased significantly, from 0.2 GW in 2023 to approximately 1.9 GW in 2025, roughly in line with the revenue growth [5] (see the sketch after Group 5)

Group 4: AWS and AI Infrastructure
- AWS has developed its own custom CPU, Graviton, which offers 40% better price performance than leading x86 processors and is now used by 90% of its top 1,000 customers [17][18]
- AWS's Trainium2 chip, used by Anthropic for training models, is fully subscribed, and the newly released Trainium3 chip is expected to offer 40% better price performance than its predecessor [19]

Group 5: Market Dynamics and AI Adoption
- The current stage of AI adoption is characterized by high demand, with AI labs consuming significant compute resources, while enterprises are beginning to use AI for cost avoidance and productivity gains [20][21]
- There is a notable gap in the market where many enterprise workloads are not yet using AI inference, suggesting potential for future growth as these applications are deployed [22]
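
The margin and compute figures in Groups 1 and 3 lend themselves to a quick back-of-the-envelope check, sketched below in Python. The revenue and cost split in the first part is purely hypothetical, chosen only so that the paid-usage margin lands at the cited 40%; the second part uses nothing beyond the revenue and capacity figures cited above.

```python
# Back-of-the-envelope checks for the figures cited above.

# --- Group 1: gross margin with and without free-user inference costs ---
# The revenue / cost-of-revenue split below is HYPOTHETICAL, chosen only so
# that the paid-usage margin comes out at the cited 40%; the free-user
# inference figure is likewise illustrative.
revenue = 10.0                 # hypothetical revenue, $B
paid_cost_of_revenue = 6.0     # hypothetical cost of serving paying users, $B
free_user_inference = 0.2      # hypothetical cost of serving non-paying users, $B

margin_paid_only = (revenue - paid_cost_of_revenue) / revenue
margin_all_users = (revenue - paid_cost_of_revenue - free_user_inference) / revenue
print(f"gross margin, paid usage only:  {margin_paid_only:.0%}")   # 40%
print(f"gross margin, incl. free users: {margin_all_users:.0%}")   # 38%

# --- Group 3: revenue per gigawatt of ChatGPT compute ---
# These four numbers are the ones cited above ($B and GW).
rev_2023, gw_2023 = 2.0, 0.2
rev_2025, gw_2025 = 20.0, 1.9
print(f"2023 revenue per GW: ${rev_2023 / gw_2023:.1f}B/GW")   # ~$10.0B/GW
print(f"2025 revenue per GW: ${rev_2025 / gw_2025:.1f}B/GW")   # ~$10.5B/GW
# Revenue per GW of compute is roughly flat across the two years, which is
# the sense in which compute growth "correlates with" revenue growth here.
```

The takeaway from the second part is simply that ChatGPT's reported revenue has scaled almost one-for-one with its deployed compute capacity, at roughly $10 billion of revenue per gigawatt in both years.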