Facing Chip Depreciation, the Market Is No Longer Calm
36Ke · 2025-11-24 10:25
Core Insights
- Michael Burry, known for predicting the 2008 financial crisis, has raised concerns about the depreciation practices of major AI infrastructure companies, suggesting that they artificially inflate profits by extending the depreciation period of their chips [1][2]
- Burry estimates that from 2026 to 2028 this accounting treatment could understate industry-wide depreciation expenses by approximately $176 billion; he specifically highlights Oracle and Meta, whose profits he predicts could be overstated by 27% and 21% respectively by 2028 [1]

Depreciation Practices
- In the context of AI data centers, depreciation refers to allocating the cost of fixed assets over their expected useful life, an allocation that significantly shapes financial statements [3]
- Extending the depreciation period allows companies to report lower annual depreciation expenses, thus enhancing current net profit figures [3]

Industry Trends
- Major tech companies have recently adopted longer depreciation periods for their server assets: Microsoft extended its assumed server lifespan from four to six years in 2022, and Google did the same in 2023 [4][5]
- Oracle and Meta have also extended their server lifespans, with Meta estimating a $2.9 billion reduction in its 2025 depreciation expense from the adjustment [6]

Potential Risks
- If server lifespans are overestimated, reported profits could fall sharply; for instance, if servers lose their value within three years instead of the assumed lifespan, the combined pre-tax profit of the five major cloud giants could decrease by $26 billion, equal to 8% of last year's total profit [6]
- Recalculating under a two-year depreciation assumption could imply a total value loss of $1.6 trillion for these companies [6]

Chip Lifespan Debate
- There is a growing belief that the actual lifespan of AI chips may be shorter than currently estimated, owing to heavy physical wear and rapid technological obsolescence [7][9]
- High utilization rates in data centers can shorten GPU lifespans to as little as one to three years, with hardware instability adding significant operational costs [9]

Economic Considerations
- The economic lifespan of assets is becoming critical, especially as power capacity in data centers becomes a bottleneck; running older, less efficient chips instead of newer models carries an opportunity cost [11]
- Companies like NVIDIA are shortening their product iteration cycles, which further pressures the useful life of existing chips [11]

Value Cascade Model
- Some analysts argue that the longer depreciation periods adopted by the tech giants are justified by their "value cascade" model, in which hardware is deployed in tiers according to workload demands [12]
- Under this model, older chips can still serve less demanding tasks effectively, extending their economic lifespan beyond the typical technology cycle [12][13]

Financial Implications
- The large capital expenditures (CapEx) of major tech companies are supported by strong order backlogs, indicating high demand for AI capability [13]
- Extending depreciation periods may be a prudent financial approach to stabilizing profits and investor expectations amid heavy capital spending [13]

Conclusion
- The debate over AI chip depreciation reflects a mismatch between rapid technological advancement and asset-management assumptions, and it calls for the industry to evaluate company performance beyond net profit alone [14]
- Companies that can manage their capital expenditures effectively and generate strong cash flows will be better positioned to weather successive technology iterations [15]
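The accounting mechanics at the heart of the dispute reduce to straight-line depreciation: the annual expense is the asset's cost spread evenly over its assumed useful life, so lengthening that life directly lowers the reported charge. A minimal sketch of this sensitivity, using purely hypothetical figures (the $60B fleet cost and the lifespans below are illustrative assumptions, not numbers from any company's filings):

```python
# Straight-line depreciation sketch: why a longer assumed useful life
# lowers the annual expense on the income statement, and why a shorter
# real-world lifespan implies the expense is understated.
# All figures are hypothetical and for illustration only.

def annual_depreciation(cost: float, useful_life_years: int,
                        salvage: float = 0.0) -> float:
    """Spread (cost - salvage) evenly over the assumed useful life."""
    return (cost - salvage) / useful_life_years

fleet_cost = 60e9  # hypothetical server-fleet cost, in dollars

# Extending the assumed life from 4 to 6 years cuts the yearly charge:
exp_4yr = annual_depreciation(fleet_cost, 4)  # $15.0B per year
exp_6yr = annual_depreciation(fleet_cost, 6)  # $10.0B per year
print(f"4-year life: ${exp_4yr / 1e9:.1f}B/yr")
print(f"6-year life: ${exp_6yr / 1e9:.1f}B/yr")
print(f"Pre-tax profit lift from the change: ${(exp_4yr - exp_6yr) / 1e9:.1f}B/yr")

# Conversely, if hardware actually wears out in two years, the true
# economic cost is double the 4-year figure:
exp_2yr = annual_depreciation(fleet_cost, 2)  # $30.0B per year
print(f"2-year life: ${exp_2yr / 1e9:.1f}B/yr")
```

The same arithmetic, run in reverse over actual fleet costs, is what produces headline figures like Burry's $176 billion estimate: each year of added assumed life defers a slice of expense into later periods without changing the cash already spent.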