scaling

X @The Block
The Block· 2025-07-20 20:23
Vitalik Buterin touts 'safe' scaling as Ethereum gas limit ticks up https://t.co/HPL8OqRj0k ...
X @TechCrunch
TechCrunch· 2025-07-20 18:05
Former Tesla president discloses the secret to scaling a company | TechCrunch https://t.co/3HnhFAllt0 ...
Yongying Fund's Li Wenbin: Low valuations and hard tech are reshaping the value of Chinese assets
Shanghai Securities News· 2025-07-20 15:54
By reporter Wang Peng. Early this year, DeepSeek's sudden emergence triggered a strong rally in the A-share artificial intelligence sector. Looking at the present moment, Li Wenbin said that AI development has different priorities at different stages. Artificial intelligence is not a new topic but the direction of technological progress: ever since the electronic computer appeared, humanity has sought to substitute machines for human labor to raise productivity, cut production costs, and improve well-being. "Since the second half of 2022, with the major breakthrough in large language models, we have taken a big step toward an era of 'machine intelligence,' but we are still far from a true 'sapient machine.' We also need to recognize that, judging from the scaling law, brute force is still the foundation of this round of AI breakthroughs. Foundation models at home and abroad are still iterating rapidly, and behind that is the stacking of upstream compute," Li Wenbin said. In his view, although DeepSeek's model can partly reduce compute requirements, it has not opened a gap over overseas competitors; in fact, overseas model iterations over the past three months have delivered extraordinary results. For capital markets, therefore, upstream infrastructure companies will remain standouts. Li Wenbin noted that experimentation on the application side is under way worldwide, with many "gazelle companies" emerging in AI+IT, AI+education, AI+film, AI+government, AI+defense and security, and other fields; quite a few have posted sustained high revenue growth since 2024 ...
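The "scaling law" Li Wenbin invokes is commonly stated as a power law: loss falls smoothly as compute rises, with diminishing but persistent returns. A minimal sketch, with hypothetical constants chosen only to show the shape (not fitted to any real model family):

```python
# Illustrative power-law scaling: loss = l_inf + a * compute**(-alpha).
# All constants are hypothetical, for shape only.

def loss(compute: float, l_inf: float = 1.7, a: float = 8.0, alpha: float = 0.05) -> float:
    """Irreducible loss l_inf plus a reducible term that shrinks with compute."""
    return l_inf + a * compute ** (-alpha)

# Each 100x increase in compute removes a roughly constant fraction of the
# reducible loss -- "brute force works," but with diminishing returns.
for c in (1e18, 1e20, 1e22):
    print(f"compute={c:.0e}  loss={loss(c):.3f}")
```

This is why "stacking upstream compute" keeps paying off even as each increment buys less: the curve flattens but does not plateau at any finite budget.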
X @Polygon
Polygon· 2025-07-18 15:36
RT Chainspect (@chainspect_app) 🚨 @0xPolygon just outdid itself (again). On July 17, it processed 4.4M txns, surpassing its previous 30-day high of 4.32M. With Heimdall v2, Polygon's scaling momentum is turning into consistency. 🧱 Blocks: 74,025,604 to 74,065,602. Track it live → https://t.co/ETGaQfeICJ https://t.co/pbysw8m1gH ...
Computer Industry Biweekly (2025/7/4-2025/7/17): Grok 4 launch confirms the scaling law still holds; Nvidia to resume H20 shipments to China - 20250718
Dongguan Securities· 2025-07-18 14:49
Analysts: Chen Weiguang (SAC S0340520060001, chenweiguang@dgzq.com.cn), Lu Zhixin (SAC S0340524100001, luzhixin@dgzq.com.cn), Luo Weibin (SAC S0340521020001, luoweibin@dgzq.com.cn). Computer industry, July 18, 2025. Data source: iFinD, Dongguan Securities Research Institute. Rating: Overweight (maintained). Key points: sector performance and valuation: the SW computer sector rose a cumulative 4.98% over the past two weeks (2025/7/4-2025/7/17), outperforming the CSI 300 index by 3.31 percentage points; among the 31 SW level-1 ...
Thinking Machines Lab raises $2 billion seed round as talent becomes the AI industry's most important factor
36Kr· 2025-07-17 23:56
Core Insights - Thinking Machines Lab, founded by former OpenAI CTO Mira Murati, has raised $2 billion in seed funding led by a16z, achieving a valuation of $12 billion, marking it as the largest seed funding round in tech history [1][2] - The initial funding target was $1 billion with a valuation of $9 billion, but the final amount increased significantly over a few months [1] - The company currently lacks specific product offerings and revenue, with only a high-profile founding team and vague technological direction publicly available [1] Company Overview - Mira Murati has been with OpenAI since 2016, serving as CTO and leading the development of groundbreaking technologies like GPT-3, GPT-4, DALL-E, and ChatGPT [2] - The founding team includes notable AI experts such as John Schulman, Barret Zoph, Bob McGrew, Alec Radford, Alexander Kirillov, Jonathan Lachman, and Lilian Weng, all of whom have significant contributions to AI advancements [4][5][7][9][12][13][15] Talent Acquisition in AI Industry - The competition for top AI talent has intensified, with companies like Anthropic, Safe Superintelligence, and Thinking Machines Lab emerging as key players, all led by elite AI researchers [17] - The trend indicates that talent is becoming the most critical factor in the AI industry, surpassing computational power and data [17] - Major tech companies are aggressively acquiring talent, as seen in Meta's recruitment efforts, which include significant investments and hiring from various AI firms [18][19][20] Future Product Development - Thinking Machines Lab plans to release its first product within months, focusing on open-source components and AI solutions tailored to business KPIs, referred to as "reinforcement learning for businesses" [16] - The company emphasizes multimodal capabilities and effective safety measures for AI systems, aligning with industry trends towards responsible AI development [16]
X @Starknet 🐺🐱
Starknet 🐺🐱· 2025-07-17 15:01
Starknet is wild. Starknet is scaling. Starknet is home. Hear it from the anons who've been building here for years. https://t.co/LroqoESi9o ...
X @The Block
The Block· 2025-07-17 12:07
Base activates Flashblocks, slashing block times to 200ms in Ethereum scaling race https://t.co/u74E2WMYoX ...
Kai-Fu Lee: The key to the US-China large-model race is the open-source vs. closed-source contest
Gelonghui APP· 2025-07-17 11:06
Core Insights - The future of technology in the next 5 to 10 years will be dominated by generative AI, which is considered a significant leap from ChatBot to Agent [3][4] - The competition between the US and China in AI is not about which company is stronger, but rather a contest between open-source and closed-source approaches [5][16] Investment Opportunities - Nvidia remains a solid investment choice, but investors should look for the right entry points [6][19] - Among the US tech giants, Microsoft is favored due to its willingness to invest boldly and its clear understanding of profitable business models [22] AI Development Trends - The era of AI 2.0, driven by generative AI, is expected to create substantial economic value across various industries [8] - The scaling law for pre-training has reached its limits, while the scaling law for inference is emerging as a new paradigm for model intelligence growth [9][10] - China's open-source model development is catching up to the US, with significant contributions from companies like Alibaba and DeepSeek [13][17] Competitive Landscape - The US has strong payment capabilities from both enterprises and consumers, which China has yet to match [14] - The key competition between the US and China lies in the open-source versus closed-source model, with China currently favoring the open-source route [15][16]
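The "scaling law for inference" named above can be illustrated with a simple best-of-n model (a hedged sketch, not Lee's formulation): if a single sampled answer solves a problem with probability p, then n independent samples succeed with probability 1 - (1-p)^n, so spending more inference-time compute buys accuracy directly.

```python
def pass_at_n(p: float, n: int) -> float:
    """Probability that at least one of n independent samples is correct.

    p -- per-sample success probability (illustrative assumption: i.i.d. samples)
    n -- number of samples drawn at inference time
    """
    return 1.0 - (1.0 - p) ** n

# More inference compute (larger n) raises accuracy even when the base
# model (p) is unchanged -- the essence of inference-time scaling.
for n in (1, 4, 16, 64):
    print(f"n={n:<3d} pass@n={pass_at_n(0.2, n):.3f}")
```

The independence assumption is optimistic for real models, but the qualitative point stands: inference scaling shifts spend from training to test time.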
Tokens drive compute demand: non-linear growth
HTSC· 2025-07-17 10:46
Investment Rating - The report maintains an "Overweight" rating for the technology and computer sectors [6]. Core Insights - The demand for computing power is expected to grow non-linearly due to the rise of Agentic AI, with token usage projected to increase by over 10 times, leading to a corresponding increase in computing power demand by over 100 times [1][90]. - The report highlights three scaling laws: pre-training scaling, post-training scaling, and inference scaling, which collectively indicate that the demand for computing power will continue to grow significantly [10][11]. - The relationship between token consumption and computing power demand is not linear, with a 10-fold increase in token usage potentially resulting in a 100-fold increase in required computing power [60][90]. Summary by Sections Token Demand and Computing Power - Token usage and computing power demand are expected to grow non-linearly, with the complexity of inference processes requiring significantly more computing resources as token usage increases [1][60]. - The report cites Jensen Huang's statement that a 10-fold increase in token volume could lead to a 100-fold increase in computing power requirements due to the complexity of inference processes [1][60]. Scaling Laws - The report discusses three scaling laws: pre-training scaling, post-training scaling, and inference scaling, emphasizing that the market may be underestimating the future demand for computing power due to concerns about the peak of pre-training scaling [10][11]. - Inference scaling is particularly important for improving model performance on difficult problems, which is essential for the development of Agentic AI [15][19]. Agentic AI and Token Consumption - The report identifies Deep Research as a significant driver of token consumption, with estimates suggesting that its token usage could be up to 50 times that of a single chat interaction [3][50].
- The complexity of tasks handled by Agentic AI leads to higher token consumption, with the potential for token usage to exceed 100 times that of traditional chat interactions in more complex scenarios [57][58]. Future Outlook - The report concludes that the future demand for computing power will be driven by the dual factors of increasing token usage and the complexity of inference tasks, indicating a broad space for growth in computing power demand [89][90].
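One simple mechanism that can yield the "10x tokens → 100x compute" relationship cited above: if each generated token attends over all previous tokens, total attention work grows roughly quadratically in sequence length. A hedged back-of-envelope sketch (illustrative counts, not measurements of any real model):

```python
# Toy model of quadratic attention cost: token i attends to the i tokens
# before it, so total pairwise-attention work is sum(i) = O(tokens^2).

def attention_work(tokens: int) -> int:
    """Total pairwise-attention operations over a sequence of `tokens` tokens."""
    return sum(range(tokens))  # 0 + 1 + ... + (tokens - 1)

base = attention_work(1_000)
scaled = attention_work(10_000)
print(f"10x tokens -> {scaled / base:.1f}x attention work")
```

Real inference cost also includes linear terms (MLP layers, KV-cache reads), so the observed exponent sits between 1 and 2; the sketch only shows why the relationship is super-linear rather than proportional.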