GMI Cloud: Going Global Is the Best Path for AI Enterprises to Release Capacity and Find New Life | WISE 2025
36Kr · 2025-12-09 10:38
Core Insights
- The core challenge for AI applications going global is the timeliness, scalability, and stability of model inference services [2][18].

Group 1: Event Overview
- The 36Kr WISE2025 Business King Conference, recognized as an annual technology and business trendsetter, took place in Beijing on November 27-28 [3].
- This year's WISE was not a traditional industry summit but an immersive experience using "tech short dramas" to convey insights [4].

Group 2: AI Application Trends
- GMI Cloud's VP of Engineering, Qian Yujing, presented on the efficiency upgrade of AI applications going global, focusing on breaking through on computing power and evolving the inference architecture [6][11].
- The company, a North American AI Native Cloud service provider and one of NVIDIA's first six Reference Cloud Partners, emphasizes the importance of globalizing both computing power and demand for AI applications [7][8].

Group 3: Market Dynamics
- The AI application market is experiencing exponential growth, with a significant increase in monthly active users for Chinese AI applications overseas [15].
- Over 90% of knowledge workers in the U.S. are now proficient in using AI tools, indicating strong adoption of AI [15].
- Demand for AI services in regions such as the Middle East and Latin America has reached a high level, suggesting that user education in overseas markets is largely complete [16].

Group 4: Challenges in AI Globalization
- Key challenges in AI globalization include timely delivery of services, scalability, and stability, particularly given the rapid pace of technological iteration in AI [18][20].
- Keeping up with that pace of technological advancement poses a significant challenge for enterprises [21].

Group 5: GMI Cloud's Solutions
- GMI Cloud is investing $500 million, in collaboration with NVIDIA, to build a 3-million-card AI factory in Asia [14].
- The company has developed three product lines: computing hardware, cluster management, and inference services, catering to various AI enterprise needs [14].
- The Cluster Engine and Inference Engine address different customer segments, with the former focused on complex applications and the latter on lightweight, end-user applications [25][29].

Group 6: Inference Engine Features
- The Inference Engine supports global deployment and automatic scaling across clusters and regions, addressing the challenges companies face when traffic peaks [30][31].
- It features a three-layer architecture for resource scheduling, with two main scheduling methods: queue-based and load-balancing-based (a generic sketch of these two modes follows this summary) [31].
- The core features of the Inference Engine include global deployment, elastic scaling, high availability, and unified workload management [33][35][36].

Group 7: Future Outlook
- By 2026, the paradigm of AI globalization is expected to shift from one-way technology output to a global value resonance, emphasizing a two-way empowering ecosystem [43].
- The transformation will involve a new cycle of value creation in which computing power, technology, demand, and applications interact globally [43].
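To make the distinction between the two scheduling modes mentioned in Group 6 concrete, here is a minimal, generic Python sketch of queue-based dispatch versus load-balancing dispatch across regional clusters. It is not GMI Cloud's Inference Engine API or implementation; the class names, regions, and capacity figures are illustrative assumptions only.

```python
# Generic sketch of two inference-request scheduling modes:
# (1) queue-based dispatch and (2) load-balancing dispatch across regions.
# NOT GMI Cloud's implementation; all names/regions/capacities are made up.
from collections import deque
from dataclasses import dataclass, field


@dataclass
class Cluster:
    region: str
    capacity: int          # max requests served concurrently (illustrative)
    active: int = 0        # requests currently being served

    @property
    def load(self) -> float:
        return self.active / self.capacity


@dataclass
class QueueScheduler:
    """Queue-based: requests wait in FIFO order until some cluster has a free slot."""
    clusters: list
    queue: deque = field(default_factory=deque)

    def submit(self, request_id: str) -> None:
        self.queue.append(request_id)
        self._drain()

    def _drain(self) -> None:
        while self.queue:
            free = next((c for c in self.clusters if c.active < c.capacity), None)
            if free is None:
                return                      # all clusters busy; requests keep waiting
            request_id = self.queue.popleft()
            free.active += 1
            print(f"[queue] {request_id} -> {free.region}")


@dataclass
class LoadBalancingScheduler:
    """Load-balancing: each request goes immediately to the least-loaded cluster."""
    clusters: list

    def submit(self, request_id: str) -> None:
        target = min(self.clusters, key=lambda c: c.load)
        target.active += 1
        print(f"[balance] {request_id} -> {target.region} (load={target.load:.0%})")


if __name__ == "__main__":
    # Queue mode: a single one-slot cluster, so later requests stay queued.
    qs = QueueScheduler(clusters=[Cluster("us-west", capacity=1)])
    for i in range(3):
        qs.submit(f"batch-{i}")

    # Load-balancing mode: each request is routed to the least-loaded region.
    regions = [Cluster("us-west", capacity=2), Cluster("ap-southeast", capacity=3)]
    lb = LoadBalancingScheduler(clusters=regions)
    for i in range(4):
        lb.submit(f"req-{i}")
```

For simplicity the sketch never releases slots; a real scheduler would also handle request completion, cross-region failover, and autoscaling, which is where the summary's "elastic scaling" and "high availability" features would come in.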
On November 27-28, the 36Kr WISE2025 Business King Conference, hailed as the "annual bellwether of technology and business", was held at the 传导空间 (Chuandao Space) in Beijing's 798 Art District.

This year's WISE was no longer a traditional industry summit, but an immersive experience built around "tech short dramas".

From AI reshaping the boundaries of hardware to embodied intelligence knocking on the door of the real world; from brand globalization amid the going-global wave to traditional industries fitted with "cyber prosthetics": what we present is not only the trends, but also the hard-won insights honed through countless rounds of business practice.

In the content that follows, we will unpack, frame by frame, the real logic behind these "dramas" and take in together the singular business landscape of 2025.

At the conference, GMI Cloud's VP of Engineering, Qian Yujing, delivered a speech titled "Efficiency Upgrade for AI Applications Going Global: Breaking Through on Computing Power and Evolving the Inference Architecture".

GMI Cloud is a North American AI Native Cloud service provider and one of NVIDIA's first six Reference Cloud Partners.

Qian Yujing believes that, for users around the world, the diversification of AI applications has reached a state of being "armed to the teeth", and going global has become the best way for Chinese companies to release capacity and find new life.

At present, China's AI globalization is undergoing a paradigm shift: from one-way technology output in the past to a transformation built around the globalization of computing power, demand, and value. Behind this lies a hidden ...