Cloud Native
T3 Mobility Migrates Its Entire Business to Tencent Cloud, Setting the Industry's Largest-Scale Migration Record
Xin Lang Cai Jing· 2026-01-05 12:33
Core Insights
- T3 Mobility has successfully migrated all of its operations to Tencent Cloud, marking the largest and most complex cloud migration in the mobility industry to date [1][3]
- The migration involved over ten core business domains, thousands of microservices, and hundreds of algorithm models, with total data storage reaching the petabyte level [3][5]
- The migration was completed in under 2 hours, with the core switchover taking less than 5 minutes, ensuring a seamless transition for users [5][1]
Cost Optimization and Performance Improvement
- Post-migration, T3 Mobility reduced its computing expenses by 30% [5][1]
- The system passed multiple stress tests simulating increased traffic, improving fault-diagnosis efficiency and ensuring stable business operations [5][1]
Technical Collaboration and Future Prospects
- The collaboration between T3 Mobility and Tencent Cloud serves as a reusable benchmark for cloud migration in the mobility industry and the broader internet sector [7]
- Future work will focus on applying cloud-native and AI large-model technologies in the mobility sector to deliver safer, smarter travel services [7]
ZTE's Cui Li: AI Applications Reach the Industry's Deep Waters as the Value Loop Nears Completion
21 Shi Ji Jing Ji Bao Dao· 2025-12-31 23:07
Core Insights
- The rapid development of AI large models is becoming a key factor in the new round of technological competition; the number of foundational large models is expected to converge to a single-digit figure, while numerous specialized models and applications emerge across industries [1]
- Physical AI is highlighted as a significant area of focus, accelerating advances in embodied intelligence and autonomous driving that are expected to profoundly change how society operates [1]
- The transition to the "Agent era" presents challenges in integrating AI technology into the real economy, particularly around legal, compliance, and ethical considerations [1]
Physical AI Debate
- The emergence of Sora in early 2025 has sparked discussion of "world models" and the competition between the two core routes of physical AI: world models and VLA (vision-language-action) models [2]
- Sora's development signals AI's evolution from a "predictor" to a "simulator," a paradigm shift necessary for applications such as autonomous driving and embodied intelligence [2]
- Current models like Sora are criticized as mere "visual simulators" that lack true physical-world modeling capabilities and often fail to maintain physical logic [2][3]
Model Differentiation
- The world-model route has diverged into "generative" and "representational" camps: generative models like Sora rely on empirical learning from vast sensory data, while representational models emphasize rational deduction through structured internal representations [3]
- Generative models are suited to data factories and simulation training, whereas representational models excel in decision-making processes [3]
Industry Trends
- There is a trend toward integrating VLA and world models, using VLA for high-level strategy planning and world models for low-level action validation [4]
- Network architecture is evolving from "cloud-native" to "AI-native," requiring networks to deliver extreme performance and seamless integration of computing and networking [5][6]
AI Native Applications
- AI applications are transitioning from content generation to autonomous action, with a focus on restructuring entire value chains rather than merely improving efficiency in isolated processes [7]
- The challenges of deploying agents in critical industries such as telecommunications and finance include reconciling the randomness of models with deterministic business needs and ensuring stability in long-running tasks [8]
Deep Water Practices
- Industries likely to achieve AI value realization at scale include education, healthcare, software development, intelligent manufacturing, and urban governance, all characterized by highly structured data and rapid feedback mechanisms [9][11]
- The transition from "shallow water" to "deep water" signifies AI's deeper integration into core business processes, where it faces complexities such as multimodal data and new security threats [12]
Hybrid Approaches
- AI integration may follow a hybrid path combining "general foundational models + industry fine-tuning" with industry-specific small models built from scratch [12][13]
- General models trained on human language can introduce noise in industrial applications, making specialized models necessary for non-natural-language data [13]
Building a Hyperscaler Engineered for AI with AI Workload Diversity
DDN· 2025-12-22 23:03
Company Overview
- Nscale is a vertically integrated AI stack provider, offering end-to-end solutions from infrastructure to cloud [1]
- The company customizes data centers for customers, optimizing for specific workloads in a hyperscaler-like approach for private clouds [2][3]
- Nscale is building Europe's largest supercomputer cluster with Microsoft, comprising approximately 23,000 nodes [4]
Technology and Services
- Nscale supports diverse AI workloads including model training, fine-tuning, and inference across models of varying parameter sizes [5]
- The company embraces Kubernetes and SLURM for orchestration, providing managed services and bare metal as a service [9][10]
- Nscale offers an OpenAI-API-compatible interface for scaling and deploying open-source or proprietary models, along with fine-tuning services (a minimal client sketch follows this summary) [12]
- The platform supports both Nvidia and AMD GPUs, catering to different customer requirements [13]
Future Directions
- Nscale aims to provide a global fleet management solution, integrating on-premise and public/private cloud environments for a consistent customer experience [14]
- The company plans to further diversify its AI services, focusing on open-source systems and enterprise features such as fine-grained access controls [15]
- Nscale supports the open-source community through Hugging Face, acting as an inference provider [16]
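OpenAI-compatible interfaces like the one described above can generally be reached with standard client libraries by overriding the base URL. The sketch below is a minimal, hypothetical example using the openai Python client; the endpoint URL, model name, and API key are placeholders, not Nscale's actual values.

```python
# Minimal sketch: calling an OpenAI-compatible inference endpoint.
# The base_url, model name, and key are placeholders, not Nscale's real values.
from openai import OpenAI

client = OpenAI(
    base_url="https://inference.example-provider.com/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",                                # placeholder credential
)

# Chat completion against an open-source model hosted behind the compatible API.
response = client.chat.completions.create(
    model="llama-3.1-8b-instruct",  # example model name; the actual catalog may differ
    messages=[{"role": "user", "content": "Summarize what an OpenAI-compatible API is."}],
    max_tokens=128,
)

print(response.choices[0].message.content)
```

Because the interface follows the OpenAI wire format, the same client code can be pointed at different providers simply by changing the base URL.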
IBM’s $11B Deal Sends Confluent (CFLT) Into the Spotlight as Analysts Stay Bullish
Yahoo Finance· 2025-12-11 12:48
Core Viewpoint
- Confluent, Inc. is currently a prominent AI stock on Wall Street, with IBM announcing its acquisition for $11 billion, paying $31 per share for all outstanding common shares [1][2]
Group 1: Acquisition Details
- IBM's acquisition of Confluent is valued at $11 billion, with a cash offer of $31 per share for all issued and outstanding common shares [1]
- Bernstein has reiterated an "Outperform" rating on Confluent with a price target of $31.00 following the acquisition announcement [1]
Group 2: Market Position and Technology
- Confluent has long been seen as an acquisition target due to its relatively low multiple and strong market position [2]
- The company's technology, built around the open-source data streaming projects Kafka and Flink, gained further attention after OpenAI committed to using it for its next-generation model infrastructure (a minimal producer sketch follows this summary) [3]
- Confluent's technology is crucial for generative AI workloads, which are more effectively delivered through cloud-native technology, reinforcing its relevance in the tech landscape [4]
Group 3: Correlation with Major Platforms
- Confluent enjoys a strong correlation with AWS, strengthening its position within the technology sector [4]
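For context on the streaming technology at the center of the deal, the snippet below is a generic, minimal Kafka producer using the confluent-kafka Python client. The broker address and topic name are placeholders, and nothing here is specific to Confluent Cloud or to OpenAI's deployment.

```python
# Minimal sketch: producing events to a Kafka topic with the confluent-kafka client.
# Broker address and topic are placeholders; a managed cluster would also need
# authentication and TLS settings.
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker

def delivery_report(err, msg):
    # Called once per message to confirm delivery or surface an error.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")

# Send a small JSON-like event; real pipelines would serialize via a schema registry.
producer.produce("example-events",
                 value=b'{"user": 42, "action": "click"}',
                 callback=delivery_report)
producer.flush()  # block until all queued messages are delivered
```

Downstream consumers, or a Flink job, would then read from the same topic to process events as a continuous stream.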
SUSE Announces New Cloud Native Features Compatible with Amazon Linux
Globenewswire· 2025-12-01 14:00
Core Insights
- SUSE has announced a collaboration with Amazon Web Services (AWS) to enhance the cloud-native Linux experience for Amazon Linux, providing thousands of additional enterprise-grade open-source packages through the Supplementary Packages for Amazon Linux (SPAL) service [1][2]
Expanded Software Ecosystem and Innovation
- The collaboration allows Amazon Linux 2023 (AL2023) to access vetted open-source packages from the Extra Packages for Enterprise Linux (EPEL) repository, tailored for enterprise needs, broadening functionality and customization for users [2]
- SUSE's expertise in repackaging and securing these components enables customers to focus on innovation rather than package maintenance [2]
Accelerated Time-to-Value and Cost Efficiency
- The partnership brings SUSE's expertise in maintaining and securing open-source packages to AWS customers, lowering total cost of ownership (TCO) and improving operational agility for complex enterprise software deployments [3]
Compliance and Risk Management
- Customers developing products on Amazon Linux 2023 can utilize SUSE's enterprise Linux capabilities, which is particularly beneficial for clients in regulated markets, simplifying compliance and reducing risk [4]
Backblaze to Sponsor KubeCon + CloudNativeCon to Showcase High-Throughput Cloud Agnostic Storage
Businesswire· 2025-10-28 10:01
Core Insights
- Backblaze is committed to the open cloud movement and will participate as a Silver sponsor at KubeCon + CloudNativeCon North America in Atlanta from November 10-13 [1][2]
- The conference is expected to attract over 9,000 attendees and focuses on cloud-native developments, particularly in AI applications, allowing Backblaze to showcase its high-throughput cloud object storage [2][4]
Company Highlights
- Backblaze provides a modern alternative to traditional cloud providers, emphasizing fast, affordable, and independent cloud storage solutions [3]
- The company serves over 500,000 customers across 175 countries, offering high-performance, secure cloud object storage for various applications including AI workflows and media management [4]
Industry Context
- Kubernetes is utilized by 70% of enterprise companies operating in the cloud, highlighting the significance of cloud-native technologies in the current market [2]
- The conference will feature a dedicated learning track on AI and machine learning advancements in cloud-native settings, indicating a growing trend toward integrating AI with cloud infrastructure [4]
Agilysys (AGYS) - 2026 Q2 - Earnings Call Transcript
2025-10-27 21:30
Financial Data and Key Metrics Changes
- Fiscal 2026 Q2 revenue reached a record $79.3 million, a 16.1% increase from $68.3 million in the prior year [16][27][30]
- Year-to-date revenue for FY 2026 is $156 million, up 18.4% compared to the prior year [17][28]
- Recurring revenue grew 23% year-over-year to a record $51 million, driven mainly by a 33.1% increase in subscription revenue [17][30]
- Adjusted net income for Q2 was $11.7 million, compared to $4.1 million in the prior year [31]
Business Line Data and Key Metrics Changes
- Subscription sales were up 59%, with foodservice management (FSM) sales increasing more than 2.5 times [10][17]
- Point of Sale (POS) products, including add-on modules, were up 23%, while Property Management Systems (PMS) products increased by 34% [10][17]
- International sales grew by 36%, with notable wins in the gaming and casino sectors [10][12]
Market Data and Key Metrics Changes
- International sales had another strong quarter, growing by more than 35% over the prior year [12]
- The company added 18 new customers in Q2, all under subscription-based sales agreements [15]
- The total addressable market is significantly larger than the company's current size, indicating substantial growth potential [12][36]
Company Strategy and Development Direction
- The company is focused on creating a comprehensive ecosystem of cloud-native hospitality software solutions, which has provided competitive advantages [7][9]
- AI is increasingly being integrated into the product offerings, enhancing innovation and operational efficiency [24][26]
- The company aims to maintain disciplined, profitable growth while expanding its presence in the hospitality technology market [36]
Management's Comments on Operating Environment and Future Outlook
- Management expressed optimism about the current business momentum and the ongoing surge in subscription software sales [34]
- The company is experiencing increased interest from major hospitality corporations, indicating a positive shift in market perception [36]
- Full-year revenue guidance has been raised to $315 million to $318 million, with subscription revenue growth expectations increased to 29% [21][33]
Other Important Information
- The product backlog improved significantly, ending Q2 49% higher than Q1, providing better visibility for the rest of the fiscal year [19][28]
- The company is debt-free after paying down its credit revolver by $24 million in the first half of the fiscal year [32]
Q&A Session Summary
Question: What has changed regarding record sales momentum?
- Management attributed the sales momentum to improvements in the product ecosystem and increased senior talent, rather than just market adoption of cloud solutions [39][40]
Question: Will the Marriott rollout impact margins?
- Management expects the Marriott rollout to be margin accretive over time, with potential short-term investments [42][43]
Question: How does international performance compare across regions?
- Management noted strong performance in EMEA, particularly in the UK, and emphasized that product improvements are driving international growth [46][48]
Question: What is the current capacity for service delivery?
- Management confirmed that capacity improvements in services have been completed, allowing for better handling of backlog [51][53]
Question: Are larger hospitality players taking notice of the company?
- Management confirmed increased attention from larger players, attributing this to product improvements and enhanced sales capabilities [68][70]
Question: Did the guidance increase relate to the Marriott rollout?
- Management clarified that the guidance increase was not related to the Marriott project but rather to general sales momentum [85][86]
How Can O&M Professionals Break Through in 2025? 3 Transition Skills + 1 Practical Certification to Stop Taking the Blame and Start Earning More
Sou Hu Cai Jing· 2025-10-12 22:35
Core Insights
- Demand for traditional operations and maintenance (O&M) positions decreased by 18% in 2024, while salaries for O&M roles with cross-domain skills increased by 25% [1]
- By 2025, O&M roles limited to basic tasks such as inspection and log checking are likely to be replaced by automation tools, making skill selection more important than sheer effort [1][3]
Industry Trends
- O&M has evolved from merely monitoring servers to managing cloud-native architectures, requiring a shift from physical-machine management to container and microservices management [3]
- The automation rate for basic monitoring and troubleshooting tasks has exceeded 70%, eroding the value of O&M roles that only execute tasks [3]
Skills Development
- Three essential skills for O&M professionals to thrive in 2025:
  1. Cloud native and container orchestration: over 90% of enterprises are expected to host their business on the cloud, making knowledge of Docker image management highly sought after across industries [6]
  2. Data operations and analysis: O&M professionals must learn to predict issues by analyzing system logs and business data, which supports resource optimization and decision-making [7]
  3. AI operations practice: proficiency in AI tools lets O&M professionals automatically flag anomalies and cut troubleshooting time by more than 60% (a minimal log-analysis sketch follows this summary) [7]
Transformation Steps
- The transformation process for O&M professionals involves three steps:
  1. Self-assessment: identifying skill gaps and defining career transition goals [11]
  2. Phased learning: prioritizing cloud-native and data operations before moving on to AI operations, and applying new skills in real work scenarios [11]
  3. Gaining cross-industry experience: seeking opportunities to join cross-industry projects or engaging with case studies through CAIE certification communities to strengthen resumes [11]
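As a rough illustration of the AI-assisted log analysis described in the skills list above, the sketch below flags time windows whose error count spikes well above the recent baseline. It uses only the standard library and stands in for far more capable AIOps tooling; the window size and threshold are arbitrary choices.

```python
# Minimal sketch: flag log windows whose error count spikes above the recent baseline.
# A stand-in for real AIOps tooling; window size and threshold are arbitrary.
from statistics import mean, pstdev

def error_spikes(log_lines, window_size=100, threshold_sigma=2.5):
    """Group lines into fixed-size windows and flag windows with anomalous error counts."""
    windows = [log_lines[i:i + window_size] for i in range(0, len(log_lines), window_size)]
    error_counts = [sum(1 for line in w if "ERROR" in line) for w in windows]
    if len(error_counts) < 2:
        return []
    baseline, spread = mean(error_counts), pstdev(error_counts)
    # A window is anomalous if its error count exceeds the baseline by
    # more than threshold_sigma standard deviations.
    return [i for i, count in enumerate(error_counts)
            if spread > 0 and (count - baseline) / spread > threshold_sigma]

# Example usage with synthetic lines: the last window contains a burst of errors.
lines = ["INFO ok"] * 900 + ["ERROR timeout"] * 80 + ["INFO ok"] * 20
print(error_spikes(lines, window_size=100))  # -> [9]
```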
Meet DDN Infinia: The Platform for End-to-End AI
DDN· 2025-09-18 19:04
Infinia Platform Overview
- Infinia is a software-defined, metadata-driven, containerized, cloud-native data intelligence platform designed for scalability, performance, and efficiency across core, cloud, and edge environments [1]
- The platform supports critical data protocols such as object and block, and integrates with AI frameworks such as TensorFlow and PyTorch [1]
- Infinia enhances AI execution engines by serving data in its native form, reducing the need for data conversion and speeding up applications [1]
Metadata and Multi-Tenancy Capabilities
- Infinia allows massive amounts of metadata to be tagged onto objects, enabling faster data discovery and processing, with no limits on metadata capability (a minimal object-tagging sketch follows this summary) [1]
- The platform has built-in multi-tenancy, providing SLAs for individual tenants and sub-tenants on capacity and performance to ensure quality of service [1]
Scalability and Cloud Native Design
- Infinia is fully containerized, allowing scale-out at web scale, from a few terabytes up to exabytes [1]
- The product is designed to be cloud-native and will soon be available in leading cloud provider marketplaces [2]
AI Data Challenges and Solutions
- Infinia addresses the complexity of managing large amounts of distributed multimodal data across core, cloud, and edge environments by providing a unified platform [1]
- It tackles the extremely low latency demanded by AI applications, as well as the high costs associated with running AI [1]
- The platform ensures data protection at any time and at any scale [1]
Performance Metrics
- Infinia can deliver time to first byte in less than a microsecond [2]
- It can deliver 30 to 40 million objects per second in list-object operations [2]
- Infinia can deliver terabytes-per-second throughput at large scale [2]
Efficiency and Sustainability
- Infinia can achieve 10x data reduction, fitting over 100 petabytes of storage into a single rack [2]
- It can reduce the overall data center footprint by a quarter compared to competitors, saving 10x in power and cooling costs [2]
Security Features
- Infinia focuses on security, authentication, and access control, preventing unauthorized data access [2]
- Data is always encrypted, and all actions within the system are audited [2]
- The platform provides 99.9999% uptime enabled by reliability-focused features [2]
Key Business Outcomes
- Infinia aims to reduce complexity and achieve more accurate results on a unified platform for AI inference, data analytics, and model training [2]
- It accelerates innovation by running AI apps faster, enabling businesses to beat the competition [2]
- The platform enables rapid deployment across cloud, core, and edge to increase productivity, boost efficiency, and maximize ROI [2]
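The object-protocol and metadata-tagging capabilities described above map onto the kind of workflow sketched below. This assumes a generic S3-compatible object endpoint rather than any Infinia-specific API (which is not detailed here); the endpoint, bucket, and credentials are placeholders.

```python
# Minimal sketch: writing an object with descriptive metadata to an S3-compatible
# endpoint, then reading that metadata back without downloading the object body.
# Endpoint, bucket, and credentials are placeholders; S3 compatibility is assumed,
# not confirmed, for the platform discussed above.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://object-store.example.com",  # hypothetical endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Tag the object with metadata that downstream AI pipelines can use for discovery.
s3.put_object(
    Bucket="training-data",
    Key="images/cat_0001.jpg",
    Body=b"...binary image bytes...",
    Metadata={"label": "cat", "split": "train", "source": "camera-07"},
)

# Retrieve the metadata later via a HEAD request, leaving the object body untouched.
head = s3.head_object(Bucket="training-data", Key="images/cat_0001.jpg")
print(head["Metadata"])  # {'label': 'cat', 'split': 'train', 'source': 'camera-07'}
```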
Cloud-Native Engineer (Updates Included)
Sou Hu Cai Jing· 2025-08-19 14:22
Group 1
- The core viewpoint emphasizes that cloud-native and microservices architecture are central to enterprise technology upgrades amid digital transformation [2][3]
- The emergence of the "Mago Cloud Native Microservices Governance Sprint Class" reflects a deep transformation in technical education and reveals a new interaction logic among technology, the economy, and the talent market [2]
- Traditional IT training is undergoing a paradigm shift, focusing on reshaping architectural thinking rather than merely drilling skills [2]
Group 2
- The course integrates tools such as Kubernetes and Istio with microservices governance methodologies, enabling students to move beyond merely writing code to mastering distributed systems [2]
- The curriculum addresses industry pain points: governance capabilities such as service discovery and circuit breaking become essential as companies migrate from monolithic to microservices architectures (a minimal circuit-breaker sketch follows this summary) [3]
- In economic terms, the cloud computing industry has surpassed one trillion yuan in scale, and programmers skilled in cloud-native technologies enjoy a 40% salary premium [3]
Group 3
- The course's continuous update mechanism reflects the necessity of lifelong learning for technical professionals, especially in the face of AI's impact on traditional programming roles [3]
- The shift to cloud-native skills has allowed professionals to move from passive roles into core architecture teams, improving their job security [3]
- The capability upgrade driven by high-quality courses signifies a reconstruction of production relationships in the digital age, contributing to a more efficient and resilient technology ecosystem [3]
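As a rough illustration of the circuit-breaking pattern named among the governance capabilities above, the sketch below shows a minimal in-process circuit breaker. In practice this is usually configured in a service mesh such as Istio rather than hand-rolled; the names and thresholds here are illustrative only.

```python
# Minimal sketch of the circuit-breaker pattern: after repeated failures the breaker
# "opens" and rejects calls immediately, then permits a trial call after a cool-down.
# Thresholds are arbitrary; production systems usually delegate this to a service mesh.
import time

class CircuitBreaker:
    def __init__(self, failure_threshold=3, reset_timeout=30.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed (calls allowed)

    def call(self, func, *args, **kwargs):
        # If open, fail fast until the cool-down has elapsed, then allow one trial call.
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: permit one trial call
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold:
                self.opened_at = time.monotonic()  # trip the breaker
            raise
        self.failures = 0  # a success resets the failure count
        return result

# Example usage: wrap a flaky downstream call.
breaker = CircuitBreaker(failure_threshold=2, reset_timeout=5.0)

def flaky():
    raise ConnectionError("upstream timeout")

for _ in range(3):
    try:
        breaker.call(flaky)
    except Exception as exc:
        print(type(exc).__name__, exc)
# Prints two ConnectionErrors, then "RuntimeError circuit open: failing fast".
```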