Alibaba Releases 7 Large Models in One Go — This May Be the Most Underrated AI "Killer Feature"
Sohu Caijing · 2025-09-24 13:27

Core Insights
- Alibaba launched its most advanced multimodal visual generation model, Tongyi Wanxiang Wan2.5-preview, at the 2025 Hangzhou Yunqi Conference; the model features audio-visual video generation capabilities [2][26]
- The model supports multiple input forms, including text, images, and audio, aiming to lower usage barriers across fields such as digital humans, film creation, and remote education [5][26]
- The Tongyi Wanxiang model family has generated 3.9 billion images and 70 million videos, making it one of the most widely used visual generation models in China [26][27]

Model Features
- Wan2.5-preview enhances video generation, extending video length from 5 seconds to 10 seconds and supporting 1080P HD output at 24 frames per second [7][8]
- It offers improved control over video generation, following complex instructions such as camera movements and character transformations [8][10]
- The model uses a native multimodal architecture that can understand and generate across modalities, strengthening cross-modal reasoning and generation capabilities [10][26]

User Experience
- Users can try the model through the Alibaba Cloud Bailian platform or the official Tongyi Wanxiang website [6]
- From simple text prompts, the model can generate videos in which character lip movements and background sounds are accurately matched [4][11]
- Examples demonstrate the model's ability to create realistic scenarios, such as a wedding proposal or a gift unwrapping, with precise audio-visual synchronization [14][16]

Market Position
- Tongyi Wanxiang is positioned as a leading player in AI video generation, comparable to OpenAI's Sora, and the upcoming release of its world model is expected to further influence industry trends [26][27]
- The model family has been open-sourced, with over 20 models released since February and more than 30 million downloads, indicating strong community engagement and interest [27]