Ex-Meta star team of post-95 Chinese founders partners with Qualcomm after just one year, giving phones multimodal memory

Core Insights
- Memories.ai, founded by Shawn Shen, has launched LVMM 2.0, a Large Visual Memory Model, and announced a partnership with Qualcomm to run natively on Qualcomm processors by 2026 [2][9]
- The company focuses on developing AI's visual memory capabilities, having completed an $8 million seed round in July 2025, led by Susa Ventures with participation from notable investors including Samsung Next and Fusion Fund [2]

Company Background
- Founder Shawn Shen holds a PhD in Engineering from Trinity College, Cambridge, and previously worked as a core research scientist at Meta Reality Labs, focusing on human-computer interaction and augmented reality [3]
- Co-founder Ben Zhou also worked at Meta Reality Labs, on AI assistants for Meta's Ray-Ban glasses [3]
- Eddy Wu has been appointed Chief AI Officer, bringing five years of experience at Meta, where he worked on GenAI research [3]

Product Development
- LVMM 2.0 was released three months after the first generation; it maintains performance while cutting parameter count by 90%, making it better suited to edge devices [6]
- The model converts raw video into structured memory on-device, addressing video searchability by encoding and compressing frames into an index that supports millisecond-level retrieval [7][8]

Technical Advantages
- Running natively on Qualcomm processors significantly reduces latency, lowers cloud costs, and keeps data on the device for improved security [8]
- The model integrates video, audio, and images to provide contextual results, ensuring a consistent experience across devices such as smartphones and cameras [8]

Applications
- Practical applications of LVMM 2.0 include enhancing AI capabilities in smart glasses, security systems, and robots, enabling real-time understanding and response [11]
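
The encode-compress-index pipeline described under Product Development can be illustrated with a toy sketch. Everything here is hypothetical and greatly simplified: the `embed` function stands in for a learned visual encoder, frames are tiny pixel lists, and retrieval is a linear cosine-similarity scan rather than Memories.ai's actual index structure.

```python
import math

def embed(frame):
    # Hypothetical stand-in for a learned visual encoder:
    # here, just a normalized histogram of pixel values.
    total = sum(frame) or 1
    return [v / total for v in frame]

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

class VideoMemoryIndex:
    """Toy on-device index: store one compact embedding per frame,
    then retrieve the timestamp of the best-matching frame."""

    def __init__(self):
        self.entries = []  # list of (timestamp, embedding)

    def add_frame(self, timestamp, frame):
        # "Compress" the raw frame down to its embedding before storing.
        self.entries.append((timestamp, embed(frame)))

    def query(self, frame):
        # Return the timestamp whose stored embedding is most similar.
        q = embed(frame)
        return max(self.entries, key=lambda e: cosine(q, e[1]))[0]

# Usage: index three tiny "frames", then look one up by content.
idx = VideoMemoryIndex()
idx.add_frame(0.0, [1, 2, 3, 4])
idx.add_frame(1.0, [9, 1, 1, 1])
idx.add_frame(2.0, [2, 2, 2, 2])
print(idx.query([8, 1, 1, 2]))  # → 1.0 (closest to the second frame)
```

A production system would replace the linear scan with an approximate-nearest-neighbor structure to reach the millisecond-scale retrieval the article describes, but the store-embeddings-not-frames idea is the same.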