Inside Meta's AI chip lab
Meta Platforms (US:META) · YouTube · 2026-03-11 14:35

Core Insights

- Meta is developing its in-house AI chip program, the Meta Training and Inference Accelerator (MTIA), to create efficient architectures for internal workloads [1]
- The company plans to release four new generations of chips over the next two years, targeting applications from ranking and recommendations to large-scale generative AI inference [2]
- The MTIA 300 chip is already in production, supporting training for ranking and recommendations, while the MTIA 400 is moving toward deployment for broader AI workloads [2][3]
- Future chip versions, the MTIA 450 and 500, are set to enhance generative AI inference capabilities, with deployments anticipated in 2027 [3]
- Meta is accelerating its chip design process to keep pace with rapidly evolving AI models, aiming to improve performance, cost, and power efficiency [4]
- The company is securing major supply deals with leading chip manufacturers such as Nvidia and AMD to ensure substantial AI computing capacity, while also developing custom silicon for its unique workloads [4]
