Model Architecture & Performance
- The next version of FSD (version 14) is expected to have roughly 10x the number of parameters of version 13, implying a larger, more expressive model [1]
- More parameters (weights and biases) give the model more "decision" points and allow more nuanced behavior [2]
- A bigger model generally performs better but requires more compute for both training and inference [2]
- The decision to scale the model by 10x suggests confidence that Tesla's HW4 can run it, i.e., that its inference performance is expected to be sufficient [3]

Technical Implications
- Neural networks consist of layers of nodes (neurons); each connection between nodes carries a numerical value called a weight [1]
- Each node also has a bias, which shifts the node's output independently of its inputs [1]
- A network's parameter count is the total number of weights and biases (see the sketch after this list) [2]
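As a minimal sketch of how parameters are counted, the Python snippet below tallies weights and biases for a plain fully connected network from its layer sizes. The layer sizes and the fp16 memory estimate are illustrative assumptions only, not Tesla's actual FSD architecture or hardware figures.

```python
# Sketch: counting parameters (weights + biases) of a fully connected network.
# Layer sizes are hypothetical, chosen only to illustrate the arithmetic.

def count_parameters(layer_sizes):
    """Return (weights, biases, total) for a plain feed-forward network."""
    weights = 0
    biases = 0
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        weights += n_in * n_out  # one weight per connection between layers
        biases += n_out          # one bias per node in the receiving layer
    return weights, biases, weights + biases

if __name__ == "__main__":
    # Hypothetical small network: 256 inputs, two hidden layers of 512, 10 outputs.
    w, b, total = count_parameters([256, 512, 512, 10])
    print(f"weights={w:,} biases={b:,} total={total:,}")

    # Rough intuition for the 10x claim: 10x the parameters means roughly 10x
    # the memory just to store the weights (e.g., 2 bytes per parameter in fp16),
    # and correspondingly more compute per inference pass.
    print(f"approx fp16 weight memory: {2 * total / 1e6:.2f} MB")
```

This is why a 10x parameter increase is a meaningful hardware commitment: both the memory footprint and the per-frame compute scale with the parameter count, which is the basis for the point about HW4 inference headroom above.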
Source: Elon Musk on X, 2025-07-25 09:52