Why GPT-4.5 Failed
Matthew Berman· 2025-07-03 16:04
What went wrong with GPT-4.5? It was a bet on scale: we're just going to take all the data, we're going to make this ridiculously big model, and we're going to train it. It is much smarter than 4o and 4.1, to be completely clear. I've said it's the first model to make me laugh, because it's actually funny. But in general, it's not that useful, and it's too slow and too expensive. But you have this issue called overparameterization. If you build a neural network and you feed it some data, it w ...
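The overparameterization he's pointing at can be illustrated with a small sketch (mine, not from the video): when a model has as many free parameters as training examples, it can memorize the training data, noise included, instead of learning the underlying trend. Here a degree-7 polynomial is fit to 8 noisy points drawn from a simple line, alongside a right-sized linear fit for comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: a simple linear relationship.
def f(x):
    return 2.0 * x + 1.0

# 8 noisy training points, plus clean held-out test points.
x_train = np.linspace(0.0, 1.0, 8)
y_train = f(x_train) + rng.normal(0.0, 0.3, size=x_train.shape)
x_test = np.linspace(0.05, 0.95, 50)
y_test = f(x_test)

# Overparameterized model: a degree-7 polynomial has 8 coefficients,
# as many parameters as data points, so it can interpolate the noise.
over = np.polynomial.Polynomial.fit(x_train, y_train, deg=7)

# Right-sized model: degree 1 matches the true linear trend.
simple = np.polynomial.Polynomial.fit(x_train, y_train, deg=1)

def mse(model, x, y):
    return float(np.mean((model(x) - y) ** 2))

print("overparam train MSE:", mse(over, x_train, y_train))  # near zero
print("overparam test MSE: ", mse(over, x_test, y_test))
print("simple test MSE:    ", mse(simple, x_test, y_test))
```

The overparameterized fit drives training error to essentially zero while its test error stays much higher, which is the memorize-the-noise behavior at issue. (This is only an analogy for the failure mode; modern large models interact with overparameterization in more complicated ways.)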