LLM context length
X @Avi Chawla
Avi Chawla· 2025-08-23 19:32
LLM Context Length Growth
- GPT-3.5-turbo has a context length of 4k tokens [1]
- OpenAI GPT4 has a context length of 8k tokens [1]
- Claude 2 has a context length of 100k tokens [1]
- Llama 3 has a context length of 128k tokens [1]
- Gemini reaches a context length of 1M tokens [1]
X @Avi Chawla
Avi Chawla· 2025-08-23 06:30
That's a wrap! If you found it insightful, reshare it with your network.

Find me → @_avichawla

Every day, I share tutorials and insights on DS, ML, LLMs, and RAGs.

Avi Chawla (@_avichawla):
The growth of LLM context length with time:
- GPT-3.5-turbo → 4k tokens
- OpenAI GPT4 → 8k tokens
- Claude 2 → 100k tokens
- Llama 3 → 128k tokens
- Gemini → 1M tokens
Let's understand how they extend the context length of LLMs: ...
X @Avi Chawla
Avi Chawla· 2025-08-23 06:30
LLM Context Length Growth
- The industry has witnessed a significant expansion in LLM context length over time [1]
- GPT-3.5-turbo initially supported 4k tokens [1]
- OpenAI GPT4 extended the limit to 8k tokens [1]
- Claude 2 further increased the context length to 100k tokens [1]
- Llama 3 achieved a context length of 128k tokens [1]
- Gemini reached an impressive 1M tokens [1]
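The thread teases *how* context length gets extended. One widely used technique (not described in these posts themselves, so treat this as an illustrative sketch) is RoPE position interpolation: rotary position embeddings encode each position as a set of rotation angles, and rescaling positions by `trained_length / target_length` squeezes a longer context into the angle range the model actually saw during training. The function and parameter names below are my own, not from the thread.

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0, scale=1.0):
    """Rotary position embedding angles for the given positions.

    scale < 1 interpolates positions, so a model trained on short
    contexts can address longer ones without out-of-range angles.
    """
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    return np.outer(positions * scale, inv_freq)

# Suppose a model was trained with a 4k context and we want 8k.
train_len, target_len = 4096, 8192
scale = train_len / target_len  # 0.5: map 8k positions into the trained range

orig = rope_angles(np.arange(train_len), dim=64)
interp = rope_angles(np.arange(target_len), dim=64, scale=scale)

# The interpolated angles for the last position (8191) equal the trained
# angles for fractional position 4095.5 - still inside the trained range.
assert np.allclose(interp[-1], rope_angles(np.array([4095.5]), dim=64)[0])
```

The trade-off is resolution: neighboring positions become harder to distinguish after interpolation, which is why extended models are typically fine-tuned briefly at the new length.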