X @Avi Chawla
Avi Chawla · 2025-12-07 19:14
Model Training & Context Expansion
- Fine-tuning on longer documents with a 128K context window is an insufficient answer in a Research Scientist interview at OpenAI [1]
- The question asks how to expand the context length of an LLM from 2K to 128K tokens [1]
X @Avi Chawla
Avi Chawla · 2025-12-07 06:42
You're in a Research Scientist interview at OpenAI. The interviewer asks: "How would you expand the context length of an LLM from 2K to 128K tokens?"

You: "I will fine-tune the model on longer docs with 128K context."

Interview over. Here's what you missed: ...
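The thread is truncated here, but the core gap that fine-tuning alone cannot close is the positional encoding: a model pretrained with 2K positions has never seen the rotation angles that positions beyond 2K would produce, so long inputs are out of distribution no matter how many long documents you train on. One published remedy is RoPE position interpolation (Chen et al., 2023), which compresses the new position indices back into the pretrained range. The sketch below is a minimal illustration under that assumption; the `rope_angles` helper, the head dimension, and the context sizes are hypothetical choices for the example, not the author's code.

```python
import torch

def rope_angles(head_dim: int, positions: torch.Tensor,
                base: float = 10000.0) -> torch.Tensor:
    """Rotation angles used by rotary position embeddings (RoPE)."""
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    return torch.outer(positions, inv_freq)  # shape: (seq_len, head_dim // 2)

train_ctx, target_ctx = 2048, 131072   # 2K pretraining -> 128K target
scale = train_ctx / target_ctx         # position-interpolation factor

# Naive extension: positions 2048..131071 yield rotation angles the model
# never produced during pretraining, so attention degrades at long range
# however long you fine-tune on 128K documents.
naive = rope_angles(128, torch.arange(target_ctx).float())

# Position interpolation: compress the indices so all 128K positions map
# inside the angle range covered by 2K-context pretraining; a short
# fine-tune then adapts the model to the denser position grid.
interpolated = rope_angles(128, torch.arange(target_ctx).float() * scale)

# Every compressed index stays within the pretrained position range.
assert (torch.arange(target_ctx).float() * scale).max() < train_ctx
```

Fine-tuning still happens after the interpolation step, but only to adapt the model to the denser position grid, which is why the quoted one-line answer is incomplete rather than wrong.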