X @Avi Chawla
Avi Chawla (@_avichawla) · 2025-12-07 06:42
You're in a Research Scientist interview at OpenAI.

The interviewer asks: "How would you expand the context length of an LLM from 2K to 128K tokens?"

You: "I will fine-tune the model on longer docs with 128K context."

Interview over.

Here's what you missed: ...
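The thread's actual answer is truncated above ("..."), so what follows is background context, not a transcription of it. A common objection to the naive answer is that positional encodings such as RoPE were only ever trained on positions up to 2K, so directly fine-tuning at 128K asks the model to extrapolate to rotation angles it has never seen. One widely cited remedy is position interpolation: rescale positions so that the 128K range maps back inside the trained 2K range. A minimal sketch (all names and parameters here are illustrative, not taken from the thread):

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0, scale=1.0):
    """Rotation angles for rotary position embeddings (RoPE).
    scale < 1.0 implements position interpolation: positions are
    squeezed into the range the model was trained on, rather than
    extrapolating past it."""
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    return np.outer(np.asarray(positions) * scale, inv_freq)

train_len, target_len = 2048, 131072   # trained at 2K, targeting 128K
scale = train_len / target_len          # 1/64

# Angles at the last target position, with and without interpolation.
plain = rope_angles([target_len - 1], dim=64)
interp = rope_angles([target_len - 1], dim=64, scale=scale)

print(plain.max())   # far beyond any angle seen during 2K training
print(interp.max())  # squeezed back inside the trained range (< 2048)
```

In practice this rescaling is paired with a short fine-tune at the interpolated positions (and refinements such as NTK-aware scaling or YaRN adjust the scaling per frequency band), but mapping new positions into the trained range is the core idea.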