X @Avi Chawla
Avi Chawla· 2025-06-14 06:30
MoEs have more parameters to load, but only a fraction of them is activated during inference, so inference is faster than it would be for a dense model of the same total size. Mixtral 8x7B by @MistralAI and Llama 4 are two popular MoE-based LLMs. Here's the visual again for your reference 👇 https://t.co/NRbNi1Bjyz ...
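To make the sparse activation concrete, here is a minimal, illustrative PyTorch sketch of a top-k routed MoE feed-forward block, using 8 experts with top-2 routing as in Mixtral's setup; the dimensions and the routing loop are simplified assumptions, not Mixtral's actual implementation:

```python
# Minimal sketch of a sparsely activated MoE feed-forward layer (PyTorch).
# Illustrative only: 8 experts, top-2 routing, toy dimensions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFeedForward(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)   # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                              # x: (tokens, d_model)
        logits = self.router(x)                        # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1) # keep only the top-k experts per token
        weights = F.softmax(weights, dim=-1)           # normalize the selected gate scores
        out = torch.zeros_like(x)
        for k in range(self.top_k):                    # only top_k of n_experts run per token
            for e in range(len(self.experts)):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * self.experts[e](x[mask])
        return out

tokens = torch.randn(4, 512)
print(MoEFeedForward()(tokens).shape)  # torch.Size([4, 512])
```

Each token passes through only 2 of the 8 expert MLPs, so the per-token compute is much smaller than the total parameter count suggests.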
X @Avi Chawla
Avi Chawla· 2025-06-14 06:30
Transformer vs. Mixture of Experts in LLMs, clearly explained (with visuals): ...
X @Avi Chawla
Avi Chawla· 2025-06-13 06:30
If you found it insightful, reshare it with your network.
Find me → @_avichawla
Every day, I share tutorials and insights on DS, ML, LLMs, and RAGs.
Avi Chawla (@_avichawla):
Containerized versions of 450+ MCP servers in a single repo!
- No manual setup—just pull the image.
- Safe to run in isolated containers, unlike scripts.
- Auto-updated daily.
Easiest and safest way to use MCP servers with Agents. https://t.co/0dQAn8WBd7 ...
X @Avi Chawla
Avi Chawla· 2025-06-13 06:30
GitHub repo → https://t.co/Bsl7RYllmn ...
X @Avi Chawla
Avi Chawla· 2025-06-13 06:30
Containerized versions of 450+ MCP servers in a single repo!
- No manual setup—just pull the image.
- Safe to run in isolated containers, unlike scripts.
- Auto-updated daily.
Easiest and safest way to use MCP servers with Agents. https://t.co/0dQAn8WBd7 ...
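As a hedged illustration of how an agent could use one of these containerized servers, the sketch below launches a server with `docker run` over stdio and lists its tools via the official MCP Python SDK (`pip install mcp`); the image name is a placeholder, not one taken from the linked repo:

```python
# Hedged sketch: talk to a containerized MCP server over stdio using the
# official MCP Python SDK. The image name below is a placeholder.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="docker",
    args=["run", "-i", "--rm", "example/mcp-server:latest"],  # placeholder image
)

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()           # MCP handshake
            tools = await session.list_tools()   # discover the server's tools
            print([t.name for t in tools.tools])

asyncio.run(main())
```

Because the server runs inside the container, it only touches whatever volumes or network access you explicitly grant it, which is what makes this safer than running arbitrary server scripts on the host.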
X @Avi Chawla
Avi Chawla· 2025-06-12 19:35
A new ensemble technique that outperforms XGBoost, CatBoost, and LightGBM.
Find the details in the explainer thread below: https://t.co/cgyg5Y4U2U
Avi Chawla (@_avichawla):
ML researchers just built a new ensemble technique.
It even outperforms XGBoost, CatBoost, and LightGBM.
Here's a complete breakdown (explained visually): ...
X @Avi Chawla
Avi Chawla· 2025-06-12 06:30
If you found it insightful, reshare it with your network.
Find me → @_avichawla
Every day, I share tutorials and insights on DS, ML, LLMs, and RAGs.
Avi Chawla (@_avichawla):
ML researchers just built a new ensemble technique.
It even outperforms XGBoost, CatBoost, and LightGBM.
Here's a complete breakdown (explained visually): ...
X @Avi Chawla
Avi Chawla· 2025-06-12 06:30
Here's the visual again for your reference.
To recap, instead of training 32 (or K) separate MLPs, TabM uses one shared model and a lightweight adapter layer.
Check this visual 👇 https://t.co/lF0yc2UjBb ...
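For context, TabM's parameter sharing builds on a BatchEnsemble-style linear layer: one weight matrix shared by all K implicit members plus cheap per-member element-wise adapters. Below is a minimal PyTorch sketch of that idea; the initialization and layer details are simplified assumptions, not the authors' implementation:

```python
# Hedged sketch of a BatchEnsemble-style layer (the mechanism TabM builds on):
# one shared weight matrix, plus per-member element-wise adapters r, s and bias b.
import torch
import torch.nn as nn

class EnsembleLinear(nn.Module):
    def __init__(self, d_in, d_out, k=32):
        super().__init__()
        self.shared = nn.Linear(d_in, d_out, bias=False)  # weights shared by all k members
        self.r = nn.Parameter(torch.ones(k, d_in))        # per-member input scaling
        self.s = nn.Parameter(torch.ones(k, d_out))       # per-member output scaling
        self.b = nn.Parameter(torch.zeros(k, d_out))      # per-member bias

    def forward(self, x):                                  # x: (batch, k, d_in)
        return self.shared(x * self.r) * self.s + self.b   # (batch, k, d_out)

k = 32
layer = EnsembleLinear(d_in=16, d_out=8, k=k)
x = torch.randn(64, 1, 16).expand(-1, k, -1)  # same input fed to all k implicit members
print(layer(x).shape)                          # torch.Size([64, 32, 8])
```

This is why training K = 32 members costs roughly one MLP's worth of weights plus K small adapter vectors, rather than 32 full MLPs.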
X @Avi Chawla
Avi Chawla· 2025-06-12 06:30
ML researchers just built a new ensemble technique.
It even outperforms XGBoost, CatBoost, and LightGBM.
Here's a complete breakdown (explained visually): ...