Avi Chawla
X @Avi Chawla
Avi Chawla· 2025-06-16 06:30
If you found it insightful, reshare it with your network.
Find me → @_avichawla
Every day, I share tutorials and insights on DS, ML, LLMs, and RAGs.
Quoting Avi Chawla (@_avichawla): the RAGFlow post below.
X @Avi Chawla
Avi Chawla· 2025-06-16 06:30
GitHub repo: https://t.co/LtSlnL8yxe ...
X @Avi Chawla
Avi Chawla· 2025-06-16 06:30
A RAG engine for deep document understanding!
RAGFlow lets you build enterprise-grade RAG workflows on complex docs with well-founded citations.
Supports multimodal data understanding, web search, deep research, etc.
100% local & open-source with 55k+ stars! https://t.co/skOHLpd38e ...
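The point of "well-founded citations" is that every answer stays traceable to a specific passage in the source documents. Below is a minimal, library-free sketch of that retrieve-then-cite pattern; it is a conceptual illustration only, not RAGFlow's API, and the `Passage`, `embed`, and `answer_with_citations` names are hypothetical.

```python
# Conceptual sketch of retrieval with citations (NOT RAGFlow's API).
# `Passage`, `embed`, and `answer_with_citations` are hypothetical names
# used only to illustrate the retrieve-then-cite pattern.
from collections import Counter
from dataclasses import dataclass
import math

@dataclass
class Passage:
    doc_id: str      # which document the chunk came from
    chunk_id: int    # position of the chunk inside that document
    text: str

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real system uses a neural encoder.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def answer_with_citations(query: str, passages: list[Passage], k: int = 2):
    # Rank passages by similarity to the query and keep the top-k.
    q = embed(query)
    ranked = sorted(passages, key=lambda p: cosine(q, embed(p.text)), reverse=True)[:k]
    # An LLM would synthesize the answer from `ranked`; here we just
    # return the supporting chunks together with their citation keys.
    context = "\n".join(f"[{p.doc_id}#{p.chunk_id}] {p.text}" for p in ranked)
    citations = [(p.doc_id, p.chunk_id) for p in ranked]
    return context, citations

if __name__ == "__main__":
    corpus = [
        Passage("report.pdf", 0, "Revenue grew 12% year over year."),
        Passage("report.pdf", 1, "Operating costs were flat in Q4."),
        Passage("memo.docx", 0, "The team shipped the new retrieval engine."),
    ]
    context, cites = answer_with_citations("How much did revenue grow?", corpus)
    print(context)
    print("citations:", cites)
```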
X @Avi Chawla
Avi Chawla· 2025-06-15 06:47
GitHub repo: https://t.co/niBNHA7boe ...
X @Avi Chawla
Avi Chawla· 2025-06-15 06:43
Product Release
- Microsoft open-sourced a no-code data analysis tool [1]
- Data Formulator provides AI-powered data analysis [1]
- Data Formulator features a drag-and-drop UI for visualization tasks [1]
- The tool extends beyond the initial dataset by creating relevant fields and visualizations [1] (a rough sketch of the derived-field idea follows below)
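"Creating relevant fields beyond the initial dataset" essentially means deriving new columns from existing ones before visualizing them, which Data Formulator's drag-and-drop UI does for you. The pandas snippet below is a hand-rolled illustration of that idea only; the dataset and column names are invented, and this is not Data Formulator's API.

```python
# Hand-rolled illustration of "derived fields" for visualization.
# The dataset and column names are invented for this example.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["NA", "NA", "EU", "EU"],
    "year":    [2023, 2024, 2023, 2024],
    "revenue": [120.0, 150.0, 90.0, 110.0],
})

# A derived field that was not in the original dataset: year-over-year growth.
sales = sales.sort_values(["region", "year"])
sales["yoy_growth_pct"] = sales.groupby("region")["revenue"].pct_change() * 100

# A charting layer would now plot yoy_growth_pct per region; here we just print it.
print(sales.dropna(subset=["yoy_growth_pct"]))
```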
X @Avi Chawla
Avi Chawla· 2025-06-14 20:03
Model Architecture
- Explains Transformer vs Mixture of Experts (MoE) in LLMs with visuals [1]
- Focuses on clearly explaining Mixture of Experts in LLMs [1]
X @Avi Chawla
Avi Chawla· 2025-06-14 06:30
LLM Techniques
- Comparative analysis of Transformer vs Mixture of Experts (MoE) in LLMs [1]
- Industry attention on tutorials and insights covering DS (data science), ML (machine learning), LLMs (large language models), and RAGs (retrieval-augmented generation) [1]

Social Media Engagement
- Encourages users to reshare the information [1]
- Industry expert Avi Chawla shares related content on social media [1]
X @Avi Chawla
Avi Chawla· 2025-06-14 06:30
Model Architecture
- Mixture of Experts (MoE) models activate only a fraction of their parameters during inference, leading to faster inference [1] (see the routing sketch after this list)
- Mixtral 8x7B by MistralAI is a popular MoE-based Large Language Model (LLM) [1]
- Llama 4 is another popular MoE-based LLM [1]
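Sparse activation is what delivers the speed-up: a learned router scores every expert for each token and only the top-k experts actually run. The sketch below is a minimal PyTorch illustration of that routing pattern, not the implementation used by Mixtral or Llama 4; the layer sizes and the `TinyMoE` name are invented for the example.

```python
# Minimal top-k Mixture-of-Experts layer (illustrative only; production MoE LLMs
# add load-balancing losses, capacity limits, and fused expert kernels).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)   # scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                              # x: (tokens, d_model)
        logits = self.router(x)                        # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1) # keep only the top-k experts
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e               # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

tokens = torch.randn(5, 64)       # 5 tokens, d_model=64
print(TinyMoE()(tokens).shape)    # torch.Size([5, 64]); only 2 of 8 experts ran per token
```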
X @Avi Chawla
Avi Chawla· 2025-06-14 06:30
LLM Architectures
- The report compares Transformer and Mixture of Experts (MoE) architectures in Large Language Models (LLMs) [1]
- The report provides clear explanations and visuals to illustrate the differences between the two architectures [1]

Focus
- The report focuses on explaining Transformer and MoE architectures in LLMs [1]
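One concrete way to see the difference the comparison draws: a dense Transformer FFN uses all of its feed-forward parameters for every token, while an MoE layer stores many expert FFNs but only runs the routed top-k. The arithmetic below uses representative, made-up layer sizes purely for illustration.

```python
# Back-of-the-envelope: total vs per-token-active parameters in one FFN block.
# Dimensions are made up for illustration, not taken from any specific model.
d_model, d_ff = 4096, 14336
n_experts, top_k = 8, 2

ffn_params = 2 * d_model * d_ff            # up-projection + down-projection weights

dense_total = dense_active = ffn_params    # dense Transformer: everything runs
moe_total = n_experts * ffn_params         # MoE: 8 expert copies are stored...
moe_active = top_k * ffn_params            # ...but only the top-2 run per token

print(f"dense : {dense_total/1e6:.0f}M total, {dense_active/1e6:.0f}M active per token")
print(f"MoE   : {moe_total/1e6:.0f}M total, {moe_active/1e6:.0f}M active per token")
```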
X @Avi Chawla
Avi Chawla· 2025-06-13 19:11
RT Avi Chawla (@_avichawla):
Containerized versions of 450+ MCP servers in a single repo!
- No manual setup, just pull the image.
- Safe to run in isolated containers, unlike scripts.
- Auto-updated daily.
Easiest and safest way to use MCP servers with Agents. https://t.co/0dQAn8WBd7 ...
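"Pull the image and run it in isolation" typically means the agent client spawns the server as a container and talks to it over stdio (JSON-RPC) instead of executing a script on the host. A rough sketch of that launch step is below; the `example/mcp-server` image name is hypothetical, and a real MCP client would carry out the full initialize handshake over the pipes.

```python
# Rough sketch: launching a containerized MCP server over stdio.
# The image name is hypothetical; a real MCP client (or agent framework)
# would speak JSON-RPC over proc.stdin / proc.stdout after this launch.
import subprocess

def launch_mcp_container(image: str) -> subprocess.Popen:
    # -i keeps stdin open for the stdio transport; --rm cleans up on exit,
    # and the container boundary is what isolates the server from the host.
    return subprocess.Popen(
        ["docker", "run", "-i", "--rm", image],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
    )

if __name__ == "__main__":
    proc = launch_mcp_container("example/mcp-server")  # hypothetical image
    # ... JSON-RPC initialize/request/response traffic would happen here ...
    proc.terminate()
```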