LlamaIndex
LlamaSheets | AI Parsing and Extraction for Spreadsheets
LlamaIndex· 2025-11-25 04:07
So today we're launching a new API in Llama Cloud that we're calling LlamaSheets. This is a way to take your spreadsheets, which have tables and data all over the place, and turn them into structured parquet files that are ready to be processed by your AI agents. And you can think of a parquet file just like a dataframe: it's a structured format that AI agents and coding agents can readily interact with. Already today, Llama Cloud is a leading platform for building document agents. We are used at a ton ...
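Since the video frames a parquet file as "just like a dataframe", a minimal sketch of what that means for an agent, using pandas (the data here is invented for illustration; it is not actual LlamaSheets output):

```python
import pandas as pd

# Hypothetical example: one extracted table from a spreadsheet, shaped the way
# LlamaSheets might emit it. A real parquet file loads straight into a dataframe:
#   df = pd.read_parquet("extracted_table.parquet")   # needs pyarrow or fastparquet
# For illustration we construct the frame directly.
df = pd.DataFrame({"sku": ["A1", "B2"], "qty": [3, 5], "price": [9.99, 4.50]})

# Once it's a dataframe, an agent can filter, join, and aggregate it like any data.
total = (df["qty"] * df["price"]).sum()
```

The point of the format is exactly this: no cell-by-cell scraping, just ordinary dataframe operations.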
From Documents to Insights Integrating LlamaParse with MongoDB for Scalable AI Pipelines
LlamaIndex· 2025-10-31 01:43
Discover how to build a scalable, real-time document processing pipeline that transforms PDFs, reports, and contracts into searchable, enriched data. Learn how LlamaParse powers intelligent parsing and chunking, while MongoDB enables flexible storage, indexing, and vector search. ...
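A rough sketch of the pipeline shape described above. The chunker below is a naive stand-in for LlamaParse's smarter parsing and chunking, and the MongoDB portion is shown only as commented-out pseudocode (collection names and document shape are assumptions, not the actual integration):

```python
from typing import List

def chunk_text(text: str, size: int = 500, overlap: int = 50) -> List[str]:
    """Naive fixed-size chunker with overlap, standing in for LlamaParse output."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

# Storage side (sketch only; requires a running MongoDB and pymongo):
# from pymongo import MongoClient
# coll = MongoClient()["docs"]["chunks"]
# coll.insert_many([{"text": c, "source": "report.pdf"} for c in chunk_text(raw)])
# An Atlas vector-search index over an embedding field would then serve retrieval.
```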
Vibe Coding and Deploying LlamaIndex Agent Workflows with Claude Code
LlamaIndex· 2025-10-29 14:11
Hi there, Clelia from LlamaIndex here, and today we're going to vibe code a LlamaIndex workflow using Claude Code, CLAUDE.md, and Claude skills. And once we've coded the LlamaIndex workflow, we're going to package it as a LlamaAgent and deploy it to the cloud. So let's start by creating a new directory called financial-documents-classification. And once we've created this directory, we're going to initialize it as a uv project. And we're going to remove the main.py file, since we do not need it. And we're going to add ...
Build and deploy LlamaAgents from scratch
LlamaIndex· 2025-10-22 18:19
Hi everyone, Clelia here from LlamaIndex, and today I'm going to walk you through building a LlamaAgent from the ground up. So first of all, what is a LlamaAgent? A LlamaAgent is basically our brand new product, currently in private alpha, that lets you build and deploy workflows both as a backend and as a full-stack service on our cloud platform, Llama Cloud. More on this later, when we actually see a LlamaAgent in action. So first of all, let's understand what we want to build here. So we w ...
AI in 60s: How Dhruv Shah from Carlyle builds and ships AI products for investment research teams
LlamaIndex· 2025-10-16 05:54
AI Development Focus
- Carlyle Group focuses on investment research agents that scan analyst notes and company data [1]
- The company also develops AI assistants that answer ad-hoc queries over structured and unstructured data [1]
- The company emphasizes the importance of measuring and scoping AI agents for effectiveness [1]

AI in Investment
- AI streamlines routine workflows for investment and support teams, boosting efficiency [2]
- AI frees up people's time to focus on higher-value analysis [2]
- Teams should pick a framework that keeps pace with fast-moving AI and lets them focus on business logic [2]
Orchestrating Microservices Using LlamaIndex Workflows
LlamaIndex· 2025-10-13 17:01
Architecture and Workflow
- LlamaIndex introduces a microservices architecture demo for e-commerce, breaking a monolith into containerized services for specific jobs: front end, authentication, payments, orders, and stock management [1][2]
- The demo uses Docker Compose for easy setup: clone the repository and run `docker compose up -d` [3]
- The application comprises the front end, authentication, payment processing, orders, stock management, PostgreSQL for the database, Kafka for event-driven communication, and Zookeeper for Kafka management [3]
- Workflows handle payments, orders, and stock management, triggered by events in Kafka, starting with data processing and syncing [3]
- Each workflow extracts structured data (order, payment, stock, item) from the raw JSON and builds a query to update the database [3]
- The workflow ends by reporting the operation status (success or failure) and sending details back to Kafka [4]

Data Flow and Communication
- The front end sends orders as JSON to a Kafka topic, which triggers the workflows for payment, order creation, and stock update [4]
- Each service (payments, orders, stock) subscribes to the orders topic and publishes its operation status to a separate partition of the Kafka topic [4]
- Payments write to partition zero, orders to partition one, and stock to partition two [5]
- Operation statuses are streamed back to the front end, indicating the success or failure of each operation [5]

Demonstration and Potential Issues
- The demo can be run locally by registering a user and placing an order, which triggers the pipeline described above [7][8]
- Failures in updating the order, placing the order, payment, or updating the stock may occur due to disturbances in communication [10]
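The "extract structured data from raw JSON and build a query" step can be sketched as a small function (field names, message shape, and the SQL are assumptions for illustration; the demo's actual schema is not shown):

```python
import json

def build_stock_update(raw: str) -> tuple:
    """Parse a raw order message from Kafka and build a parameterized stock update."""
    data = json.loads(raw)
    order = {
        "order_id": data["order"]["id"],
        "item": data["order"]["item"],
        "qty": int(data["order"]["qty"]),
        "payment": data["payment"]["method"],
    }
    # Parameterized query: values are bound by the DB driver, never interpolated.
    query = "UPDATE stock SET quantity = quantity - %(qty)s WHERE item = %(item)s"
    return order, query

msg = '{"order": {"id": 7, "item": "mug", "qty": 2}, "payment": {"method": "card"}}'
order, query = build_stock_update(msg)
```

In the demo the result of running such a query would then be published as a success/failure status to the service's Kafka partition.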
Create and Deploy a Custom Extraction Agent
LlamaIndex· 2025-10-09 16:14
Key Functionality & Customization
- LlamaIndex provides a customizable extraction review template for specific data like receipts and invoices [1]
- The template lets users focus on schema development rather than building scaffolding [2]
- Users can update front-end schemas to reflect changes in the data extraction process [3]
- The system flags fields with low confidence, enabling efficient review and validation [4]
- Users can deploy custom extraction agents with validation on Llama Cloud [6]

Deployment & Data Handling
- The deployed app can have a distinct production dataset compared to the local environment [5]
- Users can re-upload data to improve extraction results and achieve valid states [4][6]

Application & Validation
- The system is designed for messy and hard-to-read documents with multiple line items and taxes [1]
- The custom extraction agent includes validation to ensure accuracy [6]
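The low-confidence flagging described above can be sketched as a tiny helper (the threshold, field names, and per-field confidence shape are hypothetical, not the template's actual data model):

```python
def flag_low_confidence(fields: dict, threshold: float = 0.8) -> list:
    """Return the names of extracted fields whose confidence falls below threshold.

    `fields` maps field name -> (extracted value, confidence in [0, 1]).
    """
    return [name for name, (_value, conf) in fields.items() if conf < threshold]

# Hypothetical extraction result for a smudged receipt:
extracted = {
    "vendor": ("ACME Corp", 0.97),
    "total": (42.10, 0.55),   # hard to read on the scan -> low confidence
    "tax": (3.10, 0.91),
}
needs_review = flag_low_confidence(extracted)
```

A review UI would then surface only `needs_review` fields to the human validator, which is what makes the review loop efficient.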
Create and Deploy a Custom Extraction Agent
LlamaIndex· 2025-10-09 15:57
Product Overview
- LlamaIndex introduces a customizable extraction review template for specific data types, focusing on receipts and invoices [1]
- The template aims to streamline extraction from messy and varied line items, including taxes [1]
- The solution includes a validation feature to ensure data accuracy [6]
- The custom extraction agent can be deployed on Llama Cloud [6]

Schema Development & Customization
- Schema development is a key aspect, and the template helps users focus on this task [2]
- The system allows updating front-end schemas and re-uploading data to reflect changes [3][4]
- Users can build a custom UI around receipts data for better context [5]

Deployment & Validation
- The deployed app allows for distinct production datasets [5]
- The system highlights fields with low confidence, enabling efficient review [4]
- Validation ensures the extracted data is accurate [6]
LlamaAgents Open for Private Alpha 🎉
LlamaIndex· 2025-10-01 15:27
Today we're opening LlamaAgents to early access, and in this video I'm going to walk you through how you can get started. By the end of the video, what I want to achieve is an agent that's been deployed to our Llama Cloud project. To start off with that, what we first need to install is llamactl, which you can install with uv. I already have it. And once you have that installed, you can say llamactl init, which will spin up a CLI that allows us to decide what kind of llama ag ...
LlamaAgents Open for Private Alpha 🎉
LlamaIndex· 2025-10-01 04:11
Product Overview
- The Agents product on Llama Cloud is designed to help customers quickly build, deploy, and scale document workflows [1]
- It aims to accelerate the prototype-to-production transition and supports multiple deployment types [1]
- Several agent templates are provided so users can quickly get started with a specific agent [2]

Deployment and Workflow
- Users can download and initialize a template via Llama Control, then edit and configure it as needed [2][4]
- The code repository can be pushed to GitHub, and the deployment is then created through Llama Control [4][5][6]
- Deployment requires an API key, which users can create in Llama Cloud [6]
- Once deployed, the app is accessible in the Agents interface of Llama Cloud and can run in a staging or production environment [9][10]

Functionality and Access
- The UI can be used to extract and correct information [11]
- A headless deployment option allows users to access workflows via API and build a custom UI [11]