Hanzo Flow
Visual AI workflow builder for LLM applications
Hanzo Flow is a visual AI workflow builder (based on Langflow) for designing, prototyping, and deploying LLM-powered applications. Build complex AI pipelines by connecting components visually — from prompts and chains to RAG systems and multi-agent architectures.
Features
- Visual Builder: Drag-and-drop canvas for composing LLM pipelines
- 100+ Components: LLMs, embeddings, vector stores, tools, agents, chains
- Multi-Model Support: OpenAI, Anthropic, Google, Ollama, HuggingFace, and more
- RAG Pipelines: Build retrieval-augmented generation workflows visually
- Agent Systems: Design multi-agent architectures with tool use
- API Export: Every flow becomes a REST API endpoint automatically
- Multi-Tenant: Isolated workspaces per organization via Hanzo IAM
- Python Extensibility: Add custom components in Python
Architecture
```
┌────────────────────────────────────────────────────────────────────┐
│                             HANZO FLOW                             │
├────────────────────────────────────────────────────────────────────┤
│                                                                    │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ ┌─────────────┐ │
│ │ React UI     │ │ FastAPI      │ │ LangChain    │ │ Component   │ │
│ │ (@xyflow)    │ │ Backend      │ │ Runtime      │ │ Registry    │ │
│ │              │ │              │ │              │ │             │ │
│ │ - Canvas     │ │ - REST API   │ │ - Chains     │ │ - LLMs      │ │
│ │ - Sidebar    │ │ - WebSocket  │ │ - Agents     │ │ - Vectors   │ │
│ │ - Chat       │ │ - Auth       │ │ - RAG        │ │ - Tools     │ │
│ │ - Logs       │ │ - Storage    │ │ - Memory     │ │ - Custom    │ │
│ └──────────────┘ └──────────────┘ └──────────────┘ └─────────────┘ │
│                                                                    │
│ ┌────────────────────────────────────────────────────────────────┐ │
│ │                     Provider Integrations                      │ │
│ │  OpenAI · Anthropic · Google · Ollama · HuggingFace · Cohere   │ │
│ │        Pinecone · Chroma · Weaviate · pgvector · Qdrant        │ │
│ └────────────────────────────────────────────────────────────────┘ │
│                                                                    │
└────────────────────────────────────────────────────────────────────┘
```

Endpoints
| Environment | URL |
|---|---|
| Production | https://flow.hanzo.ai |
| API Base | https://flow.hanzo.ai/api/v1 |
Quick Start
Build an AI Flow
- Navigate to flow.hanzo.ai
- Sign in with your Hanzo account
- Click New Flow or start from a template
- Drag components from the sidebar onto the canvas
- Connect components by dragging between ports
- Click Run to test your flow
- Use the API tab to get your flow's REST endpoint
Example: RAG Chatbot
Build a retrieval-augmented chatbot in minutes:
- Add components: File Loader → Text Splitter → Embeddings → Vector Store → Retriever → LLM → Chat Output
- Configure: Set your OpenAI/Anthropic API key, upload documents
- Connect: Wire components together visually
- Test: Use the built-in chat interface
- Deploy: Get your API endpoint and integrate
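Once deployed, the flow can be called from any HTTP client. Below is a minimal Python sketch using only the standard library; the `outputs` nesting in `extract_chat_text` follows a typical Langflow-style response shape, but it is an assumption, so check your flow's API tab for the exact schema (the helper names here are illustrative, not part of any official client):

```python
import json
import urllib.request

API_BASE = "https://flow.hanzo.ai/api/v1"


def build_run_payload(message: str) -> dict:
    # Same fields as the curl example in the API Access section.
    return {"input_value": message, "output_type": "chat", "input_type": "chat"}


def extract_chat_text(response: dict) -> str:
    # Assumed Langflow-style nesting; inspect your flow's real output.
    first = response["outputs"][0]["outputs"][0]
    return first["results"]["message"]["text"]


def run_flow(flow_id: str, token: str, message: str) -> str:
    # POST the chat input to the flow's run endpoint and pull out the reply.
    req = urllib.request.Request(
        f"{API_BASE}/run/{flow_id}",
        data=json.dumps(build_run_payload(message)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return extract_chat_text(json.load(resp))
```

`run_flow` performs the same request as the curl example under API Access; swapping in `requests` or `httpx` is straightforward.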
API Access
```bash
# Run a flow
curl -X POST https://flow.hanzo.ai/api/v1/run/:flow_id \
  -H "Authorization: Bearer $HANZO_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "input_value": "What is Hanzo AI?",
    "output_type": "chat",
    "input_type": "chat"
  }'

# List flows
curl https://flow.hanzo.ai/api/v1/flows \
  -H "Authorization: Bearer $HANZO_TOKEN"

# Get flow details
curl https://flow.hanzo.ai/api/v1/flows/:flow_id \
  -H "Authorization: Bearer $HANZO_TOKEN"
```

Component Categories
| Category | Examples | Description |
|---|---|---|
| LLMs | OpenAI, Anthropic, Ollama, HuggingFace | Language model providers |
| Embeddings | OpenAI, Cohere, HuggingFace, Sentence Transformers | Text embedding models |
| Vector Stores | Pinecone, Chroma, Weaviate, pgvector, Qdrant | Vector databases |
| Document Loaders | PDF, CSV, Web, GitHub, Notion, Confluence | Data ingestion |
| Text Splitters | Recursive, Character, Token, Semantic | Document chunking |
| Chains | LLM Chain, Sequential, Router, Conversation | LangChain chains |
| Agents | ReAct, OpenAI Functions, Plan-and-Execute | Autonomous agents |
| Tools | Search, Calculator, Python REPL, API Call | Agent tools |
| Memory | Buffer, Summary, Entity, Conversation | Context management |
| Prompts | Template, Few-shot, Chat, System | Prompt engineering |
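As a rough illustration of what the Text Splitter components in the table do, here is a simplified character splitter with overlapping windows. This is a sketch, not Langflow's actual implementation: the real Recursive and Semantic splitters also respect separators such as newlines and sentence boundaries.

```python
def split_text(text: str, chunk_size: int = 512, overlap: int = 64) -> list[str]:
    """Fixed-size chunking with overlap, so context spans chunk boundaries."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    # Each chunk starts `step` characters after the previous one,
    # repeating the last `overlap` characters of the prior chunk.
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]
```

Overlap matters for RAG: a fact split across two chunks stays retrievable because both chunks contain the boundary text.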
Multi-Tenant Configuration
Hanzo Flow supports multi-tenant isolation:
- Per-org workspaces: Each organization sees only their flows
- Shared components: Admins can publish org-wide custom components
- API key isolation: Each org manages their own provider API keys
- Usage tracking: Per-org token usage and cost monitoring
Environment Variables
```bash
# Core
LANGFLOW_HOST=0.0.0.0
LANGFLOW_PORT=7860
LANGFLOW_AUTO_LOGIN=false
LANGFLOW_CONFIG_DIR=/app/langflow

# Database
LANGFLOW_DATABASE_URL=postgresql://hanzo:[email protected]:5432/langflow

# Auth
LANGFLOW_SUPERUSER=admin
LANGFLOW_SUPERUSER_PASSWORD=<from-kms>

# Store
LANGFLOW_STORE=true
LANGFLOW_STORE_ENVIRONMENT_VARIABLES=true
```

Development
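Custom components are plain Python files picked up from the config directory. The sketch below stages one programmatically; the `components/custom` subfolder and the `Component` API inside the source string follow upstream Langflow conventions but are assumptions here, so verify them against the Langflow docs for your version:

```python
from pathlib import Path

# Hypothetical component source. The langflow.custom Component API shown
# in this string mirrors upstream Langflow and may differ between versions.
COMPONENT_SRC = '''\
from langflow.custom import Component
from langflow.io import MessageTextInput, Output
from langflow.schema.message import Message

class ShoutComponent(Component):
    display_name = "Shout"
    description = "Upper-cases the incoming text."
    inputs = [MessageTextInput(name="text", display_name="Text")]
    outputs = [Output(name="shouted", display_name="Shouted", method="shout")]

    def shout(self) -> Message:
        return Message(text=self.text.upper())
'''


def install_component(config_dir: str, name: str, source: str = COMPONENT_SRC) -> Path:
    """Write a component file under <config_dir>/components/custom/.

    Langflow groups drop-in components by the subfolder they live in;
    'custom' is an assumed category name.
    """
    dest = Path(config_dir) / "components" / "custom"
    dest.mkdir(parents=True, exist_ok=True)
    path = dest / f"{name}.py"
    path.write_text(source)
    return path
```

After restarting Langflow with `LANGFLOW_CONFIG_DIR` pointing at that directory, the component should appear in the sidebar.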
```bash
# Install (Python)
pip install langflow

# Or with uv
uv pip install langflow

# Run locally
langflow run --host 0.0.0.0 --port 7860

# Custom components: place .py files in LANGFLOW_CONFIG_DIR/components/
```

Hanzo Auto vs Hanzo Flow
| Feature | Hanzo Auto | Hanzo Flow |
|---|---|---|
| Purpose | Service-to-service automation | AI/LLM pipeline building |
| Upstream | ActivePieces | Langflow |
| Language | TypeScript/Node.js | Python |
| Focus | Triggers, webhooks, integrations | LLM chains, RAG, agents |
| Integrations | 600+ SaaS connectors | 100+ AI/ML components |
| Use Case | "When X happens, do Y" | "Build an AI that does Z" |
Use Auto for connecting business services (Slack + Jira + Salesforce). Use Flow for building AI applications (RAG chatbot, agent pipelines, document processing).
Related
- Hanzo Auto — Workflow automation (ActivePieces fork)
- Hanzo Gateway — LLM API gateway
- Hanzo MCP — Model Context Protocol tools
- Langflow Docs — Upstream documentation