Flowise (LangChain Visual Builder)
Flowise is a drag-and-drop UI for building LangChain-powered RAG pipelines, chatbots, and AI agent flows — no Python required.
Aithroyz pre-configures Flowise with the LLM Gateway as the default AI provider (when deployed together), so model credentials are injected automatically — you can start building flows without touching API keys.
Access
URL:
https://flowise.<env-name>.ops.aithroyz.com
Credentials: Admin username and password shown in Environments detail → Credentials panel.
What's pre-configured
LLM Gateway credential
When LLM Gateway is in the same plan, an OpenAI-compatible credential is pre-seeded pointing to the internal gateway — no key entry needed
Admin credentials auto-set
The Flowise admin username and password are generated at deploy time and stored in the Credentials panel
Persistent storage volume
Chatflows, tools, and uploaded files are stored on a persistent disk so they survive container restarts
Building a chatflow
Flowise chatflows are visual pipelines where each node is a LangChain component. To build a basic Q&A bot:
1. Open Chatflows
Click "Add New" to open the canvas editor.
2. Add a Chat Model node
Drag a ChatOpenAI node onto the canvas. If the LLM Gateway credential is pre-seeded, select it from the dropdown — no base URL changes needed.
3. Add a Conversational Chain
Drag a Conversational Retrieval QA Chain (or a Conversation Chain for simple chat) and connect the Chat Model output to its LLM input.
4. Optionally add a Vector Store
For RAG, add a Qdrant node (URL: http://10.0.0.36:6333) and connect it to the chain's Vector Store input.
5. Save and test
Click Save, then open the chat bubble in the top-right to test inline. The chatflow is now live at its API endpoint.
6. Use the API endpoint
Copy the endpoint from Chatflows → the three-dot menu → "API Endpoint". Integrate it with any frontend using a POST request.
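Step 6 can be scripted from any client. A minimal sketch in Python using only the standard library, assuming Flowise's prediction endpoint format (`/api/v1/prediction/<chatflow-id>`); the `CHATFLOW_ID` value is a placeholder you copy from the "API Endpoint" dialog:

```python
import json
import urllib.request

BASE_URL = "https://flowise.<env-name>.ops.aithroyz.com"  # replace <env-name>
CHATFLOW_ID = "your-chatflow-id"  # placeholder — copy from the API Endpoint dialog

def build_prediction_request(base_url: str, chatflow_id: str, question: str):
    """Return the URL and JSON body for a Flowise prediction call."""
    url = f"{base_url}/api/v1/prediction/{chatflow_id}"
    body = json.dumps({"question": question})
    return url, body

def ask(question: str) -> dict:
    """POST the question to the chatflow and return the parsed response."""
    url, body = build_prediction_request(BASE_URL, CHATFLOW_ID, question)
    req = urllib.request.Request(
        url,
        data=body.encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

If the chatflow has an API key configured, add an `Authorization: Bearer <key>` header to the request.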
Connecting to Qdrant
When Qdrant is deployed in the same plan, use its internal address to avoid going through the public gateway:
Qdrant node settings:
Base URL: http://10.0.0.36:6333
Collection: my-docs # create via Qdrant dashboard first
Embedding: (connect an Embeddings node — use the LLM Gateway credential)
Using the internal IP (10.0.0.36) keeps embedding traffic inside the VPC and avoids TLS overhead. Only use the public subdomain when accessing Qdrant from outside the sandbox.
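Instead of creating the collection through the Qdrant dashboard, you can create it over Qdrant's REST API (`PUT /collections/<name>`). A sketch assuming the internal address above; the vector size of 1536 is an assumption based on OpenAI-style embeddings and must match whatever Embeddings node you wire into the chain:

```python
import json
import urllib.request

QDRANT_URL = "http://10.0.0.36:6333"  # internal address from this plan

def collection_config(dim: int = 1536, distance: str = "Cosine") -> str:
    """JSON body for Qdrant's create-collection call. `dim` must match the
    embedding model used in the Flowise chain (1536 is an assumption)."""
    return json.dumps({"vectors": {"size": dim, "distance": distance}})

def create_collection(name: str) -> dict:
    """Create a collection via Qdrant's REST API (PUT /collections/<name>)."""
    req = urllib.request.Request(
        f"{QDRANT_URL}/collections/{name}",
        data=collection_config().encode(),
        method="PUT",
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```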
Tips
Export flows as JSON
Use the export button on any chatflow for version control. Re-import with "Add New" → import JSON. Great for sharing flows between environments.
Streaming responses
Append ?streaming=true to the chatflow API endpoint URL to receive Server-Sent Events instead of a single JSON response — useful for chat UIs.
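Consuming the SSE stream amounts to reading the response line by line and extracting the `data:` payloads. A minimal parser sketch — the exact event shape Flowise emits can vary by version, so this only handles the generic SSE framing:

```python
def iter_sse_data(lines):
    """Yield the payload of each `data:` line from a Server-Sent Events
    stream, skipping blank separator lines and other fields."""
    for raw in lines:
        line = raw.strip()
        if line.startswith("data:"):
            yield line[len("data:"):].strip()

# Feeding it an HTTP response would look roughly like:
# with urllib.request.urlopen(req) as resp:
#     for token in iter_sse_data(l.decode() for l in resp):
#         print(token, end="", flush=True)
```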
Tool agents
The OpenAI Function Agent node supports custom tools (HTTP nodes, code nodes). Build autonomous agents that call external APIs or query Qdrant.
Document loaders
Flowise includes loaders for PDF, DOCX, web pages, GitHub repos, and Notion — connect any to a Text Splitter → Embeddings → Qdrant chain.