Web-app-flowise Role¶
Description: Installs Flowise, a visual builder to create, test, and publish AI workflows (RAG, tools, webhooks).
Variables¶
author: Kevin Veen-Birkenbach
description: Installs Flowise — a visual builder to create, test, and publish AI workflows (RAG, tools, webhooks).
license: Infinito.Nexus NonCommercial License
license_url: https://s.infinito.nexus/license
company: Kevin Veen-Birkenbach Consulting & Coaching Solutions (https://www.veen.world)
galaxy_tags: ['ai', 'llm', 'rag', 'workflow', 'orchestration', 'self-hosted', 'qdrant', 'litellm', 'ollama', 'flowise']
repository: https://s.infinito.nexus/code
issue_tracker_url: https://s.infinito.nexus/issues
documentation: https://s.infinito.nexus/code/
logo: {'class': 'fa-solid fa-diagram-project'}
run_after: ['web-app-keycloak', 'web-app-matomo']
README¶
Flowise¶
Description¶
Flowise is a visual builder for AI workflows. Create, test, and publish chains that combine LLMs, your documents, tools, and vector search—without writing glue code.
Overview¶
Users design flows on a drag-and-drop canvas (LLM, RAG, tools, webhooks), test them interactively, and publish endpoints that applications or bots can call. Flowise works well with local backends such as Ollama (directly or via LiteLLM) and Qdrant for retrieval.
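For a quick look at the canvas before wiring up this role, a throwaway local instance can be started with Docker. This is a minimal sketch, assuming the official `flowiseai/flowise` image and its default UI port 3000; the role's managed deployment may differ.

```shell
# Sketch: run a disposable local Flowise instance (assumes the
# official flowiseai/flowise image; default UI port is 3000).
docker run -d --name flowise -p 3000:3000 flowiseai/flowise
# The drag-and-drop canvas is then reachable at http://localhost:3000.
```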
Features¶
No/low-code canvas to build assistants and pipelines
Publish flows as HTTP endpoints for easy integration
Retrieval-augmented generation (RAG) with vector DBs (e.g., Qdrant)
Pluggable model backends via OpenAI-compatible API or direct Ollama
Keep data and prompts on your own infrastructure
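To illustrate the "publish flows as HTTP endpoints" feature above, the sketch below builds a request against Flowise's prediction endpoint (`/api/v1/prediction/<flow-id>`, a JSON body with a `question` field). The host, flow id, and question are hypothetical placeholders; check the "Embed/API" panel of your own flow for the exact URL and any authentication headers your instance requires.

```python
import json
from urllib import request


def build_prediction_request(base_url: str, flow_id: str, question: str):
    """Build the URL and JSON payload for calling a published Flowise flow.

    Assumes the prediction endpoint layout /api/v1/prediction/<flow-id>;
    verify against your instance's "Embed/API" panel.
    """
    url = f"{base_url.rstrip('/')}/api/v1/prediction/{flow_id}"
    payload = json.dumps({"question": question}).encode("utf-8")
    return url, payload


if __name__ == "__main__":
    # Hypothetical host and flow id; replace with values from your instance.
    url, payload = build_prediction_request(
        "http://localhost:3000", "my-flow-id", "Summarize our onboarding docs."
    )
    req = request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    # Uncomment to send the request against a running instance:
    # with request.urlopen(req) as resp:
    #     print(json.load(resp))
```

Because the endpoint speaks plain HTTP with a JSON body, any language or bot framework can call a published flow the same way, no Flowise SDK required.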
Further Resources¶
Flowise — https://flowiseai.com
Qdrant — https://qdrant.tech
LiteLLM — https://www.litellm.ai
Ollama — https://ollama.com