Overview
This section covers how to deploy Knowledge Stack, whether you are running it locally for development or deploying it to production.

Local Development
See the Quickstart guide for getting started locally.

Required Services
Knowledge Stack depends on several infrastructure services. In development, these are started automatically via Docker Compose:

| Service | Default Port | Description |
|---|---|---|
| PostgreSQL (TimescaleDB) | 5432 | Primary database with vector search support |
| Temporal | 7233 | Workflow orchestration for document ingestion and agents |
| Temporal UI | 8080 | Web interface for monitoring workflows |
| MinIO | 9000 / 9001 | S3-compatible object storage for file uploads |
| Nginx | 15173 / 18000 | HTTPS reverse proxy for frontend and API |
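A quick way to confirm these services came up is to probe their default ports. The names and ports below come from the table above; the script itself is an illustrative convenience, not part of Knowledge Stack, and assumes you have not overridden the defaults.

```python
import socket

# Default development ports from the services table above.
SERVICES = {
    "PostgreSQL (TimescaleDB)": 5432,
    "Temporal": 7233,
    "Temporal UI": 8080,
    "MinIO": 9000,
    "Nginx (frontend)": 15173,
}

def is_listening(port: int, host: str = "localhost", timeout: float = 0.5) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, port in SERVICES.items():
        status = "up" if is_listening(port) else "down"
        print(f"{name:28s} :{port} {status}")
```

If a service shows as down, check its container logs via Docker Compose before digging further.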
Starting the Development Stack
Once the stack is running, the API is served at http://localhost:8000 (or https://localhost:18000 via the Nginx proxy), with interactive API documentation at /api/docs.
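The exact start command is covered in the Quickstart; as a sketch, a typical Docker Compose invocation for this kind of setup (the flags and file names here are assumptions, not the project's documented command) might look like:

```shell
# Hypothetical invocation -- see the Quickstart for the real command.
# Loads development variables from .env.dev and starts all services detached.
docker compose --env-file .env.dev up -d

# Tail recent logs while the services come up.
docker compose logs -f --tail=50
```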
Environment Configuration
| File | Purpose |
|---|---|
| .env.dev | Development environment variables |
| .env.secrets | API keys and secrets (see .env.secrets.example for required keys) |
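Both files must exist before the stack starts. As an illustration, a small checker that reports which keys from the .env.secrets.example template are still unset in .env.secrets could look like the following (the file names come from the table above; the helper functions themselves are hypothetical, not part of Knowledge Stack):

```python
from pathlib import Path

def parse_env(path: Path) -> dict[str, str]:
    """Parse KEY=VALUE lines, ignoring blanks and # comments."""
    env = {}
    for line in path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def missing_keys(example: Path, secrets: Path) -> list[str]:
    """Keys present in the example template but empty or absent in the real file."""
    have = parse_env(secrets) if secrets.exists() else {}
    return [k for k in parse_env(example) if not have.get(k)]
```

Running `missing_keys(Path(".env.secrets.example"), Path(".env.secrets"))` from the repository root would then list any keys you still need to fill in.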
Production Deployment
- Docker Deployment — Deploy Knowledge Stack using Docker Compose
- LiteLLM Deployment — Deploy the LLM gateway for per-tenant cost tracking
