Documentation Index

Fetch the complete documentation index at: https://docs.knowledgestack.ai/llms.txt

Use this file to discover all available pages before exploring further.

Overview

This section covers how to deploy Knowledge Stack, whether you are running it locally for development or shipping it to production.

Local Development

See the Quickstart guide for getting started locally.

Required Services

Knowledge Stack depends on several infrastructure services. In development, these are started automatically via Docker Compose:
| Service | Default Port | Description |
| --- | --- | --- |
| PostgreSQL (TimescaleDB) | 5432 | Primary database with vector search support |
| Temporal | 7233 | Workflow orchestration for document ingestion and agents |
| Temporal UI | 8080 | Web interface for monitoring workflows |
| MinIO | 9000 / 9001 | S3-compatible object storage for file uploads |
| Nginx | 15173 / 18000 | HTTPS reverse proxy for frontend and API |
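After starting the stack, it can be useful to confirm that each service is actually listening on its default port. A minimal sketch, assuming `nc` is available; the `check_port` helper is hypothetical, not part of Knowledge Stack:

```shell
# Hypothetical helper: print "NAME: up" if something is listening on PORT
# at localhost, "NAME: down" otherwise (also "down" if nc is unavailable).
check_port() {
  if nc -z localhost "$2" 2>/dev/null; then
    echo "$1: up"
  else
    echo "$1: down"
  fi
}

# Default dev-stack ports from the table above.
check_port postgres 5432
check_port temporal 7233
check_port temporal-ui 8080
check_port minio 9000
check_port minio-console 9001
check_port nginx 18000
```

If any service reports "down", check its container logs via Docker Compose before starting the API or worker.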

Starting the Development Stack

```shell
make dev-stack       # Start all infrastructure services
make dev-api         # Start the API server
make dev-worker      # Start the ingestion worker
```
The API will be available at http://localhost:8000 (or https://localhost:18000 via the Nginx proxy), with interactive API documentation at /api/docs.
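Because the API is reachable both directly and through the Nginx proxy, scripts that hit it should build endpoint URLs from a single configurable base. A small sketch; the `docs_url` helper is hypothetical and only illustrates the path layout described above:

```shell
# Hypothetical helper: compose the interactive API docs URL from a base URL,
# tolerating a trailing slash on the base.
docs_url() {
  printf '%s/api/docs\n' "${1%/}"
}

docs_url "http://localhost:8000"        # direct API server
docs_url "https://localhost:18000/"     # via the Nginx HTTPS proxy
```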

Environment Configuration

| File | Purpose |
| --- | --- |
| .env.dev | Development environment variables |
| .env.secrets | API keys and secrets (see .env.secrets.example for required keys) |
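A common first step is seeding .env.secrets from its example template so the required keys are visible but never overwriting an existing file. A minimal sketch; the `bootstrap_secrets` helper is an assumption, not a documented Knowledge Stack command:

```shell
# Hypothetical helper: copy a secrets template to its target path,
# refusing to clobber an existing secrets file.
bootstrap_secrets() {
  template=$1
  target=$2
  if [ -f "$target" ]; then
    echo "$target already exists; leaving it untouched"
  elif [ -f "$template" ]; then
    cp "$template" "$target"
    echo "created $target from $template - fill in your API keys"
  else
    echo "missing template $template" >&2
    return 1
  fi
}

bootstrap_secrets .env.secrets.example .env.secrets || echo "no template found; skipping"
```

Keeping secrets out of .env.dev means the development defaults can be committed while .env.secrets stays gitignored.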

Production Deployment

Building Production Images

```shell
make build-api       # Build the API server image
make build-worker    # Build the ingestion worker image
```
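For production builds it is common to tag images with the commit they were built from, so a running deployment can be traced back to source. A sketch under that assumption; the `image_tag` helper and the image names are hypothetical, not part of the Makefile above:

```shell
# Hypothetical tagging scheme: use the short git SHA as the image tag,
# falling back to "dev" when not inside a git repository.
image_tag() {
  git rev-parse --short HEAD 2>/dev/null || echo dev
}

TAG=$(image_tag)
# The image names below are placeholders for illustration only.
echo "would tag knowledgestack/api:$TAG and knowledgestack/worker:$TAG"
```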

CI/CD

Knowledge Stack includes automated CI/CD pipelines for testing, building, and deploying. See the CI/CD documentation for details.