

Five short videos. Each one is under three minutes and shows a single capability end-to-end on real data. No slides.
Want a guided walkthrough on your own corpus? Book a 30-minute demo with a founding engineer.

90-second product tour

The fastest way to understand what Knowledge Stack does — ingest a folder of PDFs, run a permission-scoped search, ask a question, and follow a citation back to the source page.

Ingestion pipeline in action

Watch a 200-page report flow through the Temporal-backed ingestion worker — conversion, chunking, embedding, and the resulting tree of PathPart nodes you can search against.
Read the deep dive: Ingestion Pipeline · Temporal workflow · Chunk handling.
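The tree of searchable nodes the video ends on can be sketched in a few lines. This is a minimal, self-contained illustration of the document → pages → chunks idea, not the real pipeline: the `PathPart` name comes from the docs, but the field names, path format, and naive fixed-size chunking are assumptions (the actual worker runs conversion, structural chunking, and embedding inside a Temporal workflow).

```python
from dataclasses import dataclass, field

@dataclass
class PathPart:
    # Hypothetical node shape; path format is an assumption,
    # e.g. "reports/annual-2024/page-0/chunk-0".
    path: str
    text: str
    children: list["PathPart"] = field(default_factory=list)

def chunk(text: str, size: int = 40) -> list[str]:
    """Naive fixed-size chunking; the real pipeline splits on document structure."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def ingest(doc_path: str, pages: list[str]) -> PathPart:
    """Build a searchable tree: document -> pages -> chunks."""
    root = PathPart(path=doc_path, text="")
    for p, page_text in enumerate(pages):
        page = PathPart(path=f"{doc_path}/page-{p}", text=page_text)
        for c, piece in enumerate(chunk(page_text)):
            page.children.append(PathPart(path=f"{page.path}/chunk-{c}", text=piece))
        root.children.append(page)
    return root

tree = ingest("reports/annual-2024",
              ["Revenue grew 12% year over year.", "Headcount stayed flat."])
print(tree.children[0].children[0].path)  # reports/annual-2024/page-0/chunk-0
```

Each chunk keeps a full path back to its page and document, which is what makes path-scoped search and source-linked citations possible later on.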

Plug into your agent framework

Same MCP server, three different hosts — LangGraph, the OpenAI Agents SDK, and Claude Desktop — each running a permission-scoped retrieval call against the same Knowledge Stack tenant.
More patterns: MCP server · Cookbook flagships.
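What makes the three hosts interchangeable is that MCP tool calls share one wire shape: a JSON-RPC 2.0 request with method `tools/call` and a `name`/`arguments` pair in `params` (that much is the MCP spec). The tool name and arguments below are hypothetical placeholders, not Knowledge Stack's actual tool schema.

```python
import json

# The request any MCP host (LangGraph, OpenAI Agents SDK, Claude Desktop)
# sends to the same server for a tool call. "search" and its arguments
# are assumptions for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",                 # hypothetical tool name
        "arguments": {
            "query": "Q3 revenue drivers",
            "user_id": "alice",           # assumed permission-scope parameter
        },
    },
}

wire = json.dumps(request)
print(json.loads(wire)["params"]["name"])  # search
```

Because the host only ever speaks this protocol, swapping frameworks means changing the client wiring, not the retrieval logic.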

Citations that survive an audit

Every assistant claim links back to a chunk UUID, a page number, and a bounding box. Click a citation, jump to the highlighted region of the original PDF.
How it works: Citations · Threads & streaming.
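An audit-friendly citation is just the three fields above, kept together. Here is an illustrative shape; the field names, coordinate convention, and deep-link URL format are assumptions, not the real API schema.

```python
from dataclasses import dataclass
from uuid import UUID

@dataclass(frozen=True)
class Citation:
    chunk_id: UUID    # stable pointer back to the ingested chunk
    page: int         # page number in the source PDF (1-based here, by assumption)
    bbox: tuple[float, float, float, float]  # (x0, y0, x1, y1) in page coordinates

    def deep_link(self, doc_url: str) -> str:
        """Hypothetical viewer URL that opens the highlighted region."""
        x0, y0, x1, y1 = self.bbox
        return f"{doc_url}#page={self.page}&bbox={x0},{y0},{x1},{y1}"

c = Citation(
    chunk_id=UUID("12345678-1234-5678-1234-567812345678"),
    page=3,
    bbox=(72.0, 144.0, 300.0, 180.0),
)
print(c.deep_link("https://example.com/report.pdf"))
```

The point of the frozen, fully specified record is that a claim can be re-verified months later: the UUID survives re-renders of the viewer, and the bounding box pins the exact region on the page.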

Permission-aware retrieval

Two users. Same query. Different results — by construction, not by post-filtering. We show how path-level grants flow through search and chat.
Concept docs: Path system · Authorization.
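"By construction, not by post-filtering" can be shown in miniature: the candidate set is restricted to granted paths before any matching happens, so ungranted chunks are never scored at all. The data, grant model, and substring matching below are hypothetical stand-ins for the real path system and vector search.

```python
# Toy corpus keyed by path, and per-user path prefix grants (both assumed).
CHUNKS = {
    "finance/q3-report/chunk-0": "Q3 revenue grew 12%.",
    "hr/comp-bands/chunk-0": "Senior engineer band: L5.",
}

GRANTS = {
    "alice": {"finance/", "hr/"},
    "bob": {"finance/"},
}

def search(user: str, query: str) -> list[str]:
    allowed = GRANTS.get(user, set())
    # Scope first: only chunks under a granted path prefix enter the search.
    scope = {p: t for p, t in CHUNKS.items()
             if any(p.startswith(g) for g in allowed)}
    return [p for p, t in scope.items() if query.lower() in t.lower()]

print(search("alice", "band"))  # ['hr/comp-bands/chunk-0']
print(search("bob", "band"))    # []
```

Same query, two users, two answers, and no step where a forbidden result existed and was later removed.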

Where to next

Why Knowledge Stack

Positioning, differentiators, and where we fit in your stack.

Quickstart

First ingest + search call in under five minutes.

Book a demo

30 minutes with a founding engineer on your own data.