# Hosted MCP (Model Context Protocol) server
The docs site exposes a Model Context Protocol server at:

## One-click install

Each page has a contextual menu (top-right of the page) with install buttons for common clients — Claude, Cursor, VS Code, and ChatGPT. Use those for the shortest path.

## Manual install
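For manual setup, many MCP clients read a JSON configuration file with a `mcpServers` map. The exact schema varies by client, and both the `"docs"` name and the URL below are placeholders — substitute the server URL given above. A typical remote-server entry looks roughly like:

```json
{
  "mcpServers": {
    "docs": {
      "url": "https://example.com/mcp"
    }
  }
}
```

Consult your client's own MCP documentation for the precise field names it expects.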
## llms.txt
The documentation provides static, machine-friendly indexes at:

- `/llms.txt` — short index of all pages
- `/llms-full.txt` — full-content bundle for ingestion
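As a sketch, the short index can be consumed mechanically. This assumes the common llms.txt convention of markdown link bullets (`- [Title](url): description`); the sample content and URLs below are illustrative, not taken from the real index, which you would fetch from `/llms.txt` with any HTTP client.

```python
import re

# Assumed llms.txt-style content; in practice, fetch it from
# the docs site's /llms.txt endpoint.
SAMPLE = """\
# Example Docs

- [Getting started](https://docs.example.com/start): install and first steps
- [API reference](https://docs.example.com/api): endpoints and schemas
"""

# One markdown link bullet per page, with an optional ": description" tail.
LINK = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?:: (?P<desc>.*))?$")

def parse_llms_txt(text):
    """Yield (title, url, description) for each markdown link bullet."""
    for line in text.splitlines():
        m = LINK.match(line.strip())
        if m:
            yield m.group("title"), m.group("url"), m.group("desc") or ""

pages = list(parse_llms_txt(SAMPLE))
```

Headings and blank lines are skipped, so the parser tolerates the title and section structure llms.txt files usually carry.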
## Per-page AI actions
The page-level contextual menu also offers:

- Copy page — copies the current page as Markdown for pasting into chats
- View as Markdown — opens the raw Markdown source in a new tab
- Ask ChatGPT / Claude / Perplexity — opens the respective assistant pre-loaded with the page as context
## Best practices
- Prefer the MCP server when you want the assistant to reason across multiple pages and keep answers current.
- Prefer `llms-full.txt` for offline tooling, evaluations, or custom RAG (retrieval-augmented generation) pipelines.
- Use the per-page actions for quick spot questions about the page you are already reading.
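For the RAG-pipeline case, here is a minimal sketch of consuming `llms-full.txt`: it assumes the bundle simply concatenates pages under `# ` headings (an assumption; check the actual file), and it uses naive keyword counting where a real pipeline would use embeddings.

```python
def split_pages(bundle):
    """Split a concatenated markdown bundle into (title, body) chunks."""
    pages, title, body = [], None, []
    for line in bundle.splitlines():
        if line.startswith("# "):
            if title is not None:
                pages.append((title, "\n".join(body).strip()))
            title, body = line[2:].strip(), []
        else:
            body.append(line)
    if title is not None:
        pages.append((title, "\n".join(body).strip()))
    return pages

def score(query, text):
    """Count query-term occurrences (a stand-in for real ranking)."""
    hay = text.lower()
    return sum(hay.count(word) for word in query.lower().split())

def top_page(query, bundle):
    """Return the (title, body) chunk that best matches the query."""
    pages = split_pages(bundle)
    return max(pages, key=lambda p: score(query, p[0] + " " + p[1]))

# Toy bundle standing in for the real /llms-full.txt content.
bundle = "# Install\nRun the installer.\n\n# Auth\nUse an API key to authenticate."
best = top_page("api key", bundle)
```

Chunking per page keeps retrieval units aligned with the docs' own structure, which is usually a better starting point than fixed-size windows.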

