You can access Extension.js documentation through two AI-accessible surfaces: a hosted MCP server and plain-text llms.txt indexes. This lets your editor and assistants answer questions grounded in the real docs instead of stale training data.

Hosted MCP (Model Context Protocol) server

The docs site exposes a Model Context Protocol server at:
https://extensionjs.mintlify.app/mcp
Point any MCP-compatible client at this URL and it gains tools to search the docs, fetch specific pages, and answer questions from current content.

One-click install

Each page has a contextual menu (top-right of the page) with install buttons for common clients — Claude, Cursor, VS Code, and ChatGPT. Use those for the shortest path.

Manual install

For clients without an install button, add the server to your client's MCP configuration manually:
{
  "mcpServers": {
    "extensionjs": {
      "url": "https://extensionjs.mintlify.app/mcp"
    }
  }
}
After installing, ask your assistant something like “How do I configure browser-specific manifest fields in Extension.js?” — it consults the MCP tools and cites docs pages.
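Under the hood, MCP clients speak JSON-RPC 2.0 to this endpoint. As a rough sketch of what a client sends (the `tools/list` method is from the MCP specification; a real client also performs an `initialize` handshake first, and this snippet only builds the request body rather than making a network call):

```python
import json

MCP_URL = "https://extensionjs.mintlify.app/mcp"

def jsonrpc_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request body as an MCP client would."""
    body = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        body["params"] = params
    return json.dumps(body)

# Ask the docs server which tools it exposes (search, page fetch, etc.).
# An HTTP client would POST this to MCP_URL with Content-Type: application/json.
list_tools = jsonrpc_request("tools/list")
```

In practice you rarely write this by hand; the one-click or manual config above lets your client manage the protocol for you.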

llms.txt

The documentation provides static, machine-friendly indexes at https://extensionjs.mintlify.app/llms.txt (a page index) and https://extensionjs.mintlify.app/llms-full.txt (the full docs in a single file). Use these when you want to feed the docs directly into a retrieval pipeline or summarize them in a custom agent. Extension.js regenerates them on every docs deploy.
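For the retrieval-pipeline case, one common first step is splitting the downloaded Markdown dump into heading-aligned chunks before embedding. A minimal sketch (the chunking heuristic here is illustrative, not something Extension.js ships):

```python
def chunk_markdown(text, max_chars=2000):
    """Split a Markdown dump into heading-aligned sections, then greedily
    pack adjacent sections into chunks of at most max_chars characters
    (a single oversized section still becomes its own chunk)."""
    sections, current = [], []
    for line in text.splitlines(keepends=True):
        # Start a new section at each Markdown heading.
        if line.startswith("#") and current:
            sections.append("".join(current))
            current = []
        current.append(line)
    if current:
        sections.append("".join(current))

    chunks, buf = [], ""
    for sec in sections:
        if buf and len(buf) + len(sec) > max_chars:
            chunks.append(buf)
            buf = ""
        buf += sec
    if buf:
        chunks.append(buf)
    return chunks
```

Each chunk can then be embedded and indexed with whatever vector store your agent uses; re-running the pipeline after a docs deploy keeps the index current.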

Per-page AI actions

The page-level contextual menu also offers:
  • Copy page — copies the current page as Markdown for pasting into chats
  • View as Markdown — opens the raw Markdown source in a new tab
  • Ask ChatGPT / Claude / Perplexity — opens the respective assistant pre-loaded with the page as context

Best practices

  • Prefer the MCP server when you want the assistant to reason across multiple pages and keep answers current.
  • Prefer llms-full.txt for offline tooling, evaluations, or custom RAG (retrieval-augmented generation) pipelines.
  • Use the per-page actions for quick spot questions about the page you are already reading.