Obtrace Docs

Intention-first documentation for setup, incident investigation, and safe remediation workflows.

Integrating with AI? Use /llms.txt for the standard machine-readable context (llmstxt.org), or connect via MCP for live tool access.

Use these docs according to what you need to do right now. The documentation is organized around complete workflows, so you don't have to assemble steps from scattered reference pages.

AI-first entry points

If you want to use Obtrace Docs as context for assistants and MCP clients, start here:

  • /llms.txt — machine-readable authority index following the llms.txt standard. Feed this to any LLM for grounded answers about Obtrace.
  • MCP Server — connect Claude, Cursor, or custom agents to query your live Obtrace data.
  • Use the floating Ask AI button for contextual answers inside the docs UI.
  • Download mcp.json for MCP resource discovery.
  • Open SDK Downloads and AI Context for the canonical file list.
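The mcp.json entry mentioned above is what MCP clients (Claude, Cursor, custom agents) read to discover the Obtrace server. As a minimal sketch, the shape of such an entry looks like the following; the server name and endpoint URL here are placeholders for illustration, not Obtrace's real values, so always use the downloaded mcp.json as the source of truth.

```python
import json

# Hypothetical MCP client configuration entry. "obtrace" and the URL are
# illustrative placeholders -- substitute the values from the real mcp.json.
mcp_config = {
    "mcpServers": {
        "obtrace": {
            "url": "https://mcp.obtrace.example",  # placeholder endpoint
        }
    }
}

# Serialize in the JSON form an MCP client expects on disk.
print(json.dumps(mcp_config, indent=2))
```

Most clients accept a file of this shape dropped into their configuration directory; check your client's MCP documentation for the exact location.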

Choose your path

I need to instrument my first service

Follow this end-to-end path:

  1. Quickstart
  2. Instrument your first service
  3. SDK Catalog
  4. Environment Overview

I need to investigate a production incident

Follow this diagnostic path:

  1. Investigate your first incident
  2. How Obtrace detects incidents
  3. How root cause analysis works
  4. Production error library

I need to enable remediation pull requests safely

Follow this governance path:

  1. Enable remediation PRs safely
  2. How AI decides to open a PR
  3. GitHub App Permissions
  4. PR Approval Flow

What each section is for

  • Quickstart: fastest path to real signal in production
  • Workflows: complete guided flows by intention
  • Concepts: mental model behind detection, RCA, and remediation
  • SDKs: language-specific implementation details
  • Environments: runtime-specific deployment guidance
  • Security: repository permissions and approval policies
  • API Reference: contracts and integration details

If you are under incident pressure

Start with Investigate your first incident and follow the diagnostic path above.
