Obtrace Docs
Intention-first documentation for setup, incident investigation, and safe remediation workflows.
Integrating with AI? Use /llms.txt for the standard machine-readable context (llmstxt.org), or connect via MCP for live tool access.
Use these docs based on what you need to do now. The documentation is organized to guide complete workflows, not to make you assemble steps from scattered reference pages.
AI-first entry points
If you want to use Obtrace Docs as context for assistants and MCP clients, start here:
- /llms.txt — machine-readable authority index following the llms.txt standard. Feed this to any LLM for grounded answers about Obtrace.
- MCP Server — connect Claude, Cursor, or custom agents to query your live Obtrace data.
- Use the floating Ask AI button for contextual answers inside the docs UI.
- Download mcp.json for MCP resource discovery.
- Open SDK Downloads and AI Context for the canonical file list.
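To make the /llms.txt entry point concrete: per the llms.txt convention, the file is plain markdown whose sections list links as `- [title](url): description`, which a client can parse before handing the links to an assistant. Below is a minimal, hypothetical Python sketch of that parsing step — the sample content, URLs, and the `parse_llms_txt` helper are illustrative assumptions, not Obtrace's actual index or tooling.

```python
import re

# Hypothetical sample in the llms.txt style: H1 title, blockquote summary,
# then H2 sections containing "- [title](url): description" link entries.
SAMPLE = """# Obtrace
> Intention-first documentation for setup, incident investigation, and remediation.

## Docs
- [Quickstart](https://docs.example.com/quickstart.md): fastest path to real signal
- [API Reference](https://docs.example.com/api.md): contracts and integration details
"""

def parse_llms_txt(text):
    """Extract link entries from an llms.txt-style markdown file.

    Returns a list of dicts with title, url, and an optional description.
    """
    links = []
    pattern = r"^-\s*\[([^\]]+)\]\(([^)]+)\)(?::\s*(.*))?$"
    for m in re.finditer(pattern, text, re.MULTILINE):
        links.append({
            "title": m.group(1),
            "url": m.group(2),
            "desc": m.group(3) or "",
        })
    return links

links = parse_llms_txt(SAMPLE)
for entry in links:
    print(entry["title"], "->", entry["url"])
```

A client would typically fetch the real file over HTTP, parse it like this, and then feed the linked pages (or the raw llms.txt text itself) to the model as grounding context.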
Choose your path
I need to instrument my first service
Follow this end-to-end path:
I need to investigate a production incident
Follow this diagnostic path:
- Investigate your first incident
- How Obtrace detects incidents
- How root cause analysis works
- Production error library
I need to enable remediation pull requests safely
Follow this governance path:
What each section is for
- Quickstart: fastest path to real signal in production
- Workflows: complete guided flows by intention
- Concepts: mental model behind detection, RCA, and remediation
- SDKs: language-specific implementation details
- Environments: runtime-specific deployment guidance
- Security: repository permissions and approval policies
- API Reference: contracts and integration details
If you are under incident pressure
Start with: