
This guide is for teams running OpenSRE with the local opensre binary on developer machines, VMs, or on-prem hosts.

What mode should I use?

Use the mode that matches where your workloads and alerts live:
  • Local laptop mode for fast setup, testing, and debugging
  • On-prem VM/server mode for shared internal environments
  • Container mode for reproducible deployments in Docker
Environment-specific install steps live under the Installation tab.

Local setup workflow

1) Install the OpenSRE CLI

brew install Tracer-Cloud/opensre/opensre
# or
curl -fsSL https://install.opensre.com | bash
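
A quick way to confirm the install landed on your PATH (a minimal sketch: it only checks that the binary resolves, not that it runs correctly):

```shell
#!/bin/sh
# Check whether the opensre binary from the install step above is on PATH.
if command -v opensre >/dev/null 2>&1; then
  echo "opensre found at: $(command -v opensre)"
else
  echo "opensre not found; re-run the brew or curl install and check your PATH"
fi
```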

2) Enter the OpenSRE shell

opensre

3) Run onboarding

From inside the OpenSRE shell:

onboard

The onboard command helps you configure:
  • LLM provider (for investigation reasoning)
  • integration credentials
  • optional communication/reporting integrations
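
Before running onboard, it can help to confirm that the credentials it will ask for are already exported in your environment. The sketch below is a hedged pre-flight check; the variable names (OPENAI_API_KEY, DATADOG_API_KEY, DATADOG_APP_KEY) are illustrative assumptions for a typical LLM provider plus Datadog setup, not names the OpenSRE CLI itself requires:

```shell
#!/bin/sh
# Pre-flight check before `onboard`: warn about unset credential variables.
# NOTE: the variable names below are assumptions; substitute whichever keys
# your chosen LLM provider and integrations actually use.
missing=0
for var in OPENAI_API_KEY DATADOG_API_KEY DATADOG_APP_KEY; do
  eval "val=\${$var:-}"        # indirect lookup of the variable named in $var
  if [ -z "$val" ]; then
    echo "warning: $var is not set"
    missing=1
  fi
done
if [ "$missing" -eq 0 ]; then
  echo "all credential variables present"
fi
```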

4) Verify integration health

From inside the OpenSRE shell:
integrations verify
To verify one integration:
integrations verify <integration-name>
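
To check several integrations in one pass, a loop like the following may work. This sketch assumes the CLI also accepts subcommands non-interactively, as `opensre integrations verify <name>` (an assumption; if your build only supports the interactive shell, run the commands there instead), and the integration names used are placeholders:

```shell
#!/bin/sh
# Sketch: verify a list of integrations from outside the interactive shell.
# Assumes `opensre integrations verify <name>` works as a direct subcommand.
if ! command -v opensre >/dev/null 2>&1; then
  echo "opensre not installed; skipping"
  exit 0
fi
for name in datadog kubernetes; do   # placeholder integration names
  echo "verifying: $name"
  opensre integrations verify "$name" || echo "verification failed for $name"
done
```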

5) Run your first investigation

From inside the OpenSRE shell:
investigate -i tests/e2e/kubernetes/fixtures/datadog_k8s_alert.json
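
If you later want to point investigate at your own alert instead of the bundled fixture, the input is a JSON file. The fragment below is a hypothetical illustration of the kind of fields a Datadog Kubernetes alert payload might carry; the real schema is defined by the repository, so treat tests/e2e/kubernetes/fixtures/datadog_k8s_alert.json as the authoritative shape:

```json
{
  "title": "Pod CrashLoopBackOff on payments-api",
  "alert_type": "error",
  "tags": ["kube_namespace:payments", "pod_name:payments-api-7d9f"],
  "message": "Container restarting repeatedly in the payments namespace"
}
```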

Integration catalog

The full integration catalog is maintained under the Integrations tab, not the Installation tab. Open each provider page for credentials, minimum permissions, and setup examples.