# Run with Docker Compose

Set up CAIPE on a laptop or VM (e.g. EC2) using Docker Compose.
## Prerequisites

- Docker with the Docker Compose plugin
## Clone the repository

```bash
git clone https://github.com/cnoe-io/ai-platform-engineering.git
cd ai-platform-engineering
```
## Configure environment variables

```bash
cp .env.example .env
```

Edit `.env` with your configuration. Minimal example:

```bash
########### CAIPE Agent Configuration ###########

# Enable the agents you want to deploy
ENABLE_GITHUB=true

# A2A transport configuration (p2p or slim)
A2A_TRANSPORT=p2p

# MCP mode configuration (http or stdio)
MCP_MODE=http

# LLM provider (anthropic-claude, aws-bedrock, openai, azure-openai)
LLM_PROVIDER=anthropic-claude
ANTHROPIC_API_KEY=sk-ant-...

########### GitHub Agent Configuration ###########
GITHUB_PERSONAL_ACCESS_TOKEN=<token>
```

For full LLM provider options see Configure LLMs. For agent-specific credentials see Configure Agent Secrets.
## Configure A2A Authentication (optional)

### Option A: OAuth2 (recommended for production)

```bash
A2A_AUTH_OAUTH2=true
JWKS_URI=https://your-idp.com/.well-known/jwks.json
AUDIENCE=your-audience
ISSUER=https://your-idp.com
OAUTH2_CLIENT_ID=your-client-id
```

Get a JWT token with:

```bash
OAUTH2_CLIENT_SECRET=your-secret \
TOKEN_ENDPOINT=https://your-idp.com/oauth/token \
python ai_platform_engineering/utils/oauth/get_oauth_jwt_token.py
```

Local development with Keycloak:

```bash
cd deploy/keycloak && docker compose up
```

Then set:

```bash
A2A_AUTH_OAUTH2=true
JWKS_URI=http://localhost:7080/realms/caipe/protocol/openid-connect/certs
AUDIENCE=caipe
ISSUER=http://localhost:7080/realms/caipe
OAUTH2_CLIENT_ID=caipe-cli
OAUTH2_CLIENT_SECRET=<from-keycloak>
TOKEN_ENDPOINT=http://localhost:7080/realms/caipe/protocol/openid-connect/token
```

Keycloak admin console: http://localhost:7080 (admin / admin). Switch to the `caipe` realm and create a `caipe-cli` client.

### Option B: Shared key (development / testing)

```bash
A2A_AUTH_SHARED_KEY=your-secret-key
```

If neither option is set, the agent runs without authentication, which is not recommended for production.
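Before starting the stack, it can help to sanity-check `.env`. The sketch below is a hypothetical helper of our own (not part of the repository); `check_env` and the list of variables it checks are assumptions based on the minimal example above, so extend the list to match your setup.

```shell
# check_env: hypothetical helper, not shipped with CAIPE.
# Fails if any of the minimal variables from the example above
# is missing from the given env file.
check_env() {
  envfile=$1
  missing=""
  for var in LLM_PROVIDER A2A_TRANSPORT MCP_MODE; do
    # Require an uncommented "VAR=" line in the env file
    grep -q "^${var}=" "$envfile" || missing="$missing $var"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing"
    return 1
  fi
  echo "ok"
}
```

Run it as `check_env .env` before `docker compose up`; a non-zero exit lists the variables still to be set.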
## Start CAIPE
Use Docker Compose profiles to enable specific agents. If no profile is specified, only the supervisor starts.
Available agent profiles:
| Profile | Description |
|---|---|
| `argocd` | ArgoCD GitOps for Kubernetes deployments |
| `aws` | AWS cloud operations |
| `backstage` | Backstage developer portal |
| `confluence` | Confluence documentation |
| `github` | GitHub repos and pull requests |
| `jira` | Jira issue tracking |
| `komodor` | Komodor Kubernetes troubleshooting |
| `pagerduty` | PagerDuty incident management |
| `rag` | RAG knowledge base (Milvus, Neo4j, Redis) |
| `slack` | Slack messaging |
| `splunk` | Splunk observability |
| `webex` | Webex collaboration |
| `slim` | AGNTCY Slim dataplane (set `A2A_TRANSPORT=slim`) |
| `tracing` | Langfuse distributed tracing (ClickHouse, Postgres) |
Examples:

```bash
# Supervisor only
docker compose up

# Single agent
COMPOSE_PROFILES="github" docker compose up

# Multiple agents
COMPOSE_PROFILES="argocd,aws,backstage" docker compose up

# With RAG knowledge base
COMPOSE_PROFILES="github,rag" docker compose up

# With tracing
COMPOSE_PROFILES="github,tracing" docker compose up

# Full stack: agents + RAG + tracing
COMPOSE_PROFILES="github,rag,tracing" docker compose up
```
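`COMPOSE_PROFILES` is just a comma-separated list, so a small wrapper can assemble it from shell arguments when you script these invocations. A sketch under that assumption; the `join_profiles` helper is our own convenience, not part of the repository:

```shell
# join_profiles: join profile names with commas for COMPOSE_PROFILES.
# Uses the POSIX "$*" expansion, which joins arguments with the
# first character of IFS.
join_profiles() {
  old_ifs=$IFS
  IFS=,
  echo "$*"
  IFS=$old_ifs
}

# Example invocation:
# COMPOSE_PROFILES="$(join_profiles github rag tracing)" docker compose up
```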
## Connect to the agent

Once the services are running, connect with the agent chat CLI.

Using Docker (host network):

```bash
docker run -it --network=host ghcr.io/cnoe-io/agent-chat-cli:stable
```

Using uvx:

```bash
uvx --no-cache git+https://github.com/cnoe-io/agent-chat-cli.git a2a
```
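The CLI can only connect once the supervisor is actually listening, which takes a moment after `docker compose up`. A generic retry helper can gate the CLI on readiness; this is a sketch of our own, and the commented endpoint and port are assumptions to adjust for your deployment:

```shell
# wait_for: retry a command up to N times, one second apart.
# Returns 0 as soon as the command succeeds, 1 if it never does.
wait_for() {
  tries=$1
  shift
  i=0
  while [ "$i" -lt "$tries" ]; do
    if "$@"; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# Example (assumed port; A2A servers publish an agent card over HTTP):
# wait_for 30 curl -fsS http://localhost:8000/.well-known/agent.json >/dev/null
```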
## Tracing with Langfuse

The `tracing` profile starts Langfuse v3 (web UI, worker, ClickHouse, Postgres, MinIO).

1. Start with tracing:

   ```bash
   COMPOSE_PROFILES="github,tracing" docker compose up
   ```

2. Open Langfuse at http://localhost:3000, create an account, and copy the API keys.

3. Add the keys to `.env` and restart:

   ```bash
   ENABLE_TRACING=true
   LANGFUSE_PUBLIC_KEY=your-public-key
   LANGFUSE_SECRET_KEY=your-secret-key
   LANGFUSE_HOST=http://langfuse-web:3000
   ```
## Next steps
- Configure LLMs — LLM provider and API key setup
- Configure Agent Secrets — Agent-specific credentials
- Deploy to Kubernetes — KinD local cluster
- Deploy with Helm — Production Kubernetes deployment