# Backstage Agent
- 🤖 Backstage Agent is an LLM-powered agent built using the LangGraph ReAct Agent workflow and Backstage MCP Server.
- 🌐 Protocol Support: Compatible with A2A protocol for integration with external user clients.
- 🛡️ Secure by Design: Enforces Backstage API token-based RBAC and supports secondary external authentication for strong access control.
- 🏭 MCP Server: The MCP server is generated by our first-party openapi-mcp-codegen utility, ensuring version/API compatibility and software supply chain integrity.
- 🔌 MCP Tools: Uses langchain-mcp-adapters to connect the tools exposed by the Backstage MCP server to the LangGraph ReAct Agent graph.
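The wiring described above can be sketched roughly as follows. This is a hedged illustration, not the agent's actual source: the server path and model identifier are assumptions, and it presumes `langchain-mcp-adapters` and `langgraph` are installed.

```python
# Illustrative sketch: expose Backstage MCP tools to a LangGraph ReAct agent.
# The server path and model id below are placeholders, not the real project layout.
SERVER_CONFIG = {
    "backstage": {
        "command": "python",
        "args": ["mcp_backstage/server.py"],  # hypothetical path to the generated MCP server
        "transport": "stdio",
    }
}

async def build_agent():
    # Imports kept local so the sketch reads top to bottom; both packages are assumed installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent

    client = MultiServerMCPClient(SERVER_CONFIG)
    tools = await client.get_tools()          # MCP tools surfaced as LangChain tools
    return create_react_agent("openai:gpt-4o", tools)

# Usage (requires the packages above plus an LLM API key):
#   import asyncio
#   agent = asyncio.run(build_agent())
#   asyncio.run(agent.ainvoke({"messages": [{"role": "user", "content": "List catalog entities"}]}))
```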
## 🏗️ Architecture

### System Diagram

### Sequence Diagram
## ⚙️ Local Development Setup
Use this setup to test the agent against a local Backstage instance.
### ▶️ Start Backstage with Docker

> **Note:** Backstage can be run locally using Docker for development and testing.

```bash
# Clone Backstage
git clone https://github.com/backstage/backstage.git
cd backstage

# Start Backstage with Docker Compose
docker-compose up -d
```
### 🛂 Retrieve Admin Credentials

```bash
# Get the admin user token from Backstage logs
docker-compose logs backstage | grep "admin user token"
```
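To confirm the token works, you can query Backstage's catalog REST API (`GET /api/catalog/entities`) with it. The sketch below is illustrative; the host and port assume the local instance started above.

```python
# Hedged sketch: build an authenticated request against Backstage's catalog API
# to verify the token. Host/port are assumptions for the local setup above.
import urllib.request

def catalog_request(base_url: str, token: str) -> urllib.request.Request:
    """Authenticated request for the catalog entities endpoint."""
    return urllib.request.Request(
        f"{base_url}/api/catalog/entities",
        headers={"Authorization": f"Bearer {token}"},
    )

# Usage (against a running instance):
#   with urllib.request.urlopen(catalog_request("http://localhost:7007", token)) as resp:
#       print(resp.status)  # 200 means the token was accepted
```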
### 📦 Install CLI (Optional)

```bash
# Install Backstage CLI
npm install -g @backstage/cli
```
### 🚀 Deploy Example Service

```bash
# Create a new service using Backstage CLI
npx @backstage/cli create

# Follow the prompts to create a service
```
### 🔑 Get API Token
- Log in to your Backstage instance
- Go to Settings → API Access
- Create a new API token with appropriate permissions
- Save the token for your `.env` file

Add the following to your `.env`:

```bash
BACKSTAGE_API_KEY=<your_token>
BACKSTAGE_API_URL=http://localhost:7007
```
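As a sketch of how the agent can consume these settings at startup, the helper below reads them from the process environment. It assumes your shell (or a loader such as python-dotenv) has already exported the `.env` values; the function name is illustrative.

```python
# Illustrative helper: read the Backstage connection settings from the environment.
# Assumes the .env values were exported before the process started.
import os

def backstage_settings(env=os.environ):
    """Return the base URL and auth headers for Backstage API calls."""
    base_url = env.get("BACKSTAGE_API_URL", "http://localhost:7007")
    headers = {"Authorization": f"Bearer {env.get('BACKSTAGE_API_KEY', '')}"}
    return base_url, headers
```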
### Local Development

```bash
# Navigate to the Backstage agent directory
cd ai_platform_engineering/agents/backstage

# Start the agent's A2A server
make run-a2a
```
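Once the server is up, a quick smoke test is to fetch its agent card: A2A servers conventionally publish one at `/.well-known/agent.json`. The port below (8000) is an assumption; check the server's startup logs for the actual bind address.

```python
# Hedged smoke test: locate the A2A agent card for a running server.
# The port is an assumption; adjust to what the server actually binds.
import urllib.request

def agent_card_url(base_url: str) -> str:
    """Well-known A2A agent-card location for a given server base URL."""
    return base_url.rstrip("/") + "/.well-known/agent.json"

# Usage (against the running server):
#   import json
#   with urllib.request.urlopen(agent_card_url("http://localhost:8000")) as resp:
#       card = json.load(resp)
#       print(card["name"])
```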