
Orion MCP

Orion MCP is a Model Context Protocol (MCP) server for performance regression analysis powered by the cloud-bulldozer/orion library.


Key Features

  • Regression Detection – Automatically detects performance regressions in OpenShift & Kubernetes clusters.
  • Interactive MCP API – Exposes a set of composable tools & resources that can be consumed via HTTP or by other MCP agents.
  • Visual Reporting – Generates publication-ready plots (PNG/JPEG) for trends, multi-version comparisons and metric correlations.
  • Container-first – Ships with a lightweight OCI image and an example OpenShift deployment manifest.

Table of Contents

  1. Getting Started
  2. Quick Start
  3. Available Tools
  4. Deployment
  5. Development
  6. Contributing
  7. License

Getting Started

Prerequisites

  • Python 3.11 or newer
  • An OpenSearch (or Elasticsearch ≥7.17) endpoint with Orion-indexed benchmark results
  • Podman or Docker (optional – for containerised execution)

Installation (virtual-env)

# Clone repository
$ git clone https://github.com/cloud-bulldozer/orion-mcp.git && cd orion-mcp

# Create & activate a virtual environment
$ python3.11 -m venv .venv
$ source .venv/bin/activate

# Install Python dependencies
$ pip install -r requirements.txt

Quick Start

Set the data-source endpoint and launch the server locally:

export ES_SERVER="https://opensearch.example.com:9200"
python orion_mcp.py  # listens on 0.0.0.0:3030 by default
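
Once the server is up, any MCP client can invoke its tools over HTTP. As a minimal sketch, the request body is a JSON-RPC 2.0 `tools/call` message as defined by the Model Context Protocol; `build_tool_call` below is a hypothetical helper (not part of orion-mcp), and the exact endpoint path and transport headers depend on your MCP client:

```python
import json

def build_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request body per the MCP spec."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Example: ask the server which data source it is configured against.
body = build_tool_call("get_data_source", {})
```

In practice you would let an MCP client library handle the session handshake and POST this payload for you; the sketch only shows the message shape.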

Available Tools

  • get_data_source – Returns the configured OpenSearch URL. Arguments: none
  • get_orion_configs – Lists available Orion configuration files. Arguments: none
  • get_orion_metrics – Lists metrics grouped by Orion config. Defaults: config_name="small-scale-udn-l3.yaml", version="4.20"
  • get_orion_metrics_with_meta – Lists metrics plus metadata for an Orion config. Defaults: config_name="small-scale-udn-l3.yaml", version="4.19"
  • get_orion_performance_data – Returns raw performance values for a config/metric/version. Defaults: config_name="small-scale-udn-l3.yaml", metric="podReadyLatency_P99", version="4.19", lookback="15"
  • openshift_report_on – Generates a trend line for one or more OCP versions. Defaults: versions="4.19", lookback="15", metric="podReadyLatency_P99", config_name="small-scale-udn-l3.yaml"
  • openshift_report_on_pr (new) – Analyzes the performance impact of a specific pull request. Defaults: version="4.20", lookback="15", organization="openshift", repository="ovn-kubernetes", pull_request="2841"
  • has_openshift_regressed – Scans all configs for changepoints. Defaults: version="4.19", lookback="15"
  • metrics_correlation – Correlates two metrics and returns a scatter plot. Defaults: metric1="podReadyLatency_P99", metric2="ovnCPU_avg", config_name="trt-external-payload-cluster-density.yaml", version="4.19", lookback="15"

Pull Request Performance Analysis

Overview

The openshift_report_on_pr tool provides automated performance regression detection for GitHub pull requests. It compares the performance metrics of a specific PR against a periodic baseline to identify potential regressions.

How It Works

  1. Baseline Collection: Gathers periodic performance data for the specified OpenShift version over the lookback period
  2. PR Analysis: Runs performance tests specifically for the target Pull Request
  3. Comparison: Compares PR performance against the periodic baseline using a 10% threshold
  4. Multi-Config Testing: Tests across multiple Orion configurations for comprehensive coverage
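
Steps 1 and 3 above can be sketched as follows. This is an illustrative reconstruction, not orion-mcp's actual implementation; `compare_to_baseline` and the sample latency values are hypothetical:

```python
from statistics import mean

THRESHOLD = 0.10  # the 10% threshold used by the comparison step

def compare_to_baseline(periodic_runs: list[float], pull_value: float) -> dict:
    """Average the periodic baseline runs, then flag the PR value
    if it exceeds the baseline average by more than 10%."""
    periodic_avg = mean(periodic_runs)
    change = (pull_value - periodic_avg) / periodic_avg
    return {
        "periodic_avg": periodic_avg,
        "pull": pull_value,
        "change_pct": round(change * 100, 1),
        "regressed": change > THRESHOLD,
    }

# e.g. podReadyLatency_P99 (ms) over the lookback window vs. one PR run
result = compare_to_baseline([1200.0, 1150.0, 1250.0], 1380.0)
```

Here the PR run is 15% above the 1200 ms baseline average, so it would be flagged as a potential regression.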

Supported Configurations

The PR analysis runs against these key performance test configurations:

  • trt-external-payload-cluster-density.yaml - Cluster density and pod scaling tests
  • trt-external-payload-node-density.yaml - Node-level performance and resource utilization
  • trt-external-payload-node-density-cni.yaml - CNI-specific networking performance
  • trt-external-payload-crd-scale.yaml - Custom Resource Definition scaling tests

Interpreting Results

  • periodic_avg: Baseline performance metrics averaged over the lookback period
  • pull: Performance metrics from the specific PR's test runs
  • Regression Detection: Compare values using the 10% threshold:
    • (pull_value - periodic_avg) / periodic_avg > 0.10 indicates a potential regression
    • Values within ±10% are considered normal variance
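
Applied across a whole response, the ±10% rule classifies each metric independently. The sketch below assumes a simple mapping of metric name to value for both series; the metric names and numbers are illustrative, and the "improvement" label assumes lower values are better (true for latency and CPU metrics):

```python
def classify_metrics(periodic_avg: dict[str, float], pull: dict[str, float],
                     threshold: float = 0.10) -> dict[str, str]:
    """Label each metric by comparing the PR value to the baseline average."""
    verdicts = {}
    for name, baseline in periodic_avg.items():
        change = (pull[name] - baseline) / baseline
        if change > threshold:
            verdicts[name] = "potential regression"
        elif change < -threshold:
            verdicts[name] = "potential improvement"
        else:
            verdicts[name] = "normal variance"
    return verdicts

verdicts = classify_metrics(
    {"podReadyLatency_P99": 1200.0, "ovnCPU_avg": 2.0},
    {"podReadyLatency_P99": 1380.0, "ovnCPU_avg": 1.9},
)
```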

Integration with AI/LLM

The response format is optimized for AI analysis. The LLM can:

  1. Automatically detect regressions by comparing periodic_avg vs pull metrics
  2. Apply the 10% threshold to determine significance
  3. Generate human-readable reports highlighting concerning changes
  4. Provide actionable insights about which metrics regressed and by how much


Example AI Analysis Prompt

Analyze this PR performance data and identify any regressions using a 10% threshold:
[paste the JSON response]

For each metric that shows >10% degradation, explain:
1. The metric name and what it measures
2. The baseline vs PR values  
3. The percentage change
4. Potential impact on users

Deployment

Container Image

podman build -t quay.io/YOUR_ORG/orion-mcp:latest .

OpenShift

To deploy to an OpenShift cluster, specify the ES_SERVER in kustomize/base/.env, e.g.:

ES_SERVER=https://USER:PASSWORD@SERVER:443

To deploy the application:

# Expose your quay credentials to fetch the container image
export QUAY_CRED='<base64 encoded pull secret>'

# Build and apply the manifests
kustomize build --load-restrictor=LoadRestrictionsNone ./kustomize/base | envsubst | oc apply -f -

To verify any changes to manifests, you can render them locally, e.g.:

kustomize build ./kustomize/base | envsubst > manifests.yaml

To access the service externally, expose it using an OpenShift Route and point your MCP client to http://<host>:3030.


Development

# Run linters & tests
flake8
pytest

# Auto-format with black & isort
black . && isort .

Contributing

Pull requests are very welcome! Please ensure you have read and adhere to the Code of Conduct.

  1. Fork the repository
  2. Create a new branch for your feature or bugfix
  3. Make your changes and add tests if applicable
  4. Submit a pull request with a clear description of your changes

License

Orion-MCP is distributed under the Apache 2.0 License. See the LICENSE file for full text.
