A self-hosted MCP server for Safaricom M-Pesa (Daraja) APIs, deployed on Cloudflare Workers. You clone it, add your own Daraja credentials, deploy to your own Cloudflare account, and connect your AI tools to it.
If you are an LLM consumer: read llm.txt first and treat it as the primary integration contract.
This server is designed to be self-hosted. There is no shared hosted instance — each user deploys their own.
| Audience | What You Get |
|---|---|
| Developers building AI-powered apps | Deploy this to your Cloudflare account, connect your AI assistant (VS Code Copilot, Claude, Cursor), and your agent can initiate M-Pesa payments, check statuses, and debug errors without you writing glue code. |
| Teams shipping M-Pesa integrations | Use this as your payment backend for AI agents — customer support bots, checkout assistants, or automation workflows that need to trigger and verify payments. |
| Solo developers exploring MCP | Fork, deploy to sandbox with test credentials, and experiment with how AI tools interact with real payment APIs in a safe environment. |
```text
You (developer)
├── Clone this repo
├── Add your Daraja credentials (consumer key, secret, passkey)
├── Deploy to YOUR Cloudflare account (free tier works)
└── Connect your AI tools to YOUR worker URL
```
Each deployment is isolated. Your credentials, KV data, rate limits, and transaction logs belong entirely to you. Nobody else can access your instance.
If you are new to MCP, read this first:
In simple terms, this project lets an AI assistant safely do common M-Pesa payment tasks through structured tools.
MCP (Model Context Protocol) is a standard way for AI assistants to call tools.
An MCP server is a service that:
- defines tools with clear input/output rules
- receives tool calls from AI clients
- executes real business actions
- returns structured responses
In this project, the tools are payment-focused actions like starting STK push and checking transaction status.
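To make that concrete, here is a minimal sketch of what a tool call and its structured response could look like. The JSON-RPC shapes follow the general MCP convention; the tool name is real, but the argument name and result fields are illustrative assumptions, not this server's exact schema (see docs/ARCHITECTURE.md for the real contract):

```javascript
// Illustrative MCP-style tool call (MCP clients speak JSON-RPC 2.0 under the hood).
// "checkoutRequestId" is a hypothetical argument name used only for this sketch.
const toolCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "check_transaction_status",
    arguments: { checkoutRequestId: "ws_CO_12345" },
  },
};

// A structured result the AI client can reason over, rather than free-form text.
const toolResult = {
  jsonrpc: "2.0",
  id: 1,
  result: {
    content: [{ type: "text", text: JSON.stringify({ status: "COMPLETED" }) }],
  },
};

console.log(toolCall.params.name, JSON.parse(toolResult.result.content[0].text).status);
```

The point of the structure is that both sides are machine-checkable: the client knows which tool ran, and the response can be parsed instead of scraped out of prose.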
Without this server, teams often hardcode payment logic inside chatbot prompts or app glue code.
This server gives you:
- reusable payment tools any MCP-compatible client can call
- safer operations via API key auth and rate limits
- reliable transaction traceability in KV logs
- cleaner separation between AI behavior and payment infrastructure
Example use cases:
- AI customer support can trigger STK push after user confirmation.
- An AI checkout assistant can verify whether a payment completed.
- An operations assistant can explain Daraja error codes quickly.
- Developers can simulate payment flows in sandbox without real charges.
- Your AI client sends a tool call to this server at `/mcp`.
- The server validates auth and usage limits.
- The selected tool runs Daraja API logic.
- Results are returned as structured JSON for the AI client.
- Callback updates and transaction metadata are stored in KV.
```mermaid
flowchart LR
    A[AI Client] -->|MCP call| B[Daraja MCP Server]
    B --> C[Auth and Rate Limit]
    B --> D[Tool Handlers]
    D --> E[Daraja API]
    D --> F[Cloudflare KV]
    E --> G["/callback endpoint"]
    G --> F
```
For full architecture, sequence diagrams, and examples, see docs/ARCHITECTURE.md.
For detailed setup across MCP consumers (VS Code, stdio-first hosts, and custom clients), see docs/MCP_CONSUMERS.md.
List tools:

```shell
curl -H "x-api-key: <your_api_key>" https://<your-domain>/mcp/tools
```

Health check:

```shell
curl https://<your-domain>/health
```

See end-to-end illustrations and tool input/output examples in docs/ARCHITECTURE.md.
- Install dependencies: `npm install`
- Create local vars template: `npm run setup:local`
- Fill real values in `.dev.vars`
- Validate config: `npm run doctor`
- Start local server: `npm run dev`
- Check health: `GET /health`
If you want a strict config check that fails on missing required values, run `npm run doctor -- --strict`.
Legend: Beginner = no prior MCP knowledge needed. Intermediate = some prior context helps.
MCP fundamentals:
- MCP introduction (Beginner, ~10 min): https://modelcontextprotocol.io/introduction
- MCP architecture concepts (Beginner, ~15 min): https://modelcontextprotocol.io/docs/learn/architecture
- MCP specification (Intermediate, ~20-30 min skim): https://spec.modelcontextprotocol.io/
Cloudflare basics for this project:
- Cloudflare Workers overview (Beginner, ~10 min): https://developers.cloudflare.com/workers/
- Wrangler CLI docs (Beginner, ~15 min): https://developers.cloudflare.com/workers/wrangler/
- Workers KV getting started (Beginner, ~15 min): https://developers.cloudflare.com/kv/get-started/
- Workers bindings overview (Intermediate, ~15 min): https://developers.cloudflare.com/workers/runtime-apis/bindings/
Daraja and M-Pesa docs:
- Safaricom developer portal (Beginner, ~5 min): https://developer.safaricom.co.ke/
- Daraja APIs catalog (Beginner, ~10 min): https://developer.safaricom.co.ke/apis
- Daraja Getting Started (Beginner, ~15 min): https://developer.safaricom.co.ke/apis/GettingStarted
- Daraja Authorization OAuth (Intermediate, ~10 min): https://developer.safaricom.co.ke/apis/Authorization
MCP clients and tooling:
- Use MCP servers in VS Code (Beginner, ~15 min): https://code.visualstudio.com/docs/copilot/chat/mcp-servers
- MCP configuration in VS Code (Intermediate, ~15 min): https://code.visualstudio.com/docs/copilot/reference/mcp-configuration
Diagrams and visuals:
- Mermaid intro (Beginner, ~10 min): https://mermaid.js.org/intro/
- Mermaid live editor (Beginner, hands-on): https://mermaid.live/edit
Implemented:

- Commit 1 - Project Bootstrap
- Commit 2 - MCP Server Setup
- Commit 3 - API Key Auth
- Commit 4 - Rate Limiting (KV)
- Commit 5 - OAuth Token (Daraja)
- Commit 6 - STK Push
- Commit 7 - Transaction Status
- Commit 8 - Payment Verification Layer
- Commit 9 - Callback Handler
- Commit 10 - Simulation Tool
- Commit 11 - Error Intelligence
- Commit 12 - Workers AI Integration
- Commit 13 - Logging + Observability
- Commit 14 - Agents Integration (Future)
- Cloudflare Worker project scaffold
- Basic `fetch` handler
- Health endpoint: `GET /health`
- MCP SDK integrated (`@modelcontextprotocol/sdk`)
- MCP server configured as `daraja-mcp-server` v1.1.0
- Basic tool registration with initial `get_usage_status` tool
- MCP transport endpoint: `/mcp`
- Tool discovery endpoint: `GET /mcp/tools`
- API key auth middleware for protected routes via `x-api-key`
- KV-backed daily rate limiting middleware (`USAGE` namespace)
- Request limit: 50 requests per UTC day
- Daraja OAuth token tool: `get_access_token`
- Token caching in KV (`TOKENS` namespace)
- STK Push tool: `stk_push`
- Daraja STK password generation: `Base64(shortCode + passkey + timestamp)`
- STK request/response logging in KV (`TRANSACTIONS` namespace)
- Transaction status tool: `check_transaction_status`
- Normalized response fields: `status`, `resultCode`, `responseCode`, `isComplete`
- Payment verification tool: `verify_payment_intent`
- Verification checks: amount matching and optional phone matching
- Callback endpoint: `POST /callback`
- Callback payload storage in KV (`CALLBACKS` namespace)
- Development simulation tool: `simulate_payment` (no external API calls)
- Daraja error explanation tool: `explain_error_code`
- Transaction log summary tool: `summarize_transaction_logs`
- Optional Workers AI enhancement for natural language summaries
- Structured request and error logging utilities
- `DEBUG_MODE=true` enables request/error log emission
- Orchestration planning tool: `orchestrate_payment_workflow`
- Provides agent-to-agent payment workflow plans for future Cloudflare Agents runtime
- Protected routes require header: `x-api-key: <your_api_key>`
- Public route: `GET /health`
Set API key secret before deploy:

```shell
wrangler secret put API_KEY
```

Create a KV namespace and bind it as USAGE in your wrangler.toml:

```toml
[[kv_namespaces]]
binding = "USAGE"
id = "<your-usage-kv-namespace-id>"
preview_id = "<your-usage-kv-preview-id>"
```

If the daily quota is exhausted, protected endpoints return 429.
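The limiter behaves like a per-key daily counter in KV. A minimal sketch of that logic under stated assumptions (the key layout and return shape are illustrative; the repository's actual middleware may differ):

```javascript
// Sketch of KV-backed daily rate limiting (key format and return shape are assumptions).
// Note: Workers KV is eventually consistent, so real limiting is best-effort.
async function checkDailyLimit(kv, apiKey, limit = 50) {
  const day = new Date().toISOString().slice(0, 10); // UTC day bucket, e.g. "2026-03-22"
  const key = `usage:${apiKey}:${day}`;
  const count = parseInt((await kv.get(key)) ?? "0", 10);
  if (count >= limit) {
    return { allowed: false, status: 429 };
  }
  await kv.put(key, String(count + 1));
  return { allowed: true, remaining: limit - count - 1 };
}

// Map-backed stub standing in for the Workers KV binding, for local experiments.
const stub = {
  store: new Map(),
  async get(k) { return this.store.get(k) ?? null; },
  async put(k, v) { this.store.set(k, v); },
};
```

With the stub, the 51st call for the same API key in the same UTC day comes back `{ allowed: false, status: 429 }`, matching the documented 429 behavior.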
Required secrets:

```shell
wrangler secret put DARAJA_CONSUMER_KEY
wrangler secret put DARAJA_CONSUMER_SECRET
```

Optional secrets/vars:

- `DARAJA_ENV=sandbox` (default) or `DARAJA_ENV=production`
- `DARAJA_BASE_URL`: custom override for the Daraja base URL
Add token cache KV binding in wrangler.toml:

```toml
[[kv_namespaces]]
binding = "TOKENS"
id = "<your-tokens-kv-namespace-id>"
preview_id = "<your-tokens-kv-preview-id>"
```

Required configuration:

```shell
wrangler secret put DARAJA_SHORTCODE
wrangler secret put DARAJA_PASSKEY
wrangler secret put DARAJA_CALLBACK_URL
```

Important notes:
- A callback endpoint is required for end-to-end STK flow because final payment outcomes are sent asynchronously by Daraja.
- This server already implements the callback route at POST /callback.
- Set DARAJA_CALLBACK_URL to a real public HTTPS URL, for example https://<your-worker-domain>/callback.
- For sandbox, use the Lipa Na M-Pesa Online passkey for shortcode 174379. Do not use Security Credential for STK password generation.
- STK password formula is Base64(shortCode + passkey + timestamp), where timestamp format is YYYYMMDDHHmmss.
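The formula above can be reproduced in a few lines. A sketch (the sandbox shortcode 174379 comes from the note above; the passkey value and timestamp here are placeholders):

```javascript
// Daraja STK password: Base64(shortCode + passkey + timestamp),
// with timestamp formatted as YYYYMMDDHHmmss.
// In a Worker you could use btoa() instead of Node's Buffer.
function stkPassword(shortCode, passkey, timestamp) {
  return Buffer.from(`${shortCode}${passkey}${timestamp}`).toString("base64");
}

// Example with the public sandbox shortcode and a placeholder passkey.
const password = stkPassword("174379", "<your-passkey>", "20260322000000");
console.log(password);
```

Decoding the result should give back the exact `shortCode + passkey + timestamp` concatenation; if it does not, the request will fail Daraja-side validation.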
Optional:
- `DARAJA_TRANSACTION_TYPE=CustomerPayBillOnline` (default) or `CustomerBuyGoodsOnline`
Add transaction log KV binding in wrangler.toml:

```toml
[[kv_namespaces]]
binding = "TRANSACTIONS"
id = "<your-transactions-kv-namespace-id>"
preview_id = "<your-transactions-kv-preview-id>"
```

stk_push input fields:

- `amount`
- `phoneNumber`
- `accountReference`
- `transactionDesc`
- `callbackUrl` (optional override)
- `transactionType` (optional override)
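As an illustration, an stk_push tool call could carry an arguments object like the following. The field names are the documented inputs above; every value is a made-up placeholder:

```javascript
// Example stk_push arguments (values are placeholders, not real numbers or references).
const stkPushArgs = {
  amount: 10,                  // whole KES amount
  phoneNumber: "2547XXXXXXXX", // MSISDN in 254... format, masked here
  accountReference: "INV-001",
  transactionDesc: "Test payment",
  // callbackUrl and transactionType are optional overrides, omitted in this sketch.
};

// Minimal local sanity check before sending the tool call.
const requiredFields = ["amount", "phoneNumber", "accountReference", "transactionDesc"];
const missing = requiredFields.filter((k) => stkPushArgs[k] === undefined);
console.log(missing.length === 0 ? "ok" : `missing: ${missing.join(",")}`);
```

Validating the argument object client-side keeps obviously malformed calls from consuming your daily request quota.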
Set your Daraja callback to this endpoint:
https://<your-worker-domain>/callback
The callback endpoint stays unauthenticated by design so Safaricom can deliver payment updates.
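Daraja delivers STK results as a nested `Body.stkCallback` JSON payload. A sketch of reducing it to the fields a status check cares about (the payload shape mirrors Daraja's documented sandbox callbacks, but the values are made up; verify against callbacks you actually capture):

```javascript
// Sample Daraja STK callback payload (abridged; values are illustrative).
const callbackPayload = {
  Body: {
    stkCallback: {
      MerchantRequestID: "29115-34620561-1",
      CheckoutRequestID: "ws_CO_191220191020363925",
      ResultCode: 0, // 0 means the payment succeeded
      ResultDesc: "The service request is processed successfully.",
    },
  },
};

// Reduce the callback to a small summary object.
function summarizeCallback(payload) {
  const cb = payload?.Body?.stkCallback ?? {};
  return {
    checkoutRequestId: cb.CheckoutRequestID,
    success: cb.ResultCode === 0,
    description: cb.ResultDesc,
  };
}

console.log(summarizeCallback(callbackPayload));
```

Because the endpoint is unauthenticated, treat the payload defensively: the optional chaining above makes an empty or malformed body summarize as a non-success rather than throw.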
Add callback storage KV binding in wrangler.toml:

```toml
[[kv_namespaces]]
binding = "CALLBACKS"
id = "<your-callbacks-kv-namespace-id>"
preview_id = "<your-callbacks-kv-preview-id>"
```

Quick local setup:

```shell
npm run setup:local
npm install
npm run doctor
```

The doctor command checks required Daraja and API key variables before starting the worker.
Use `npm run doctor -- --strict` when you want missing required keys to fail fast.
```shell
npm install
npm run dev
```

Run tests:

```shell
npm test
```

Install coverage tooling (if starting from a minimal setup):

```shell
npm install --save-dev vitest @vitest/coverage-v8
```

Generate coverage output:

```shell
npx vitest run --coverage
```

Generate and refresh coverage report in README:

```shell
npm run coverage:update
```

Last updated: 2026-03-22T00:35:43.569Z
| Metric | Coverage | Covered | Total |
|---|---|---|---|
| Statements | 80.69% | 347 | 430 |
| Branches | 61.98% | 225 | 363 |
| Functions | 84.50% | 60 | 71 |
| Lines | 80.42% | 337 | 419 |
Refresh with `npm run coverage:update`.
Codecov upload is intentionally done outside CircleCI in this repository.
Best method for this project: manual local upload after coverage generation, to keep CI lean and avoid token handling in pipeline jobs.
If you later choose CI-based upload, use the same verified CLI flow in your CI runner and keep CODECOV_TOKEN only in CI secrets.
Recommended flow:

- Generate coverage: `npx vitest run --coverage`
- Verify and install the Codecov CLI for your OS
- Upload with token: `./codecov upload-process -t "$CODECOV_TOKEN" -f coverage/coverage-final.json -F vitest`

Windows (PowerShell):
```powershell
$ProgressPreference = 'SilentlyContinue'
Invoke-WebRequest -Uri https://keybase.io/codecovsecurity/pgp_keys.asc -OutFile codecov.asc
gpg.exe --import codecov.asc
Invoke-WebRequest -Uri https://cli.codecov.io/latest/windows/codecov.exe -OutFile codecov.exe
Invoke-WebRequest -Uri https://cli.codecov.io/latest/windows/codecov.exe.SHA256SUM -OutFile codecov.exe.SHA256SUM
Invoke-WebRequest -Uri https://cli.codecov.io/latest/windows/codecov.exe.SHA256SUM.sig -OutFile codecov.exe.SHA256SUM.sig
gpg.exe --verify codecov.exe.SHA256SUM.sig codecov.exe.SHA256SUM
if ((Compare-Object -ReferenceObject ((($(certUtil -hashfile codecov.exe SHA256)[1]), 'codecov.exe') -join ' ') -DifferenceObject (Get-Content codecov.exe.SHA256SUM)).Length -eq 0) {
  Write-Output 'SHASUM verified'
} else {
  exit 1
}
.\codecov.exe upload-process -t "$env:CODECOV_TOKEN" -f coverage/coverage-final.json -F vitest
```

Linux:
```shell
curl https://keybase.io/codecovsecurity/pgp_keys.asc | gpg --no-default-keyring --keyring trustedkeys.gpg --import
curl -Os https://cli.codecov.io/latest/linux/codecov
curl -Os https://cli.codecov.io/latest/linux/codecov.SHA256SUM
curl -Os https://cli.codecov.io/latest/linux/codecov.SHA256SUM.sig
gpg --verify codecov.SHA256SUM.sig codecov.SHA256SUM
shasum -a 256 -c codecov.SHA256SUM
sudo chmod +x codecov
./codecov upload-process -t "$CODECOV_TOKEN" -f coverage/coverage-final.json -F vitest
```

macOS:
```shell
curl https://keybase.io/codecovsecurity/pgp_keys.asc | gpg --no-default-keyring --keyring trustedkeys.gpg --import
curl -Os https://cli.codecov.io/latest/macos/codecov
curl -Os https://cli.codecov.io/latest/macos/codecov.SHA256SUM
curl -Os https://cli.codecov.io/latest/macos/codecov.SHA256SUM.sig
gpg --verify codecov.SHA256SUM.sig codecov.SHA256SUM
shasum -a 256 -c codecov.SHA256SUM
sudo chmod +x codecov
./codecov upload-process -t "$CODECOV_TOKEN" -f coverage/coverage-final.json -F vitest
```

This repository now includes CircleCI pipeline config at .circleci/config.yml.
Pipeline behavior:

- CI on branches and tags: `npm run check`, `npm test`, `npm run test:e2e`, `npm run test:coverage`; coverage artifacts stored in CircleCI
- Codecov upload via the CircleCI Codecov orb (`codecov/codecov@5`)
- CD to sandbox on the `main` branch
- Production deployment is intentionally disabled in CircleCI.
- Smoke tests are currently disabled in the workflow.
Required CircleCI environment variables:
- `CLOUDFLARE_API_TOKEN`
- `CLOUDFLARE_ACCOUNT_ID`
Required runtime secret variables for sandbox deploy (auto-synced to Worker secrets):
- `SANDBOX_API_KEY` -> `API_KEY`
- `SANDBOX_DARAJA_CONSUMER_KEY` -> `DARAJA_CONSUMER_KEY`
- `SANDBOX_DARAJA_CONSUMER_SECRET` -> `DARAJA_CONSUMER_SECRET`
- `SANDBOX_DARAJA_SHORTCODE` -> `DARAJA_SHORTCODE`
- `SANDBOX_DARAJA_PASSKEY` -> `DARAJA_PASSKEY`
- `SANDBOX_DARAJA_CALLBACK_URL` -> `DARAJA_CALLBACK_URL`
Required sandbox KV namespace variables for deploy config generation:
- `SANDBOX_USAGE_KV_ID`
- `SANDBOX_USAGE_KV_PREVIEW_ID`
- `SANDBOX_TOKENS_KV_ID`
- `SANDBOX_TOKENS_KV_PREVIEW_ID`
- `SANDBOX_TRANSACTIONS_KV_ID`
- `SANDBOX_TRANSACTIONS_KV_PREVIEW_ID`
- `SANDBOX_CALLBACKS_KV_ID`
- `SANDBOX_CALLBACKS_KV_PREVIEW_ID`
Optional sandbox runtime variables:
- `SANDBOX_DARAJA_ENV` -> `DARAJA_ENV`
- `SANDBOX_DARAJA_BASE_URL` -> `DARAJA_BASE_URL`
- `SANDBOX_DARAJA_TRANSACTION_TYPE` -> `DARAJA_TRANSACTION_TYPE`
Use the "Codecov CLI Upload (Manual, Local OS)" section above when you need local/manual upload.
Required Codecov variable in CircleCI:
- `CODECOV_TOKEN`
This repository uses Husky pre-commit hooks.
- Hook file: `.husky/pre-commit`
- Runs before each commit: `npm run check` and `npm test`
For security, wrangler.toml does not store real KV namespace IDs.
Generate a temporary deploy config from environment variables and deploy using that generated file.
Example (PowerShell):
```powershell
$env:SANDBOX_USAGE_KV_ID="<kv-id>"
$env:SANDBOX_USAGE_KV_PREVIEW_ID="<kv-preview-id>"
$env:SANDBOX_TOKENS_KV_ID="<kv-id>"
$env:SANDBOX_TOKENS_KV_PREVIEW_ID="<kv-preview-id>"
$env:SANDBOX_TRANSACTIONS_KV_ID="<kv-id>"
$env:SANDBOX_TRANSACTIONS_KV_PREVIEW_ID="<kv-preview-id>"
$env:SANDBOX_CALLBACKS_KV_ID="<kv-id>"
$env:SANDBOX_CALLBACKS_KV_PREVIEW_ID="<kv-preview-id>"
node scripts/generate-wrangler-sandbox-config.mjs
npx wrangler deploy --name daraja-mcp-server-sandbox -c .wrangler.sandbox.toml
```

Production deploy:

```shell
npm run deploy
```

Use the production checklist in docs/RELEASE_RUNBOOK.md for:
- pre-release validation (`npm run check`, tests, Terraform validate)
- Cloudflare secrets and bindings verification
- deployment sequence (`terraform apply` and `npm run deploy`)
- post-deploy smoke tests for `/health`, `/mcp/tools`, and callback routing
Release governance documents:
Example `GET /health` response:

```json
{
  "ok": true,
  "service": "daraja-mcp-server",
  "status": "healthy",
  "timestamp": "2026-03-22T00:00:00.000Z"
}
```

After deploying to your own Cloudflare account, connect your AI coding assistant or production agent to your Worker URL.
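A post-deploy smoke check can assert on that health shape. A sketch (the BASE_URL environment variable and the fetch-based usage are assumptions; the field checks come from the example response above):

```javascript
// Validate a /health response body (shape taken from the documented example).
function isHealthy(body) {
  return body.ok === true
    && body.service === "daraja-mcp-server"
    && body.status === "healthy"
    && !Number.isNaN(Date.parse(body.timestamp));
}

// Example usage against a live deployment (BASE_URL is an assumed env var):
//   const res = await fetch(`${process.env.BASE_URL}/health`);
//   if (!isHealthy(await res.json())) throw new Error("health check failed");

console.log(isHealthy({
  ok: true,
  service: "daraja-mcp-server",
  status: "healthy",
  timestamp: "2026-03-22T00:00:00.000Z",
}));
```

Checking the timestamp parses, rather than matching it exactly, keeps the smoke test stable across deploys.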
For developers: Point your IDE's MCP client at your deployed server so your AI assistant can test M-Pesa flows, debug error codes, and verify transactions — no context-switching to Postman or writing throwaway scripts.
For production AI agents: Connect your chatbot, customer support agent, or automation workflow so it can initiate payments, check status, and handle M-Pesa operations on behalf of real users.
Copy-paste config templates are in examples/:
| Platform | Config File | Setup |
|---|---|---|
| VS Code (Copilot Chat) | `examples/vscode-mcp.json` | Copy to `.vscode/mcp.json` |
| Claude Code | `examples/claude-code-mcp.json` | Copy to `.mcp.json` or `claude mcp add` |
| Claude Desktop | `examples/claude-desktop-config.json` | Merge into `claude_desktop_config.json` |
| Cursor | `examples/cursor-mcp.json` | Copy to `.cursor/mcp.json` |
| Windsurf | `examples/windsurf-mcp.json` | Add via Windsurf MCP settings |
| OpenAI Codex | `examples/codex-consumer-config.toml` | Add to `.codex/config.toml` |
All templates use https://<your-domain>/mcp — replace with your deployed Worker URL.
For detailed setup instructions per platform, see docs/MCP_CONSUMERS.md.
This project is fully configured for OpenAI Codex (CLI, IDE extension, and Codex app).
```shell
npm install -g @openai/codex
cd daraja_mcp_server
codex
```

Codex automatically reads AGENTS.md and loads project-scoped configuration from .codex/config.toml.
| Component | Location | Purpose |
|---|---|---|
| `AGENTS.md` | Root | Project context, conventions, commands |
| `.codex/config.toml` | Project | MCP servers, agent config, skills |
| `.codex/agents/` | Project | Custom agent definitions (6 agents) |
| `.agents/skills/` | Project | Reusable workflow skills (6 skills) |
| `.codex/PLANS.md` | Project | Planning template for complex tasks |
| `.codex/code_review.md` | Project | Review standards reference |
| `docs/CODEX_GUIDE.md` | Docs | Full Codex usage guide |
```text
# Full codebase analysis
Analyze this codebase. Spawn codebase_analyst, reviewer, and pr_explorer in parallel.

# PR review
Review this branch against main with security, correctness, and test coverage agents.

# TDD implementation
$tdd-workflow — Fix [describe issue] with full TDD.
```
See docs/CODEX_GUIDE.md for complete workflow documentation.