This project provides a Model Context Protocol (MCP) server for the TALXIS CLI, enabling dynamic discovery and invocation of CLI tools via the MCP stdio transport. The MCP server is distributed as a global .NET tool and can be easily integrated with GitHub Copilot and other MCP-compatible tools.
- Tool Discovery & Execution — Dynamically discovers and exposes all TALXIS CLI commands as MCP tools with typed input schemas
- Task-Augmented Execution — Long-running tools (deploy, import) support the MCP tasks protocol for async "call-now, fetch-later" execution with cancellation support
- Structured Logging — Streams real-time log output from CLI subprocesses to MCP clients via `notifications/message`
- Progress Notifications — Emits `notifications/progress` during long-running tool calls when the client provides a `progressToken`, with rate limiting to avoid flooding
- Workspace Roots — Requests the client's workspace roots via `roots/list` and uses the primary root as the subprocess working directory
- Log Redaction — Automatically redacts passwords, tokens, and filesystem paths from log output before forwarding to clients
- Copilot Instructions — Manages `.github/copilot-instructions.md` files to guide AI assistants working with TALXIS repositories
Install the MCP server as a global .NET tool (this will add the `txc-mcp` alias):

```shell
dotnet tool install --global TALXIS.CLI.MCP
```

Update to the latest version:

```shell
dotnet tool update --global TALXIS.CLI.MCP
```

See the package on NuGet: TALXIS.CLI.MCP
Click the link below to automatically add the TALXIS CLI MCP server to your VS Code configuration:
📦 Install TALXIS CLI MCP Server
This will add the server to your user configuration, making it available across all workspaces.
Alternatively, you can manually create a `.vscode/mcp.json` file in your project with the following content:
```json
{
  "inputs": [],
  "servers": {
    "TALXIS CLI Public": {
      "type": "stdio",
      "command": "txc-mcp"
    }
  }
}
```

This will allow GitHub Copilot and other tools to discover and invoke TALXIS CLI commands via MCP.
When developing the MCP server locally, you can run it directly from source and configure VS Code to use your local build. In your `.vscode/mcp.json`, set the server command to launch the project via `dotnet run`:
```json
{
  "inputs": [],
  "servers": {
    "TALXIS CLI Dev": {
      "type": "stdio",
      "command": "dotnet",
      "args": [
        "run",
        "--project",
        "${workspaceFolder}/src/TALXIS.CLI.MCP/TALXIS.CLI.MCP.csproj"
      ]
    }
  }
}
```

- Adjust the path in `args` to match your local project location if needed.
- This setup allows you to test changes without reinstalling the global tool.
You can use the Model Context Protocol Inspector for interactive inspection:
```shell
npx @modelcontextprotocol/inspector dotnet run --project src/TALXIS.CLI.MCP
```

Note: The Inspector is an interactive, browser-based application designed for manual testing and exploration. It is not suitable for automated testing scenarios.
For debugging or automated testing, you can interact with the MCP server using JSON-RPC messages over stdin/stdout:
```shell
# Start the server
dotnet run --project src/TALXIS.CLI.MCP

# Then send JSON-RPC messages via stdin (one per line):

# 1. Initialize the connection (required by the MCP protocol)
{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {"protocolVersion": "2025-06-18", "capabilities": {}, "clientInfo": {"name": "test-client", "version": "1.0.0"}}}

# 2. List available tools
{"jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {}}

# 3. Call a specific tool (example)
{"jsonrpc": "2.0", "id": 3, "method": "tools/call", "params": {"name": "tool-name", "arguments": {}}}

# 4. Call a tool with progress tracking
{"jsonrpc": "2.0", "id": 4, "method": "tools/call", "params": {"name": "environment_deploy", "arguments": {"Package": "MyPackage"}, "_meta": {"progressToken": "my-progress-1"}}}

# 5. Call a long-running tool as a task (returns immediately with a task ID)
{"jsonrpc": "2.0", "id": 5, "method": "tools/call", "params": {"name": "environment_deploy", "arguments": {"Package": "MyPackage"}, "task": {}}}

# 6. Poll task status
{"jsonrpc": "2.0", "id": 6, "method": "tasks/get", "params": {"taskId": "<task-id-from-step-5>"}}

# 7. Get task result (blocks until complete)
{"jsonrpc": "2.0", "id": 7, "method": "tasks/result", "params": {"taskId": "<task-id-from-step-5>"}}
```

To list available tools:

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}
```

To call a tool (replace arguments as needed):

```json
{"jsonrpc": "2.0", "id": 2, "method": "tools/call", "params": {"name": "workspace_component_create", "arguments": {"ShortName": "pp-entity", "name": "TestEntity", "Param": ["EntityType=InvalidType"]}}}
```

The MCP server never runs an interactive auth flow. Every `txc` tool subprocess spawned by this server is forced headless — stdout is reserved for JSON-RPC frames and the child has stdin/stdout redirected, so a browser, device-code, or masked-prompt attempt would either hang the session or corrupt the transport. `CliSubprocessRunner` sets `TXC_NON_INTERACTIVE=1` unconditionally on every spawn.
Prerequisites: before invoking any tool that touches a Dataverse environment (or any other Connection-bound tool), complete these one-time steps:

- On the human's machine, run `txc config auth login` (interactive browser) or `txc config auth add-service-principal` once to register a credential and prime the MSAL token cache.
- Run `txc config connection create <name> --provider dataverse ...` to register the endpoint.
- Run `txc config profile create <name> --auth <alias> --connection <name>` to bind them, then `txc config profile select <name>` (or pin the workspace via `txc config profile pin`).
After that, MCP tool calls resolve the active profile silently via the acquired token cache or stored SPN secret. If resolution fails (expired refresh token, missing credential, broken config), the subprocess exits non-zero and the MCP server surfaces the error through the tool-call result; the structured log line includes the fail-fast remedy string (`txc config profile validate <name>`).
Any Connection-touching MCP tool accepts an optional `profile` argument, which is forwarded to the child as `--profile <name>`. This lets a single MCP session switch between profiles per call without restarting the server — e.g. an agent can target `customer-a-dev` for one tool call and `customer-b-prod` for the next.
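For example, two consecutive calls in one session can target different profiles. These frames assume the argument is named `profile` as described above; the tool name and `Package` value echo the earlier examples, and the profile names are illustrative:

```json
{"jsonrpc": "2.0", "id": 10, "method": "tools/call", "params": {"name": "environment_deploy", "arguments": {"Package": "MyPackage", "profile": "customer-a-dev"}}}
{"jsonrpc": "2.0", "id": 11, "method": "tools/call", "params": {"name": "environment_deploy", "arguments": {"Package": "MyPackage", "profile": "customer-b-prod"}}}
```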
The MCP server inherits the client's environment, and each child CLI subprocess inherits the server's. The following variables are meaningful to `txc`; anything else is ignored:

| Variable | Purpose |
|---|---|
| `TXC_PROFILE` | Selects the active profile for every call in this MCP session (overridden by a per-call `profile` argument when supplied). |
| `TXC_CONFIG_DIR` | Overrides the entire config directory (total CI isolation). |
| `TXC_NON_INTERACTIVE` | Redundant here — MCP already forces this to `1`. |
| `TXC_LOG_FORMAT` | Redundant here — MCP already forces `json`. |
| `AZURE_CLIENT_ID`, `AZURE_CLIENT_SECRET`, `AZURE_TENANT_ID` | Service-principal fallback for CI / headless runners. |
| `AZURE_FEDERATED_TOKEN_FILE` | Kubernetes / Azure workload-identity federation. |
| `ACTIONS_ID_TOKEN_REQUEST_URL`, `ACTIONS_ID_TOKEN_REQUEST_TOKEN` | GitHub Actions OIDC federation. |
| `TXC_ADO_ID_TOKEN_REQUEST_URL`, `TXC_ADO_ID_TOKEN_REQUEST_TOKEN` | Azure DevOps pipelines workload-identity federation (legacy `PAC_ADO_*` also honored). |
Secrets (client secrets, PATs, certificate passwords) are never accepted as plain MCP tool arguments — they are stored in the OS-level secret vault via `txc config auth add-service-principal` and referenced from config by `SecretRef` handle only.
`McpLogForwarder` runs every child stderr JSON log line through `LogRedactionFilter` before emitting `notifications/message` to the MCP client. Patterns covered: `Bearer <token>`, `Authorization:` headers, bare JWTs, connection-string secret keys (`Password`, `ClientSecret`, `AccessToken`, `RefreshToken`, etc.), and URL query-param secrets (`code=`, `token=`, `access_token=`). This is the last line of defense against accidental `logger.LogError(ex, ...)` leaks where the Dataverse or Microsoft.Identity SDKs embed tokens inside exception messages.
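A minimal sketch of this style of filter, with illustrative regexes — the actual `LogRedactionFilter` rules are internal to the server and may differ:

```python
import re

# Illustrative patterns only; not the server's real rule set.
_PATTERNS = [
    # Bearer tokens, including bare JWT-shaped values after "Bearer"
    (re.compile(r"Bearer\s+[A-Za-z0-9._~+/-]+=*"), "Bearer [REDACTED]"),
    # Connection-string secret keys: Password=..., ClientSecret=..., etc.
    (re.compile(r'(?i)(Password|ClientSecret|AccessToken|RefreshToken)\s*=\s*[^;\s"]+'),
     r"\1=[REDACTED]"),
    # URL query-param secrets: ?code=..., &token=..., &access_token=...
    (re.compile(r"(?i)([?&](?:code|token|access_token)=)[^&\s]+"), r"\1[REDACTED]"),
]

def redact(line: str) -> str:
    """Replace secret-bearing substrings in one log line before forwarding."""
    for pattern, repl in _PATTERNS:
        line = pattern.sub(repl, line)
    return line
```

Applying the substitutions in a fixed order keeps the filter deterministic, which matters when redacted output is compared across test runs.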
When the HTTP/SSE transport ships, `txc-mcp` will be an OAuth resource server only — Entra ID remains the authorization server. The following patterns are forbidden by the MCP spec and will not be implemented:

- Token passthrough: never forward a client's bearer token directly to Dataverse — always use on-behalf-of or a separate service account.
- Tokens in URIs: never embed access tokens in query strings.
- Non-HTTPS redirects (except loopback for local dev).
- Missing PKCE: all OAuth flows must use PKCE.
- Missing audience check: tokens must be audience-bound to the `txc-mcp` resource URI (RFC 8707).

See docs/mcp-http-auth-notes.md (design notes, no code in v1).
MCP clients like GitHub Copilot have practical limits on the number of tools they can handle effectively. With 128+ CLI commands registered as MCP tools, the full catalog causes context bloat — the client's LLM spends tokens parsing tool schemas instead of reasoning about the user's task.
txc-mcp uses progressive disclosure to expose only a small set of always-on tools at startup, then dynamically injects additional tools as needed:
Always-on tools (9 total):

- 6 guide tools — `guide`, `guide_workspace`, `guide_environment`, `guide_deployment`, `guide_data`, `guide_config` — domain-scoped tool discovery via LLM sampling
- `execute_operation` — bridges same-turn execution for tools discovered by a guide
- `get_skill_details` — retrieves public developer-facing skills (documentation, best practices)
- `copilot-instructions` — manages `.github/copilot-instructions.md` files for AI assistants
The discovery flow works as follows:

- Guide call — The client calls a guide tool (e.g., `guide_workspace`) with a natural language query describing the task.
- Sampling — The guide sends a `sampling/createMessage` request to the client's LLM with the full internal catalog, asking it to select the most relevant tools. Internal reasoning skills are injected for domain-specific expertise.
- Same-turn execution — The guide returns structured results with tool names and schemas. The client can immediately call `execute_operation` with the selected tool name and arguments — no round-trip needed.
- Direct calls (next turn) — Matched tools are injected into the `ActiveToolSet`. Clients discover injected tools by re-fetching `tools/list` on subsequent turns, and can then call them directly by name.
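On the wire, that flow might look like the frames below. The argument shapes are hypothetical — `query` for the guide, and a nested tool name/arguments pair for `execute_operation`; the actual schemas come from `tools/list`:

```json
{"jsonrpc": "2.0", "id": 20, "method": "tools/call", "params": {"name": "guide_workspace", "arguments": {"query": "scaffold a new Dataverse entity in my solution"}}}
{"jsonrpc": "2.0", "id": 21, "method": "tools/call", "params": {"name": "execute_operation", "arguments": {"tool": "workspace_component_create", "arguments": {"ShortName": "pp-entity", "name": "TestEntity"}}}}
```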
`txc-mcp` has a two-tier skills system:

- Internal skills (proprietary) — Embedded `.md` files in `Skills/Internal/` containing Power Platform development expertise, decision trees, and workflow patterns. These are loaded by `GuideReasoningEngine` and injected into sampling prompts. They are never exposed to clients — they guide the server's own reasoning.
- Public skills (developer-facing) — Loaded from `TALXIS.CLI.Features.Docs` embedded resources. Accessible via the `get_skill_details` MCP tool and the `txc docs` CLI command. These contain documentation, best practices, and how-to guides for developers.
Guide tools and internal skills encode a local-first philosophy: prefer local workspace operations (scaffolding, validation, schema inspection) over live environment operations (API calls to Dataverse). Environment operations take 30 seconds to 5 minutes and should only be used for inspection, troubleshooting, or deployment after local validation is complete.
For more details, see the main TALXIS CLI README.