fix: runtime detection priority, resolve binaries from APM runtimes dir, add llm support #608
Conversation
Pull request overview
This PR updates APM’s workflow script runner to improve runtime selection/execution (especially on Windows), adds llm as a supported runtime, and adjusts the Codex setup scripts to address GitHub Models compatibility concerns raised in #605.
Changes:
- Update runtime detection and Windows executable resolution to prefer APM-managed runtimes, and add `llm` support.
- Relax prompt discovery/resolution rules (including allowing symlinks) and improve "prompt not found" messaging.
- Change the Codex setup default versioning and document (intended) GitHub Models compatibility constraints.
| File | Description |
|---|---|
| src/apm_cli/core/script_runner.py | Runtime detection/command generation tweaks, Windows exe resolution, prompt discovery changes, and improved error messaging. |
| scripts/runtime/setup-codex.sh | Changes Codex default version selection and adds compatibility notes. |
| scripts/runtime/setup-codex.ps1 | Same as above for Windows PowerShell installer. |
Copilot's findings
Comments suppressed due to low confidence (3)
src/apm_cli/core/script_runner.py:867
- The runtime detection docstring and logic contain non-ASCII em dashes, which violates the repo's printable-ASCII requirement. Replace these with ASCII-only punctuation (e.g., "--") in docstrings/comments too, since they ship in source files.

```python
    Checks APM-managed runtimes (~/.apm/runtimes/) first, then PATH.
    This ensures explicitly APM-installed binaries take priority over
    system-level stubs (e.g. GitHub CLI copilot extensions).

    Priority:
      1. APM runtimes dir: copilot (codex excluded — v0.116+ is
         incompatible with GitHub Models' Chat Completions API)
      2. PATH: llm > copilot > codex (llm uses Chat Completions, works
         with GitHub Models even when codex dropped that API)
```
src/apm_cli/core/script_runner.py:898
- Non-ASCII em dash in this comment violates the repo's printable-ASCII-only rule for source files. Replace it with ASCII punctuation (e.g., "--").

```python
    # 2. Fall back to PATH — prefer llm (uses Chat Completions, works with
    #    GitHub Models even when codex has dropped that API format)
    if shutil.which("llm"):
```
src/apm_cli/core/script_runner.py:931
- Non-ASCII em dash in this comment violates the repo's printable-ASCII-only rule for source files. Replace it with ASCII punctuation (e.g., "--").

```python
    elif runtime == "llm":
        # llm CLI — uses Chat Completions, compatible with GitHub Models
        return f"llm -m github/gpt-4o {prompt_file}"
```
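The `llm` branch quoted above fits into a per-runtime dispatch roughly like the following sketch. Only the `llm` template comes from the PR; the copilot/codex templates here are placeholders, not the project's real flags.

```python
def generate_runtime_command(runtime: str, prompt_file: str) -> str:
    """Map a detected runtime name to the shell command that runs it."""
    if runtime == "llm":
        # llm CLI uses Chat Completions, compatible with GitHub Models.
        return f"llm -m github/gpt-4o {prompt_file}"
    if runtime in ("copilot", "codex"):
        # Placeholder template; the actual flags differ per runtime.
        return f"{runtime} {prompt_file}"
    # Unknown names surface the same error the PR fixes for llm.
    raise ValueError(f"Unsupported runtime: {runtime}")
```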
- Files reviewed: 3/3 changed files
- Comments generated: 12
sergio-sisternes-epam
left a comment
Thanks for the thorough work here, @edenfunf! We had a parallel fix for #605 in #651, but your PR covers more ground so we're closing ours in favor of this one.
A few things to address:
Must-fix

- Missing `.cmd`/`.bat` in Windows resolution: the runtimes dir check only looks for `{name}` and `{name}.exe`, but `setup-llm.ps1` installs `llm.cmd`. Without including `.cmd` (and `.bat`), `apm run` will fail to find APM-managed runtimes on Windows when `~/.apm/runtimes` isn't in PATH.
- Codex pin is stale: the scripts reference `rust-v0.117.0`, but we've already merged `rust-v0.118.0` with `wire_api="responses"` in #663. Please rebase onto main to pick that up.
- Symlink removal needs containment: removing the blanket symlink rejection is fine, but please add `ensure_path_within()` checks (from `apm_cli.utils.path_security`) after resolving symlink targets. Without that, a malicious dependency could symlink a prompt file to sensitive paths.
Suggestion

- Scope is too wide: this PR mixes a bug fix (#605), a new feature (the `llm` runtime), and a security policy change (symlink handling) in one shot. That makes review harder and increases merge risk. For future contributions, we'd recommend one concern per PR: it's easier to review, faster to merge, and safer to revert if needed. Happy to review this as-is, but please keep this in mind going forward.
…times dir, add llm support

Three related issues caused `apm run start` to fail or pick the wrong runtime:

1. `_detect_installed_runtime` checked `shutil.which()` for all candidates, so a broken copilot stub in the system PATH (e.g. from a gh extension) could be selected over a working APM-managed binary. Changed the priority to check `~/.apm/runtimes/` first; only fall back to PATH when nothing is found there.
2. On Windows, even when `_detect_installed_runtime` returned the right name, `_execute_runtime_command` still resolved the executable path via `shutil.which()`. If `~/.apm/runtimes` is not in the session PATH, the binary is not found. Fixed by checking the APM runtimes dir for the executable before calling `shutil.which()`.
3. `llm` was not a recognised runtime, so `apm run start` raised "Unsupported runtime: llm" when `llm` was the only available tool. Added `llm -m github/gpt-4o` as the generated command.

codex v0.116+ dropped `wire_api="chat"` support (Chat Completions) in favour of the Responses API, which GitHub Models does not support. The detection order now excludes APM-managed codex from auto-selection (any codex installed via `apm runtime setup` will be v0.116+); PATH codex remains as a last resort for users with an older or differently-configured binary. The setup-codex scripts are updated to document this incompatibility so users are not silently directed to a broken runtime.

Additional cleanup in script_runner.py:
- Remove the anchored regex (`^`) in `_get_runtime_name` and `_transform_command`; runtime keywords can appear mid-command when paths are prepended
- Remove symlink rejection in `_discover_prompt_file` and `_resolve_prompt_file`; the blanket block was too broad and broke legitimate use cases
- Improve the "prompt not found" error message with actionable next steps
- Fix 1: include `.cmd` and `.bat` in the Windows APM runtimes dir lookup so `llm.cmd` (installed by `setup-llm.ps1`) is found when `~/.apm/runtimes` is not in PATH
- Fix 2: the rebase already picked up the `rust-v0.118.0` pin from origin/main (microsoft#663); resolve the setup-codex conflicts in favour of main
- Fix 3: add `ensure_path_within()` checks in `_discover_prompt_file` and `_resolve_prompt_file` (PromptCompiler) to catch symlinks that resolve outside the project directory; also filter unsafe dependency matches from `rglob` results rather than silently including them
- Fix runtime mis-detection: reorder `_transform_runtime_command` to check copilot before codex, preventing `copilot --model codex ...` from being routed to the codex path; replace the substring match in `_detect_runtime` with a `Path.stem` comparison so hyphenated tool names (`run-codex-tool`) are not mistakenly identified as a known runtime
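The `Path.stem` comparison from the last fix can be sketched as follows. The function name, the `KNOWN_RUNTIMES` tuple, and the single-token parsing are illustrative assumptions, not the PR's exact implementation.

```python
from pathlib import Path
from typing import Optional

KNOWN_RUNTIMES = ("llm", "copilot", "codex")

def detect_runtime(command: str) -> Optional[str]:
    """Identify the runtime from a command's first token.

    Path.stem strips any directory prefix and extension, and the exact
    membership test replaces the old substring match, so a tool named
    'run-codex-tool' is no longer misdetected as codex.
    """
    first_token = command.split()[0]
    stem = Path(first_token).stem  # 'C:/x/copilot.exe' -> 'copilot'
    return stem if stem in KNOWN_RUNTIMES else None
```

The exact-match test also avoids the "copilot --model codex" pitfall, since only the first token is inspected.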
Force-pushed from f92015a to e49bce0
Thanks for the thorough review, @sergio-sisternes-epam! I've addressed all three must-fix items:

- `.cmd`/`.bat` in Windows resolution: added `.cmd` and `.bat` to the APM runtimes dir candidate list in `_execute_runtime_command`, so `llm.cmd` installed by `setup-llm.ps1` is found correctly when `~/.apm/runtimes` isn't in PATH.
- Stale codex pin: rebased onto main to pick up the `rust-v0.118.0` pin from #663. Resolved the conflict in favour of main's version and comment.
- Symlink containment: added `ensure_path_within()` checks in both `_discover_prompt_file` and `_resolve_prompt_file` (PromptCompiler). Local paths are validated before returning; dependency `rglob` results are filtered, with unsafe paths silently skipped rather than included.

Also replaced the non-ASCII em dashes in comments and docstrings with `--` to comply with the printable-ASCII requirement.

On the scope feedback: you're right, this PR mixes a bug fix, a new feature, and a security policy change, and I should have split them. I'll keep that in mind for future contributions and aim for one concern per PR going forward. Sorry about that!
Summary

Fixes `apm run start` selecting the wrong runtime or failing to execute one on Windows, and adds `llm` as a supported runtime for users who cannot use codex with GitHub Models. Closes #605.
Background: codex v0.116+ and GitHub Models

codex removed `wire_api = "chat"` (Chat Completions) in v0.116 and now only supports `wire_api = "responses"` (OpenAI Responses API). GitHub Models exposes only the Chat Completions endpoint; the `/responses` path returns 404. This is a fundamental upstream incompatibility with no config workaround. `apm runtime setup codex` always installs the latest release, which is now in the incompatible range. The setup scripts are updated to document this so users are not silently directed to a broken binary.

Changes
`src/apm_cli/core/script_runner.py`

Runtime detection (`_detect_installed_runtime`)
- Check `~/.apm/runtimes/` before PATH. A system-level `copilot` shim (e.g. from `gh extension install github/gh-copilot`) was winning over an APM-managed binary because `shutil.which()` searched PATH first.
- Exclude APM-managed `codex` from auto-detection. Any codex installed via `apm runtime setup` will be v0.116+ and will not work with GitHub Models. PATH codex is still tried as a last resort (covers older or differently-configured binaries).
- Prefer `llm` over other PATH runtimes. `llm` uses Chat Completions and is fully compatible with GitHub Models via the `GITHUB_TOKEN` env var.

Executable resolution (`_execute_runtime_command`)
- Resolve the executable from `~/.apm/runtimes/` before falling back to `shutil.which()`. Without this, the correct runtime name was detected but the executable could not be found if `~/.apm/runtimes` was not in the session PATH.

`llm` support (`_generate_runtime_command`)
- Add `llm -m github/gpt-4o <prompt_file>` as the generated command for the `llm` runtime.

Additional cleanup
- Remove the anchored `^` regex in `_get_runtime_name` and `_transform_command`; runtime keywords can appear mid-string when an absolute path is prepended on Windows.
- Remove symlink rejection in `_discover_prompt_file` and `_resolve_prompt_file`; it was too broad and blocked legitimate use cases (e.g. symlinked prompt packages).

`scripts/runtime/setup-codex.ps1` / `setup-codex.sh`
- Document the GitHub Models incompatibility and the default version change.

Test Plan
- `uv run pytest tests/unit/test_script_runner.py -x -v`: all tests pass
- `apm runtime setup codex` followed by `apm run start`: selects `llm` (or copilot) instead of the broken codex binary
- With `llm` available in PATH and no APM-managed runtimes, `apm run start`: `llm -m github/gpt-4o` is executed
- With a broken PATH `copilot` stub and a working APM-managed `copilot` binary, `apm run start`: the APM-managed binary wins
- Executable resolution verified when `~/.apm/runtimes` is not in PATH