
fix(python): model aliases in API + graceful port conflict handling #76

Merged
unamedkr merged 1 commit into main from fix/cli-aliases-port-check
Apr 12, 2026

Conversation

@unamedkr
Collaborator

Summary

  • Model aliases (llama3.2:1b, smollm2, phi3.5:mini) now work in Python API
  • Port conflict shows friendly error instead of SIGABRT crash

Test plan

  • Model.from_pretrained("llama3.2:1b") resolves to Llama-3.2-1B
  • Model.from_pretrained("phi3.5:mini") resolves to Phi-3.5-mini
  • quantcpp serve ... --port 8080 on busy port → "port 8080 is already in use"
  • Existing behavior unchanged for full names ("Phi-3.5-mini")

Fixes #74, #75

🤖 Generated with Claude Code

1. Model aliases in Python API (#74):
   - `Model.from_pretrained("llama3.2:1b")` now works (was: only full names)
   - Added `_MODEL_ALIASES` dict and `_resolve_model_name()` in `__init__.py`
   - Same aliases as CLI: smollm2, llama3.2:1b, phi3.5:mini, etc.

2. Port conflict handling (#75):
   - `quantcpp serve` now checks if port is in use before launching
   - Shows: "error: port 8080 is already in use. Try --port 8081"
   - Previously: SIGABRT crash with no error message
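A pre-launch check like the one in point 2 can be approximated with a trial socket bind; this is a minimal sketch assuming the server listens on a local TCP port, not quantcpp's actual implementation.

```python
# Sketch: probe the port before launching the server so a conflict produces
# a friendly error instead of a crash deep inside the server startup.
import socket
import sys


def check_port_free(port: int, host: str = "127.0.0.1") -> None:
    """Exit with a readable error if `port` already has an active listener.

    SO_REUSEADDR lets the probe succeed when the port is only lingering in
    TIME_WAIT, so a just-stopped server does not trigger a false positive.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
        except OSError:
            sys.exit(f"error: port {port} is already in use. Try --port {port + 1}")
```

Probing with a throwaway bind is inherently racy (another process could grab the port between the check and the real bind), so the launch path should still handle a late bind failure; the check just covers the common case with a clear message.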

Fixes #74, #75

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@unamedkr unamedkr merged commit a8528f9 into main Apr 12, 2026
2 of 3 checks passed


Development

Successfully merging this pull request may close these issues.

Python API Model.from_pretrained() does not accept CLI aliases
