All notable changes to QuantCoder CLI will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- Default evolution generations reduced from 10 to 3 (more practical for real-world use)
- Backtest results now display 6 structured metrics: Sharpe Ratio, Total Return, CAGR, Max Drawdown, Win Rate, Total Trades
- `evolve show` elite pool display includes CAGR, Win Rate, and Trades
- Evolution engine final results log includes all 6 metrics
- `--push-to-qc` flag on `evolve start` — automatically creates a QuantConnect project with the best evolved variant, uploads the code, and compiles it
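The six metrics above can all be derived from an equity curve plus a trade count; the following is a minimal sketch (the function name and signature are illustrative, not QuantCoder's actual API):

```python
import math

def backtest_metrics(equity, trades_won, trades_total, periods_per_year=252):
    """Compute the six summary stats from a list of portfolio values (one per period)."""
    returns = [equity[i] / equity[i - 1] - 1 for i in range(1, len(equity))]
    total_return = equity[-1] / equity[0] - 1
    years = len(returns) / periods_per_year
    cagr = (equity[-1] / equity[0]) ** (1 / years) - 1
    mean = sum(returns) / len(returns)
    var = sum((r - mean) ** 2 for r in returns) / len(returns)
    # Annualized Sharpe with a zero risk-free rate, for simplicity
    sharpe = (mean / math.sqrt(var)) * math.sqrt(periods_per_year) if var else 0.0
    # Max drawdown: worst peak-to-trough decline of the running maximum
    peak, max_dd = equity[0], 0.0
    for v in equity:
        peak = max(peak, v)
        max_dd = max(max_dd, 1 - v / peak)
    return {
        "sharpe": sharpe,
        "total_return": total_return,
        "cagr": cagr,
        "max_drawdown": max_dd,
        "win_rate": trades_won / trades_total if trades_total else 0.0,
        "total_trades": trades_total,
    }
```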
- Cloud LLM providers removed — Anthropic, OpenAI, Mistral (cloud), and DeepSeek providers have been deleted. QuantCoder now runs exclusively on local models via Ollama.
- No API keys required — `load_api_key()` and `save_api_key()` are now no-ops. The CLI no longer prompts for API keys on startup.
- ModelConfig simplified — Removed the `coordinator_provider`, `code_provider` (duplicate), `risk_provider`, `summary_provider`, `summary_model`, and `ollama_model` fields. New fields: `code_model`, `reasoning_model`, `ollama_timeout`.
- LLMFactory API changed — Now uses task-based routing: `LLMFactory.create(task="coding")` instead of `LLMFactory.create("anthropic", api_key="...")`.
- Dependencies removed — The `openai`, `anthropic`, and `mistralai` packages are no longer required.
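A task-based factory along these lines maps each task to its configured model. This is a sketch only: the routing table and the `OllamaProvider` fields shown here are assumptions based on the changelog entries, not QuantCoder's actual internals.

```python
# Illustrative task-to-model routing; model names mirror the changelog.
TASK_MODELS = {
    "coding": "qwen2.5-coder:14b",   # generation, refinement, error fixing
    "reasoning": "mistral",          # summarization, chat
}

class OllamaProvider:
    """Minimal stand-in for a local-inference provider."""
    def __init__(self, model, base_url="http://localhost:11434", timeout=600):
        self.model = model
        self.base_url = base_url
        self.timeout = timeout

def create(task="coding"):
    """Return a provider wired to the model configured for this task."""
    try:
        return OllamaProvider(TASK_MODELS[task])
    except KeyError:
        raise ValueError(f"unknown task: {task!r}")
```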
- Ollama-only local inference — All LLM calls route through Ollama:
  - `qwen2.5-coder:14b` for code generation, refinement, and error fixing
  - `mistral` for reasoning, summarization, and chat
- Task-based model routing — `LLMFactory.create(task=...)` automatically selects the right model
- OllamaProvider enhancements — `check_health()`, `list_models()`, configurable timeout (default 600s)
- Backwards-compatible config loading — Old config files with unknown fields are handled gracefully; a `/v1` suffix is stripped from `ollama_base_url`
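The base-URL normalization and health check could look like the sketch below. `/api/tags` is Ollama's real model-listing endpoint; the function names and return shapes are illustrative rather than QuantCoder's actual implementation.

```python
import json
import urllib.request

def normalize_base_url(url: str) -> str:
    """Strip a trailing /v1 (OpenAI-compat path) so native Ollama routes work."""
    url = url.rstrip("/")
    if url.endswith("/v1"):
        url = url[: -len("/v1")]
    return url

def check_health(base_url="http://localhost:11434", timeout=5):
    """Return the locally installed model names, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except OSError:
        return None
```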
- Multi-Agent Architecture: Specialized agents for algorithm generation
- Autonomous Pipeline: Self-improving strategy generation with learning database
- Library Builder: Batch strategy generation across 13+ categories
- AlphaEvolve Evolution: LLM-driven structural variation of algorithms
- Tool System: Pluggable architecture (search, download, summarize, generate, validate, backtest)
- Rich Terminal UI: Modern CLI with syntax highlighting, panels, progress indicators
- MCP Integration: QuantConnect Model Context Protocol for validation and backtesting
- Configuration System: TOML-based with dataclasses
- Package renamed from `quantcli` to `quantcoder`
- Tkinter GUI (replaced by Rich terminal in pre-release)
- Legacy OpenAI SDK v0.28 support
- All cloud LLM providers (Anthropic, OpenAI, Mistral cloud, DeepSeek)
- Optional dependency groups `[openai]`, `[anthropic]`, `[mistral]`, and `[all-llm]`
- LLM Client Abstraction (`llm_client.py`)
  - `LLMClient` class with modern OpenAI SDK v1.x+ support
  - `LLMResponse` dataclass for standardized responses
  - Token usage tracking
  - `simple_prompt()` convenience method
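A standardized response dataclass of the kind described might look like this; the field names are assumptions inferred from the bullets above (content, model, token usage), not the actual `LLMResponse` definition.

```python
from dataclasses import dataclass

@dataclass
class LLMResponse:
    """Uniform result returned by the client regardless of backend."""
    content: str
    model: str
    prompt_tokens: int = 0
    completion_tokens: int = 0

    @property
    def total_tokens(self) -> int:
        # Token usage tracking: combined prompt + completion count
        return self.prompt_tokens + self.completion_tokens
```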
- QuantConnect Static Validator (`qc_validator.py`)
  - `QuantConnectValidator` class for code analysis
  - Division by zero detection
  - Missing `.IsReady` indicator checks
  - `None` value risk detection in comparisons
  - `max()`/`min()` on potentially `None` values
  - Portfolio access pattern validation
  - Severity levels (error, warning, info)
  - Formatted report generation
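As a sketch of one such static check, division by a bare variable (which could be zero or `None` at runtime) can be flagged with Python's `ast` module. The rule logic and function name here are illustrative; the real validator's heuristics may differ.

```python
import ast

def find_risky_divisions(source: str):
    """Flag divisions whose denominator is a bare name (could be 0 or None)."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.BinOp) and isinstance(node.op, (ast.Div, ast.FloorDiv)):
            # A literal denominator like `x / 2` is safe; a name like
            # `pnl / trades` may be zero or None at runtime.
            if isinstance(node.right, ast.Name):
                findings.append((node.lineno, f"unguarded division by '{node.right.id}'"))
    return findings
```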
- Unit Tests (`tests/test_llm_client.py`)
  - `LLMClient` initialization tests
  - Chat completion tests
  - Error handling tests
- Documentation
  - `TESTING_GUIDE.md` - Comprehensive testing documentation
  - `MAIN_VS_BETA.md` - Branch comparison guide
  - `.env.example` - Environment variable template
- `processor.py`: Refactored to use `LLMClient` instead of direct OpenAI calls
- `processor.py`: Enhanced code generation prompts with defensive programming requirements
  - Added runtime safety check requirements
  - Added `IsReady` check reminders
  - Added `None` guard requirements
  - Added zero-division protection patterns
- `cli.py`: Added verbose flag handling improvements
- `setup.py`: Updated dependencies for OpenAI v1.x+
- `requirements.txt`: Added explicit dependency versions
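In QuantConnect-style code, the defensive patterns those prompts require look roughly like the sketch below. The function and the indicator/price parameters are placeholders for illustration, not generated output.

```python
def guarded_signal(sma, price):
    """Defensive-pattern sketch: IsReady gate, None guards, zero-division check."""
    # IsReady check: indicators must warm up before producing valid values
    if sma is None or not sma.IsReady:
        return None
    # None guard before any comparison or arithmetic
    if price is None:
        return None
    # Zero-division protection
    denom = sma.Current.Value
    if denom == 0:
        return None
    return price / denom - 1
```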
- Lazy loading for Tkinter imports (better startup performance)
- Improved error handling in PDF download
- Upgraded OpenAI SDK from v0.28 to v1.x+
- Added pytest for testing
- Article Search: CrossRef API integration
- Search by query keywords
- Configurable result count
- HTML export of results
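The search bullet corresponds to CrossRef's public `/works` endpoint; a minimal sketch follows (the function names and the `(title, DOI)` return shape are illustrative, not QuantCoder's actual code).

```python
import json
import urllib.parse
import urllib.request

def build_crossref_url(query: str, rows: int = 5) -> str:
    """Build a CrossRef /works query URL with a configurable result count."""
    params = urllib.parse.urlencode({"query": query, "rows": rows})
    return f"https://api.crossref.org/works?{params}"

def search_crossref(query: str, rows: int = 5):
    """Query CrossRef and return (title, DOI) pairs for the top matches."""
    with urllib.request.urlopen(build_crossref_url(query, rows), timeout=30) as resp:
        items = json.load(resp)["message"]["items"]
    return [(item.get("title", [""])[0], item.get("DOI", "")) for item in items]
```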
- PDF Download: Multiple download methods
- Direct URL download
- Unpaywall API fallback for open access
- Manual browser fallback
- NLP Processing: spaCy-based text analysis
- PDF text extraction (pdfplumber)
- Text preprocessing (URL removal, normalization)
- Heading detection (title-cased sentences)
- Section splitting
- Keyword analysis for trading signals and risk management
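The title-cased-sentence heuristic for heading detection can be sketched without spaCy; the thresholds below are illustrative assumptions, not the pipeline's actual parameters.

```python
def looks_like_heading(line: str, max_words: int = 8) -> bool:
    """Heuristic: a short line, no terminal punctuation, mostly capitalized words."""
    words = line.strip().split()
    if not words or len(words) > max_words:
        return False
    if line.rstrip().endswith((".", ",", ";")):
        return False
    capitalized = sum(1 for w in words if w[:1].isupper())
    return capitalized / len(words) >= 0.75
```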
- Code Generation: OpenAI GPT-4 integration
- Strategy summarization
- QuantConnect algorithm generation
- AST validation
- Iterative refinement (up to 6 attempts)
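The generate-validate-refine loop (up to 6 attempts) can be sketched as below; `generate` stands in for the LLM call, and the feedback format is an assumption for illustration.

```python
import ast

def refine_until_valid(generate, max_attempts: int = 6):
    """Call `generate(feedback)` until the code parses, or give up after max_attempts."""
    feedback = None
    for attempt in range(1, max_attempts + 1):
        code = generate(feedback)
        try:
            ast.parse(code)  # AST validation, as described above
            return code, attempt
        except SyntaxError as exc:
            # Feed the parse error back into the next generation attempt
            feedback = f"SyntaxError on line {exc.lineno}: {exc.msg}"
    return None, max_attempts
```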
- Tkinter GUI: Desktop interface
- Search panel with results table
- Summary display with copy/save
- Code display with syntax highlighting (Monokai theme)
- CLI Commands
  - `search <query>` - Search articles
  - `list` - Show cached results
  - `download <id>` - Download PDF
  - `summarize <id>` - Generate summary
  - `generate-code <id>` - Generate algorithm
  - `open-article <id>` - Open in browser
  - `interactive` - Launch GUI
- Python 3.8+
- OpenAI SDK v0.28 (legacy)
- pdfplumber 0.10+
- spaCy 3.x with en_core_web_sm
- Click 8.x
- python-dotenv
- Pygments
- InquirerPy
```
main ────●──────────────────────────────●──────────────▶
         │                              │
       v1.0                           v1.1
     (legacy)                    (LLM client +
                                   validator)
                                        ▲
                                        │
beta ───────────────────────────────────┘

develop ──────────────────────────────────────────────▶
           ▲                                    (v2.0)
           │
gamma ─────┘
```
- Update the OpenAI SDK (the version specifier is quoted so the shell does not treat `>=` as a redirect):

  ```bash
  pip uninstall openai
  pip install "openai>=1.0.0"
  ```

- Ensure the `OPENAI_API_KEY` environment variable is set
- No CLI command changes required
- Package renamed:

  ```bash
  pip uninstall quantcli
  pip install quantcoder
  ```

- CLI command prefix changes:

  ```bash
  # Old
  quantcli search "query"

  # New
  quantcoder search "query"
  ```

- New commands available:

  ```bash
  quantcoder auto start --query "..."
  quantcoder library build
  ```