What Changed
- Added a diff-aware pre-analysis step that extracts per-file stats, rename/add/delete info, likely scope/type hints, and “technical anchors” (identifiers, config keys, dependency names), then feeds the model DIFF_SUMMARY + RAW_DIFF instead of the raw diff alone: msg-generator.ts:L1-L214
- Upgraded the base instructions so that the subject must be concrete and technical (it has to mention at least one real artifact) and is no longer forced to lowercase (OAuth/HTTP/JSON casing, identifiers, etc. are preserved): langInstruction.ts
- Unified output cleanup across all generators (ChatGPT/Gemini/Ollama/LMStudio/Smithery/Custom) using the same post-processor, and added a fallback when the model returns generic “classic” subjects: msg-generator.ts:L191-L229
- Fixed Issue #70
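The pre-analysis step above can be sketched roughly as follows. This is a minimal illustration, not the extension's actual code: `FileStat`, `summarizeDiff`, and `extractAnchors` are hypothetical names, and the real implementation in msg-generator.ts also derives scope/type hints.

```typescript
// Hypothetical sketch of a diff-aware pre-analysis pass over a unified diff.
interface FileStat {
  path: string;
  additions: number;
  deletions: number;
  status: "added" | "deleted" | "renamed" | "modified";
}

// Collect per-file stats plus rename/add/delete info.
function summarizeDiff(diff: string): FileStat[] {
  const stats: FileStat[] = [];
  let current: FileStat | null = null;
  for (const line of diff.split("\n")) {
    if (line.startsWith("diff --git")) {
      const path = line.split(" b/")[1] ?? "unknown";
      current = { path, additions: 0, deletions: 0, status: "modified" };
      stats.push(current);
    } else if (!current) {
      continue;
    } else if (line.startsWith("new file mode")) {
      current.status = "added";
    } else if (line.startsWith("deleted file mode")) {
      current.status = "deleted";
    } else if (line.startsWith("rename to")) {
      current.status = "renamed";
    } else if (line.startsWith("+") && !line.startsWith("+++")) {
      current.additions++;
    } else if (line.startsWith("-") && !line.startsWith("---")) {
      current.deletions++;
    }
  }
  return stats;
}

// Pull "technical anchors" (identifiers, config keys, dependency names)
// out of added lines so the prompt can demand a concrete subject.
function extractAnchors(diff: string): string[] {
  const anchors = new Set<string>();
  for (const line of diff.split("\n")) {
    if (!line.startsWith("+") || line.startsWith("+++")) continue;
    for (const m of line.matchAll(/[A-Za-z_][A-Za-z0-9_.-]{3,}/g)) {
      anchors.add(m[0]);
    }
  }
  return [...anchors];
}
```

The resulting stats and anchors would then be rendered into the DIFF_SUMMARY block that precedes RAW_DIFF in the prompt.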
Better “Results” Handling
- Fixed multi-result selection so the chosen quick-pick result is actually written to the commit message file (previously selection didn’t affect output): generate-ai-commit.ts
- Enabled true multi-result output for the ChatGPT generator when useMultipleResults is enabled (returns a de-duplicated array): chatgpt-msg-generator.ts
- Updated the flow to accept string | string[] from generators and to write the selected message: generate-completion-flow.ts
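The `string | string[]` handling and de-duplication could look something like this. A minimal sketch under assumed names (`normalizeResults` is hypothetical, not the actual helper in generate-completion-flow.ts):

```typescript
// Hypothetical sketch: accept `string | string[]` from a generator and
// de-duplicate candidates before offering them in a quick pick.
function normalizeResults(result: string | string[]): string[] {
  const list = Array.isArray(result) ? result : [result];
  const seen = new Set<string>();
  const unique: string[] = [];
  for (const raw of list) {
    const msg = raw.trim();
    if (msg.length === 0 || seen.has(msg)) continue; // drop blanks and repeats
    seen.add(msg);
    unique.push(msg);
  }
  return unique;
}
```

With a single candidate the quick pick is skipped; with several, the flow shows the de-duplicated list and writes the entry the user selects to the commit message file.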
Fixed
- Removed the OpenAI SDK import entirely and switched the ChatGPT generator to call the OpenAI Chat Completions REST endpoint via node-fetch (the same approach the other generators use).
- This avoids SDK export mismatches and bundles cleanly with esbuild.
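In outline, the SDK-free call looks like the sketch below. This is an assumption-laden illustration, not the generator's actual code: the model name and prompt wiring are placeholders, error handling is trimmed, and `fetch` stands in for node-fetch (both expose the same interface).

```typescript
// Hypothetical sketch of calling the Chat Completions REST endpoint
// directly, without the OpenAI SDK.
interface ChatMessage {
  role: "system" | "user";
  content: string;
}

// Build the POST request options for the Chat Completions endpoint.
function buildChatRequest(
  apiKey: string,
  model: string,
  messages: ChatMessage[],
  n = 1
) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages, n }),
  };
}

// Request `n` candidate commit messages in one call.
async function generateCommitMessages(
  apiKey: string,
  prompt: string
): Promise<string[]> {
  const res = await fetch(
    "https://api.openai.com/v1/chat/completions",
    buildChatRequest(apiKey, "gpt-4o-mini", [{ role: "user", content: prompt }], 3)
  );
  if (!res.ok) throw new Error(`OpenAI request failed: ${res.status}`);
  const data = (await res.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices.map((c) => c.message.content.trim());
}
```

Because the only dependency is a fetch implementation, esbuild can bundle this without tripping over SDK export shapes.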
Others
- chore(deps): bump execa from 9.6.0 to 9.6.1 by @dependabot[bot] in #71
- chore(deps-dev): bump @types/node from 22.18.1 to 25.0.9 by @dependabot[bot] in #72
- chore(deps-dev): bump @vscode/vsce from 3.6.2 to 3.7.1 by @dependabot[bot] in #73
- chore(deps-dev): bump @types/vscode from 1.102.0 to 1.108.1 by @dependabot[bot] in #74
- chore(deps): bump openai from 3.3.0 to 6.16.0 by @dependabot[bot] in #75
Full Changelog: v2.1.1...v2.1.2