🤖 Powered by OSS-20B | 🔒 100% Offline & Private | 💻 VSCode Ready | ⚡ One-Command Setup
MCPlease is an offline AI coding assistant that runs entirely on your machine. It provides intelligent code completion, explanations, and debugging help using a locally hosted language model, so your code never leaves your machine.
**Before MCPlease:** multiple setup steps, manual configuration, restarting processes

**With MCPlease:** download the model once, then run `./start.sh` for instant AI coding
Built for AI-native builders who want to keep building without cloud dependencies or credit systems.
- 🔒 **Complete Privacy**: All AI processing happens locally; your code never leaves your machine
- ⚡ **Simple Setup**: Download the model once, then run `./start.sh` for instant AI coding
- 🧠 **Professional AI**: Full OSS-20B model for production-quality coding assistance
- 💻 **Cross-Platform**: Works on macOS, Linux, and Windows
- 🔌 **VSCode Ready**: Seamless integration with the Continue.dev extension
- 🚀 **AI-Native**: Built for developers who code with AI
MCPlease needs the OSS-20B model to provide AI coding assistance, so the model download is required.

- Download the model: `python download_model.py` (required)
- Start the system: `./start.sh`
- Wait for VSCode to open
- Press `Ctrl+I` in VSCode to open Continue.dev
- Start coding with AI assistance!
Note: The fallback responses are just for testing - real AI coding assistance requires the full model.
- Python 3.9+
- VSCode with Continue.dev extension
- Any available port (automatically finds one)
- 15GB+ free disk space (for OSS-20B model)
- Stable internet connection (for model download)
- Patience (10-30 minutes download time)
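A quick way to sanity-check the Python version and free-disk-space requirements above, sketched with the standard library (`check_prerequisites` is an illustrative helper, not part of MCPlease):

```python
import shutil
import sys

def check_prerequisites(min_free_gb: float = 15.0) -> dict:
    """Report whether this machine meets the requirements listed above."""
    free_bytes = shutil.disk_usage(".").free
    return {
        "python_ok": sys.version_info >= (3, 9),         # Python 3.9+
        "disk_ok": free_bytes >= min_free_gb * 1024**3,  # 15GB+ free
        "free_gb": round(free_bytes / 1024**3, 1),
    }

if __name__ == "__main__":
    print(check_prerequisites())
```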
- Full OSS-20B model for AI coding assistance
- Offline AI coding - no cloud dependencies
- Professional-grade AI responses for building
- Continue.dev integration in VSCode
Automatic! No manual configuration needed.
- Download the model: `python download_model.py` (required first)
- Run `./start.sh`; this configures everything automatically
- VSCode opens with Continue.dev already configured
- Press `Ctrl+I` to start using AI assistance
- Start coding immediately!
The setup script handles all configuration automatically.
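For reference, the Continue.dev model entry that the setup script generates looks roughly like the sketch below. This is illustrative only: the exact keys depend on your Continue.dev version, and the model name and port shown here are assumptions.

```json
{
  "models": [
    {
      "title": "MCPlease (local OSS-20B)",
      "provider": "openai",
      "model": "oss-20b",
      "apiBase": "http://localhost:8000"
    }
  ]
}
```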
```python
def fibonacci(n):
    # AI will complete this function
```

Select any code and ask: "What does this function do?"
Paste error messages and get intelligent debugging suggestions.
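For instance, the `fibonacci` stub above might be completed along these lines (an illustrative completion, not guaranteed model output):

```python
def fibonacci(n):
    # One plausible AI completion: iterative, with fibonacci(0) == 0
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fibonacci(10))  # → 55
```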
If you prefer manual setup:
```shell
# Clone repository
git clone https://github.com/your-org/mcplease.git
cd mcplease

# Start server manually
python mcplease_http_server.py

# Configure VSCode manually
# (See .vscode/settings.json for configuration)
```

- Response Time: Instant with fallback responses
- Memory Usage: Minimal (fallback mode)
- Setup Time: ~10 seconds total
- Context Length: Unlimited (fallback mode)
```
┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│    VSCode IDE    │────▶│   HTTP Server    │────▶│   AI Responses   │
│  (Continue.dev)  │     │   (Port 8000)    │     │    (Fallback)    │
└──────────────────┘     └──────────────────┘     └──────────────────┘
                                  │
                                  ▼
                         ┌──────────────────┐
                         │  VSCode Config   │
                         │   (Auto-setup)   │
                         └──────────────────┘
```
```shell
python download_model.py
```

- Downloads the full model (~13GB)
- Automatically checks disk space
- Resumes interrupted downloads
- Required for AI coding assistance
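The resume behavior can be sketched with an HTTP `Range` header: if a partial file is already on disk, request only the bytes after its current size. This is a simplified illustration of the idea, not the actual logic in `download_model.py`, and the URL below is a placeholder:

```python
import os
import urllib.request

def build_resume_request(url: str, dest: str) -> urllib.request.Request:
    """Request the remainder of a file, resuming from the bytes already saved."""
    req = urllib.request.Request(url)
    if os.path.exists(dest):
        # Ask the server to skip the bytes we already have
        req.add_header("Range", f"bytes={os.path.getsize(dest)}-")
    return req
```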
```shell
ls -la models/gpt-oss-20b/
```

- Verify model files are present
- Check total size (~13GB)
```shell
rm -rf models/gpt-oss-20b/
```

- Frees up ~13GB of disk space
- Breaks AI functionality: only use this if you know what you're doing
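To double-check the ~13GB figure programmatically, the directory size can be summed with a short sketch (`dir_size_gb` is an illustrative helper; the path comes from the commands above):

```python
import os

def dir_size_gb(path: str) -> float:
    """Total size of all files under path, in GB."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total / 1024**3

# Example: print(round(dir_size_gb("models/gpt-oss-20b/"), 1))
```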
```shell
# The system automatically finds an available port
# If you need a specific port, set the PORT environment variable:
PORT=9000 ./start.sh
PORT=9000 python mcplease_http_server.py

# Edit .vscode/settings.json for custom settings
```

```
mcplease/
├── start.sh                  # One-command setup script
├── mcplease_http_server.py   # HTTP server for Continue.dev
├── .vscode/settings.json     # VSCode configuration
├── tests/                    # Test suite
└── examples/                 # Usage examples
```
```shell
python -m pytest tests/ -v
python test_server.py
```

We welcome contributions!
- Fork the repository
- Create a feature branch: `git checkout -b feature-name`
- Make your changes and add tests
- Submit a pull request
This project is licensed under the MIT License.
- Slow download: Normal for 13GB file, be patient
- Download interrupted: Just run `python download_model.py` again; it will resume
- Not enough disk space: Free up at least 15GB before downloading
- Network errors: Check your internet connection and try again
- Port conflicts: Automatically resolved - the system finds an available port
- Continue.dev not working: Restart VSCode after running `./start.sh`
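The automatic port selection mentioned above can be sketched with Python's standard library: binding a socket to port 0 asks the OS for any unused port. This illustrates the technique, not the code the server actually runs:

```python
import socket

def find_free_port() -> int:
    """Bind to port 0 so the OS assigns an unused TCP port, then return it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]
```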
- Documentation: Check our documentation files
- Issues: Report bugs on GitHub Issues
- Questions: Open a GitHub Discussion
- Continue.dev for the excellent VSCode integration
- FastAPI for the high-performance HTTP server
- VSCode for the extensible IDE platform