code-offline lets you run an AI coding assistant entirely on your computer, with no internet connection or external services required. It uses local container technology to run private AI models on your CPU or NVIDIA GPU.
This means your code and data stay private on your machine. The AI models come from pi and llama.cpp, well-known tools for running language models offline, so you get intelligent coding help without sending anything through online APIs.
The design focuses on easy setup and on using your existing Windows computer. You do not need programming skills or special accounts to start using code-offline.
Make sure your PC meets these requirements before installing code-offline:
- Operating system: Windows 10 or higher
- Processor: Intel or AMD CPU, 4 or more cores for better speed
- RAM: At least 8 GB recommended
- GPU: Optional, NVIDIA GPU if you want faster AI processing
- Disk space: Minimum 2 GB free for installation and models
- Docker Desktop: Required for container support (installation guide below)
You can use code-offline on CPU only, but performance improves with an NVIDIA GPU.
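The requirements above can be sanity-checked before installing. The script below is an illustrative sketch (not part of code-offline) that you can run from Git Bash or WSL; it checks the 2 GB free-disk minimum and whether Docker is already on your PATH:

```shell
# Pre-flight check (illustrative sketch, run from Git Bash or WSL).

free_kb=$(df -k . | awk 'NR==2 {print $4}')   # free space in KB on the current drive
need_kb=$((2 * 1024 * 1024))                  # 2 GB minimum from the requirements list

if [ "$free_kb" -ge "$need_kb" ]; then
  disk_ok=yes
else
  disk_ok=no
fi

if command -v docker >/dev/null 2>&1; then
  docker_ok=yes
else
  docker_ok=no
fi

echo "2 GB free disk: $disk_ok"
echo "Docker on PATH: $docker_ok"
```

If either answer is "no", fix that item first using the sections below.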
To get the latest version of code-offline, visit the official release page: https://github.com/VicNa559/code-offline/releases
Follow these steps:
- Open the link above in your web browser.
- Scroll to the latest release.
- Find the Windows installer file. Look for a `.exe` or `.zip` package.
- Click the file name to download it.
- Save it in a folder you can easily access, like your Desktop or Downloads.
Downloading from the official GitHub releases page ensures you get the latest stable software and updates.
After downloading, follow these instructions:
code-offline uses containers, so Docker Desktop must be installed. Here’s how:
- Go to https://www.docker.com/products/docker-desktop
- Download the Windows version.
- Run the installer and follow the prompts.
- After installation, restart your computer if requested.
- Open Docker Desktop and let it finish setting up.
Docker allows code-offline to run the AI models safely and efficiently without complicated setup.
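After Docker Desktop finishes setting up, you can confirm it is reachable from a terminal. This is a quick sketch (run it from Git Bash or WSL); the exact version string you see will differ:

```shell
# Sanity check after installing Docker Desktop.
if command -v docker >/dev/null 2>&1; then
  docker --version            # prints the installed Docker version
  result="docker-found"
else
  result="docker-missing"
  echo "Docker is not on PATH yet - finish the Docker Desktop install and restart."
fi
echo "$result"
```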
If you downloaded an installer (.exe):
- Double-click the installer file.
- Follow the on-screen instructions.
- Accept the license terms.
- Choose the install location or use the default path.
- Wait for the process to complete.
- When done, click Finish.
If you downloaded a zip file:
- Extract the contents into a folder.
- Find the executable file (`code-offline.exe`) inside the extracted folder.
- Double-click to run the program.
- Open the installed app by clicking the shortcut created during installation or running the executable.
- The first time you start it, the app will prepare the necessary AI models inside containers.
- This may take several minutes, depending on your system.
- After setup finishes, a main window will open, showing the AI coding assistant interface.
Once it runs, code-offline works offline on your machine. Here is how to begin using it:
- Type your coding questions or instructions in the chat window.
- The AI will respond based on the local models installed.
- You can ask for code snippets, explanations, and suggestions related to your projects.
- The interface supports common programming languages and workflows.
The AI runs inside a lightweight container, so it will not affect other programs on your PC.
You can control performance based on your system:
- Choose CPU mode if you do not have an NVIDIA GPU.
- Enable GPU processing in settings if you have a compatible NVIDIA graphics card.
- Adjust memory and container resources within Docker Desktop if needed.
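Before enabling GPU processing in settings, you can check whether an NVIDIA driver is actually installed. `nvidia-smi` ships with the NVIDIA driver; this check is a sketch to guide your choice, not part of code-offline:

```shell
# Pick CPU or GPU mode based on whether the NVIDIA driver is present.
if command -v nvidia-smi >/dev/null 2>&1; then
  mode="gpu"
  nvidia-smi --query-gpu=name,memory.total --format=csv,noheader  # lists each GPU and its memory
else
  mode="cpu"
  echo "No NVIDIA driver detected - keep CPU mode selected in settings."
fi
echo "Suggested mode: $mode"
```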
To get updates:
- Visit https://github.com/VicNa559/code-offline/releases regularly.
- Download new installer versions for stability and new features.
- Follow the install steps above to apply updates.
You do not lose your settings or data when upgrading.
If you encounter issues:
- Make sure Docker Desktop is running.
- Restart your PC if containers do not start properly.
- Check that system requirements are met.
- Review Docker’s resource allocation to ensure enough CPU and RAM.
- Consult the GitHub Issues page for known problems or add a new issue if you find bugs.
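The first two troubleshooting steps can be checked from a terminal. `docker info` talks to the Docker daemon, so it distinguishes "Docker is installed" from "Docker Desktop is actually running" (illustrative sketch, run from Git Bash or WSL):

```shell
# Confirm the Docker daemon is running, not just installed.
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  daemon="running"
else
  daemon="not-running"
  echo "Docker daemon unreachable - start Docker Desktop, then retry."
fi
echo "Daemon status: $daemon"
```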
- The GitHub page contains detailed information on supported models and configuration options.
- Instructions for advanced users to add custom models or scripts are available on the official repo.
- Community forums and user guides can be explored for further assistance.
Download code-offline and run your AI coding assistant without worrying about internet access or privacy risks.