Nexarag is an open-source platform for building knowledge graphs from research papers and querying them with AI, enabling transparent and reproducible literature analysis without the hallucinations of traditional RAG systems. Deploy locally with full privacy control or integrate with any LLM via the standardized Model Context Protocol (MCP).
We’d love your feedback on Nexarag, including bug reports, feature requests, documentation fixes, and ideas for tutorials or training materials.
- Bugs and feature requests: open an issue on GitHub: https://github.com/KevinMoonLab/Nexarag/issues
Please include:
- what you expected vs. what happened
- steps to reproduce (or a minimal example)
- OS + Docker version (and GPU details if relevant)
- relevant logs or screenshots
- Private or security-sensitive reports: email us at nexarag.ai@gmail.com.
If possible, include the same details as above and note why it should not be posted publicly.
Prerequisites:
- Docker
- (Windows only) WSL2
- (macOS only) Ollama Desktop
- (Optional) Claude Desktop (for MCP)
Choose a Docker Compose file compatible with your OS and hardware:
[Optional]: Move the Docker compose file to a location on your drive, e.g. ~/Nexarag.
From the same directory as the downloaded docker-compose.yml, run:
docker compose up -d

Visit Nexarag in your browser at http://localhost:5000 (or port 5100 on macOS).
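To verify that the stack came up, you can list the compose services and probe the UI. A minimal sketch, assuming `curl` is installed and the default port from above:

```shell
# List the Nexarag services and their current status
docker compose ps

# Probe the web UI and print the HTTP status code
# (use port 5100 instead of 5000 on macOS; 000 means the server isn't reachable yet)
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:5000
```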
To support all internal features, Nexarag requires:
- An embedding model, such as nomic-embed-text:v1.5
- A language model, such as gemma3:1b
- An MCP-capable model, such as qwen3:8b
Browse the full library of Ollama models at https://ollama.com/library and choose any model from these families that your hardware supports; the defaults above run on most hardware.
Models can be pulled from the command line inside the Ollama Docker container:
docker exec -it nexarag.ollama /bin/bash
ollama pull nomic-embed-text:v1.5
ollama pull gemma3:1b
ollama pull qwen3:8b

Alternatively, pull the models directly from your command line:
ollama pull nomic-embed-text:v1.5
ollama pull gemma3:1b
ollama pull qwen3:8b

Add the following to your Claude Desktop config:
{
"mcpServers": {
"nexarag": {
"command": "npx",
"args": [
"-y",
"mcp-remote",
"http://localhost:9000/mcp"
],
"env": {
"MCP_TRANSPORT_STRATEGY": "http-only"
}
}
}
}

Alternatively, to connect with a local Ollama model via ollmcp, first install pipx, then run:
pipx install ollmcp

To start the MCP client:
ollmcp -u http://localhost:9000/mcp -m qwen3:8b

Please note that Nexarag is rate-limited by the Semantic Scholar API, so enriching BibTeX uploads with metadata and updating the graph after adding papers from a Semantic Scholar search may take several minutes to complete.
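If Claude Desktop or ollmcp cannot connect, it can help to first confirm that the MCP endpoint is reachable. A minimal sketch with `curl`, using the port from the examples above:

```shell
# Print the HTTP status code returned by the MCP endpoint
# (000 means no connection; any HTTP code means the server is up)
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:9000/mcp
```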
