Claude re-reads your code every session. Make it stop. Save 70%+ on tokens. Local MCP server with AST indexing, hybrid search, and cross-session memory.
Deep code indexing MCP server for AI agents. 25 tools: hybrid FTS5 + embedding search, call graphs, git blame/hotspots, build system analysis. Multi-repo workspaces, GPU-accelerated semantic search, 10 languages via tree-sitter. Fully local, zero cloud dependencies.
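A minimal sketch of the full-text half of the "hybrid FTS5 + embedding search" these servers describe, using Python's stdlib sqlite3. The table name, columns, and sample rows are illustrative, not taken from any listed project:

```python
import sqlite3

# In-memory FTS5 index over code chunks (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE code_chunks USING fts5(path, content)")
conn.executemany(
    "INSERT INTO code_chunks VALUES (?, ?)",
    [
        ("auth/login.py", "def authenticate(user, password): ..."),
        ("db/pool.py", "class ConnectionPool: ..."),
    ],
)
# BM25-ranked full-text query; a hybrid system would merge these hits
# with nearest-neighbour results from a separate embedding index.
rows = conn.execute(
    "SELECT path FROM code_chunks WHERE code_chunks MATCH ? ORDER BY rank",
    ("authenticate",),
).fetchall()
print(rows)  # [('auth/login.py',)]
```

The appeal of FTS5 here is that the keyword side of hybrid search needs no extra dependencies: it ships with SQLite, stays fully local, and ranks with BM25 out of the box.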
Converts source code into an LLM-ready knowledge base.
MCP Server for persistent code indexing. Gives AI assistants (Claude, Gemini, Copilot, Cursor) instant access to your codebase. 50x less context than grep.
A semantic code search tool for intelligent, cross-repo context retrieval.
Memory-aware context engine for AI coding agents — up to 91% fewer tokens, 17/18 rank-1 across 6 OSS projects. MCP-native, multi-repo, with persistent observations & decisions.
Multi-agent orchestration, persistent memory, and intelligent workflows for AI coding assistants. Supports Claude Code and OpenCode.
Give Claude Code a permanent memory — 100% local, zero config, graph-powered
Self-hosted MCP server for hybrid semantic code search and repository intelligence.
Enhanced fork of claude-context with stability fixes, improved sync, and better reliability for semantic code search
Python application that indexes code locally and runs a server over the indexed repos. Works with VoyageAI to power semantic search across large codebases, enabling AI-optimized code navigation. Supports FTS search and git log indexing, with experimental support for SCIP indexing.
Intelligent code indexing MCP server. 13 tools, 10 languages, hybrid search, call graphs, O(1) symbol retrieval.
Go-based MCP server for codebase indexing and semantic search (Augment-compatible)
Code indexing examples for converting source code into structured repository maps optimized for semantic search and LLM understanding.
Offline-first coding agent for local LLMs (LM Studio + MCP). Project-aware context, memory, and filesystem tooling for real coding workflows directly in your codebase.
Structured code retrieval for AI agents — index once with tree-sitter, query symbols precisely via MCP. Cut code-reading token costs by up to 99%.
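The "index once, query symbols precisely" idea above can be sketched in a few lines. The projects here use tree-sitter for multi-language parsing; Python's stdlib `ast` module stands in below to show the same symbol-table shape without a compiled grammar, and the sample source is made up:

```python
import ast

SOURCE = '''
class Indexer:
    def build(self): pass

def search(query): pass
'''

def index_symbols(source: str) -> dict[str, int]:
    """Map each function/class name to its line number (one indexing pass)."""
    symbols = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.ClassDef)):
            symbols[node.name] = node.lineno
    return symbols

# Agents then resolve a symbol to its location without re-reading the file.
print(index_symbols(SOURCE))
```

Serving lookups from a table like this, instead of streaming whole files into the context window, is where the large token savings claimed above come from.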
Local context cache for LLM agents. 100% offline, zero dependencies.
CLI-first local wrapper for codebase-memory-mcp with optional Codex skill integration.
An AI-powered system for intelligent code search that moves beyond keywords to semantic understanding. It offers multi-dimensional search across files, classes/interfaces, and methods, each with optimized AI-generated embeddings, returning precise, context-aware results for natural-language queries.
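The retrieval step behind that kind of embedding-based search reduces to ranking stored vectors by cosine similarity to a query vector. A minimal stdlib-only sketch; the symbol names and 3-dimensional vectors below are invented, where a real system would get high-dimensional vectors from an embedding model:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy embedding index: symbol name -> vector (made-up values).
index = {
    "parse_config": [0.9, 0.1, 0.0],
    "open_socket": [0.1, 0.8, 0.3],
}
query = [0.85, 0.2, 0.05]  # e.g. the embedding of "read settings file"

best = max(index, key=lambda name: cosine(index[name], query))
print(best)  # parse_config
```

At scale this brute-force `max` is replaced by an approximate-nearest-neighbour index, but the ranking criterion is the same.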
High-performance Rust MCP server that indexes codebases via vector + graph databases for AI coding agents