diff --git a/.gitignore b/.gitignore index cd800581..16c99a57 100644 --- a/.gitignore +++ b/.gitignore @@ -189,6 +189,9 @@ dmypy.json # Cython debug symbols cython_debug/ +# Codex +.codex/ + # PyCharm # JetBrains specific template is maintained in a separate JetBrains.gitignore that can # be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore diff --git a/README.md b/README.md index e05f8dfe..954bbf32 100644 --- a/README.md +++ b/README.md @@ -35,6 +35,7 @@ Perfect for building **RAG pipelines** with real-time retrieval, **AI agents** w | **[Vector Search](#retrieval)**
*Similarity search with metadata filters* | **[LLM Memory](#llm-memory)**
*Agentic AI context management* | **Async Support**
*Async indexing and search for improved performance* | | **[Complex Filtering](#retrieval)**
*Combine multiple filter types* | **[Semantic Routing](#semantic-routing)**
*Intelligent query classification* | **[Vectorizers](#vectorizers)**
*8+ embedding provider integrations* | | **[Hybrid Search](#retrieval)**
*Combine semantic & full-text signals* | **[Embedding Caching](#embedding-caching)**
*Cache embeddings for efficiency* | **[Rerankers](#rerankers)**
*Improve search result relevancy* | +| | | **[MCP Server](#mcp-server)**
*Expose an existing Redis index to MCP clients* | @@ -50,7 +51,16 @@ Install `redisvl` into your Python (>=3.9) environment using `pip`: pip install redisvl ``` +Install the MCP server extra when you want to expose an existing Redis index through MCP: + +```bash +pip install redisvl[mcp] +``` + +The `redisvl[mcp]` extra requires Python 3.10 or newer. + > For more detailed instructions, visit the [installation guide](https://docs.redisvl.com/en/latest/user_guide/installation.html). +> For MCP concepts and setup, see the [RedisVL MCP docs](https://docs.redisvl.com/en/latest/concepts/mcp.html) and the [MCP how-to guide](https://docs.redisvl.com/en/latest/user_guide/how_to_guides/mcp.html). ## Redis @@ -525,11 +535,45 @@ usage: rvl [] Commands: index Index manipulation (create, delete, etc.) + mcp Run the RedisVL MCP server version Obtain the version of RedisVL stats Obtain statistics about an index ``` -> Read more about [using the CLI](https://docs.redisvl.com/en/latest/overview/cli.html). +Run the MCP server over stdio with: + +```bash +uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml +``` + +Use `--read-only` to expose search without upsert. + +> Read more about [using the CLI](https://docs.redisvl.com/en/latest/overview/cli.html) and [running RedisVL MCP](https://docs.redisvl.com/en/latest/user_guide/how_to_guides/mcp.html). + +### MCP Server + +RedisVL includes an MCP server that lets MCP-compatible clients search or upsert data in an existing Redis index through a small, stable tool contract. 
+ +The server: + +- connects to one existing Redis Search index +- reconstructs the schema from Redis at startup +- uses the configured vectorizer for query embedding and optional upsert embedding +- exposes `search-records` and, unless read-only mode is enabled, `upsert-records` + +Run it over stdio with: + +```bash +uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml +``` + +Use `--read-only` when clients should only search: + +```bash +uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml --read-only +``` + +For configuration details, tool arguments, and examples, see the [RedisVL MCP docs](https://docs.redisvl.com/en/latest/concepts/mcp.html) and the [MCP how-to guide](https://docs.redisvl.com/en/latest/user_guide/how_to_guides/mcp.html). ## 🚀 Why RedisVL? @@ -542,6 +586,7 @@ Built on the [Redis Python](https://github.com/redis/redis-py/tree/master) clien For additional help, check out the following resources: - [Getting Started Guide](https://docs.redisvl.com/en/stable/user_guide/01_getting_started.html) +- [RedisVL MCP](https://docs.redisvl.com/en/latest/concepts/mcp.html) - [API Reference](https://docs.redisvl.com/en/stable/api/index.html) - [Redis AI Recipes](https://github.com/redis-developer/redis-ai-resources) diff --git a/docs/concepts/index.md b/docs/concepts/index.md index 0e522b1a..a68d0802 100644 --- a/docs/concepts/index.md +++ b/docs/concepts/index.md @@ -47,6 +47,13 @@ Vector, filter, text, hybrid, and multi-vector query options. Vectorizers for embeddings and rerankers for result optimization. ::: +:::{grid-item-card} 🧠 MCP +:link: mcp +:link-type: doc + +How RedisVL exposes an existing Redis index to MCP clients through a stable tool contract. 
+::: + :::{grid-item-card} 🧩 Extensions :link: extensions :link-type: doc @@ -65,5 +72,6 @@ search-and-indexing field-attributes queries utilities +mcp extensions ``` diff --git a/docs/concepts/mcp.md b/docs/concepts/mcp.md new file mode 100644 index 00000000..854b6a91 --- /dev/null +++ b/docs/concepts/mcp.md @@ -0,0 +1,102 @@ +--- +myst: + html_meta: + "description lang=en": | + RedisVL MCP concepts: how the RedisVL MCP server exposes an existing Redis index to MCP clients. +--- + +# RedisVL MCP + +RedisVL includes an MCP server that exposes a Redis-backed retrieval surface through a small, deterministic tool contract. It is designed for AI applications that want to search or maintain data in an existing Redis index without each client reimplementing Redis query logic. + +## What RedisVL MCP Does + +The RedisVL MCP server sits between an MCP client and Redis: + +1. It connects to an existing Redis Search index. +2. It inspects that index at startup and reconstructs its schema. +3. It instantiates the configured vectorizer for query embedding and optional upsert embedding. +4. It exposes stable MCP tools for search, and optionally upsert. + +This keeps the Redis index as the source of truth for search behavior while giving MCP clients a predictable interface. + +## How RedisVL MCP Runs + +RedisVL MCP works with a focused model: + +- One server process binds to exactly one existing Redis index. +- The server uses stdio transport. +- Search behavior is owned by configuration, not by MCP callers. +- The vectorizer is configured explicitly. +- Upsert is optional and can be disabled with read-only mode. 
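The registration rule implied by this model — search always available, upsert only outside read-only mode — can be sketched in plain Python. The names below are illustrative only, not the server's real classes:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class BindingSketch:
    """Toy model of the v1 run model: one process, one index, optional upsert."""

    redis_index: str  # the single existing Redis index this process binds to
    read_only: bool = False

    def registered_tools(self) -> list:
        # search-records is always registered; upsert-records is gated
        # by read-only mode.
        tools = ["search-records"]
        if not self.read_only:
            tools.append("upsert-records")
        return tools


assert BindingSketch("knowledge", read_only=True).registered_tools() == ["search-records"]
assert BindingSketch("knowledge").registered_tools() == ["search-records", "upsert-records"]
```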
+ +## Config-Owned Search Behavior + +MCP callers can control: + +- `query` +- `limit` +- `offset` +- `filter` +- `return_fields` + +MCP callers do not choose: + +- which index to target +- whether retrieval is `vector`, `fulltext`, or `hybrid` +- query tuning parameters such as hybrid fusion or vector runtime settings + +That behavior lives in the server config under `indexes..search`. The response includes `search_type` as informational metadata, but it is not a request parameter. + +## Single Index Binding + +The YAML config uses an `indexes` mapping with one configured entry. That binding points to an existing Redis index through `redis_name`, and every tool call targets that configured index. + +## Schema Inspection and Overrides + +RedisVL MCP is inspection-first: + +- the Redis index must already exist +- the server reconstructs the schema from Redis metadata at startup +- runtime field mappings remain explicit in config + +In some environments, Redis metadata can be incomplete for vector field attributes. When that happens, `schema_overrides` can patch missing attrs for fields that were already discovered. It does not create new fields or change discovered field identity. + +## Read-Only and Read-Write Modes + +RedisVL MCP always registers `search-records`. + +`upsert-records` is only registered when the server is not in read-only mode. Read-only mode is controlled by: + +- the CLI flag `--read-only` +- or the environment variable `REDISVL_MCP_READ_ONLY=true` + +Use read-only mode when Redis is serving approved content to assistants and another system owns ingestion. 
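A minimal sketch of how those two controls could combine. Assuming an explicitly passed CLI flag takes precedence over the environment variable — an inference from the CLI passing `read_only=None` when the flag is absent; the real resolution lives in `MCPSettings.from_env`:

```python
import os
from typing import Mapping, Optional


def resolve_read_only(
    cli_flag: Optional[bool], environ: Mapping[str, str] = os.environ
) -> bool:
    """Sketch of read-only resolution (illustrative, not the real settings code)."""
    if cli_flag is not None:
        # An explicit CLI choice wins (assumed precedence).
        return cli_flag
    # REDISVL_MCP_READ_ONLY=true disables upsert-records.
    return environ.get("REDISVL_MCP_READ_ONLY", "").strip().lower() == "true"


assert resolve_read_only(None, {"REDISVL_MCP_READ_ONLY": "true"}) is True
assert resolve_read_only(None, {}) is False
assert resolve_read_only(True, {}) is True
```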
+ +## Tool Surface + +RedisVL MCP exposes two tools: + +- `search-records` searches the configured index using the server-owned search mode +- `upsert-records` validates and upserts records, embedding them when needed + +These tools follow a stable contract: + +- request validation happens before query or write execution +- filters support either raw strings or a RedisVL-backed JSON DSL +- error codes are mapped into a stable set of MCP-facing categories + +## Why Use MCP Instead of Direct RedisVL Calls + +Use RedisVL MCP when you want a standard tool boundary for agent frameworks or assistants that already speak MCP. + +Use direct RedisVL client code when your application should own index lifecycle, search construction, data loading, or richer RedisVL features directly in Python. + +RedisVL MCP is a good fit when: + +- multiple assistants should share one approved retrieval surface +- you want search behavior fixed by deployment config +- you need a read-only or tightly controlled write boundary +- you want to reuse an existing Redis index without rebuilding retrieval logic in every client + +For setup steps, config, commands, and examples, see {doc}`/user_guide/how_to_guides/mcp`. 
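The "raw strings or JSON DSL" filter contract above can be illustrated with a toy translator. This is not the server's implementation — the real path builds RedisVL filter expressions and supports a richer operator set — but it shows how a DSL object and a raw filter string can denote the same Redis filter:

```python
def dsl_to_filter(node: dict) -> str:
    """Toy JSON-DSL-to-filter-string translator (illustrative subset only)."""
    if "and" in node:
        return "(" + " ".join(dsl_to_filter(child) for child in node["and"]) + ")"
    field, op, value = node["field"], node["op"], node["value"]
    if op == "eq":
        return f"@{field}:{{{value}}}"      # tag equality
    if op == "gte":
        return f"@{field}:[{value} +inf]"   # numeric lower bound
    raise ValueError(f"unsupported op in sketch: {op}")


dsl = {
    "and": [
        {"field": "category", "op": "eq", "value": "operations"},
        {"field": "rating", "op": "gte", "value": 4},
    ]
}
assert dsl_to_filter(dsl) == "(@category:{operations} @rating:[4 +inf])"
```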
diff --git a/docs/user_guide/how_to_guides/index.md b/docs/user_guide/how_to_guides/index.md index c03d705d..fd24fbfc 100644 --- a/docs/user_guide/how_to_guides/index.md +++ b/docs/user_guide/how_to_guides/index.md @@ -39,6 +39,7 @@ How-to guides are **task-oriented** recipes that help you accomplish specific go :::{grid-item-card} 💻 CLI Operations - [Manage Indices with the CLI](../cli.ipynb) -- create, inspect, and delete indices from your terminal +- [Run RedisVL MCP](mcp.md) -- expose an existing Redis index to MCP clients ::: :::: @@ -59,6 +60,7 @@ How-to guides are **task-oriented** recipes that help you accomplish specific go | Optimize index performance | [Optimize Indexes with SVS-VAMANA](../09_svs_vamana.ipynb) | | Decide on storage format | [Choose a Storage Type](../05_hash_vs_json.ipynb) | | Manage indices from terminal | [Manage Indices with the CLI](../cli.ipynb) | +| Expose an index through MCP | [Run RedisVL MCP](mcp.md) | ```{toctree} :hidden: @@ -74,4 +76,5 @@ Optimize Indexes with SVS-VAMANA <../09_svs_vamana> Cache Embeddings <../10_embeddings_cache> Use Advanced Query Types <../11_advanced_queries> Write SQL Queries for Redis <../12_sql_to_redis_queries> +Run RedisVL MCP ``` diff --git a/docs/user_guide/how_to_guides/mcp.md b/docs/user_guide/how_to_guides/mcp.md new file mode 100644 index 00000000..d5aef922 --- /dev/null +++ b/docs/user_guide/how_to_guides/mcp.md @@ -0,0 +1,402 @@ +--- +myst: + html_meta: + "description lang=en": | + How to run the RedisVL MCP server, configure it, and use its search and upsert tools. +--- + +# Run RedisVL MCP + +This guide shows how to run the RedisVL MCP server against an existing Redis index, configure its behavior, and use the MCP tools it exposes. + +For the higher-level design, see {doc}`/concepts/mcp`. 
+ +## Before You Start + +RedisVL MCP assumes all of the following are already true: + +- you have Python 3.10 or newer +- you have Redis with Search capabilities available +- the Redis index already exists +- you know which text field and vector field the server should use +- you have installed the vectorizer provider dependencies your config needs + +Install the MCP extra: + +```bash +pip install redisvl[mcp] +``` + +If your vectorizer needs a provider extra, install that too: + +```bash +pip install redisvl[mcp,openai] +``` + +## Start the Server + +Run the server over stdio: + +```bash +uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml +``` + +Run it in read-only mode to expose search without upsert: + +```bash +uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml --read-only +``` + +You can also control boot settings through environment variables: + +| Variable | Purpose | +|----------|---------| +| `REDISVL_MCP_CONFIG` | Path to the MCP YAML config | +| `REDISVL_MCP_READ_ONLY` | Disable `upsert-records` when set to `true` | +| `REDISVL_MCP_TOOL_SEARCH_DESCRIPTION` | Override the search tool description | +| `REDISVL_MCP_TOOL_UPSERT_DESCRIPTION` | Override the upsert tool description | + +## Example Config + +This example binds one logical MCP server to one existing Redis index called `knowledge`. + +The config uses `${REDIS_URL}` and `${OPENAI_API_KEY}` as environment-variable placeholders. These values are resolved when the server starts. You can also use `${VAR:-default}` to provide a fallback value. 
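The placeholder semantics can be sketched with the same pattern the loader in this diff compiles (`_ENV_PATTERN` in `redisvl/mcp/config.py`). Treating a missing required variable as a `KeyError` here is an assumption standing in for the loader's startup failure:

```python
import os
import re
from typing import Mapping

# Same pattern as _ENV_PATTERN in redisvl/mcp/config.py.
ENV_PATTERN = re.compile(r"\$\{([^}:]+)(?::-([^}]*))?\}")


def resolve_placeholders(value: str, environ: Mapping[str, str] = os.environ) -> str:
    """Expand ${VAR} and ${VAR:-default} in one config scalar (sketch)."""

    def _sub(match: re.Match) -> str:
        name, default = match.group(1), match.group(2)
        if name in environ:
            return environ[name]
        if default is not None:
            return default
        # Assumption: the real loader fails startup; KeyError stands in here.
        raise KeyError(f"missing required environment variable: {name}")

    return ENV_PATTERN.sub(_sub, value)


env = {"REDIS_URL": "redis://localhost:6379"}
assert resolve_placeholders("${REDIS_URL}", env) == "redis://localhost:6379"
assert resolve_placeholders("${REDIS_PORT:-6379}", env) == "6379"
```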
+ +```yaml +server: + redis_url: ${REDIS_URL} + +indexes: + knowledge: + redis_name: knowledge + + vectorizer: + class: OpenAITextVectorizer + model: text-embedding-3-small + api_config: + api_key: ${OPENAI_API_KEY} + + schema_overrides: + fields: + - name: embedding + type: vector + attrs: + dims: 1536 + datatype: float32 + + search: + type: hybrid + params: + text_scorer: BM25STD + stopwords: english + vector_search_method: KNN + combination_method: LINEAR + linear_text_weight: 0.3 + + runtime: + text_field_name: content + vector_field_name: embedding + default_embed_text_field: content + default_limit: 10 + max_limit: 25 + max_upsert_records: 64 + skip_embedding_if_present: true + startup_timeout_seconds: 30 + request_timeout_seconds: 60 + max_concurrency: 16 +``` + +### What This Config Means + +- `redis_name` must point to an index that already exists in Redis +- `search.type` fixes retrieval behavior for every MCP caller +- `runtime.text_field_name` tells full-text and hybrid search which field to search +- `runtime.vector_field_name` tells the server which vector field to use +- `runtime.default_embed_text_field` tells upsert which text field to embed when a record needs embedding +- `schema_overrides` is only for patching incomplete field attrs discovered from Redis + +## Tool Contracts + +RedisVL MCP exposes a small, implementation-owned contract. 
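One concrete slice of that contract is pagination validation against the runtime bounds (`default_limit` and `max_limit` from the example config above). A sketch, assuming over-limit requests are rejected rather than silently clamped — the real validator's choice may differ:

```python
from typing import Optional


def effective_limit(
    requested: Optional[int], default_limit: int = 10, max_limit: int = 25
) -> int:
    """Resolve the search limit under runtime bounds (illustrative sketch)."""
    if requested is None:
        return default_limit  # omitted limit falls back to config
    if requested <= 0:
        raise ValueError("limit must be greater than 0")
    if requested > max_limit:
        # Assumption: reject rather than clamp.
        raise ValueError(f"limit {requested} exceeds max_limit {max_limit}")
    return requested


assert effective_limit(None) == 10
assert effective_limit(20) == 20
```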
+ +### `search-records` + +Arguments: + +- `query` +- `limit` +- `offset` +- `filter` +- `return_fields` + +Example request payload: + +```json +{ + "query": "incident response runbook", + "limit": 2, + "offset": 0, + "filter": { + "and": [ + { "field": "category", "op": "eq", "value": "operations" }, + { "field": "rating", "op": "gte", "value": 4 } + ] + }, + "return_fields": ["title", "content", "category", "rating"] +} +``` + +Example response payload: + +```json +{ + "search_type": "hybrid", + "offset": 0, + "limit": 2, + "results": [ + { + "id": "knowledge:runbook:eu-failover", + "score": 0.82, + "score_type": "hybrid_score", + "record": { + "title": "EU failover runbook", + "content": "Restore traffic after a regional failover.", + "category": "operations", + "rating": 5 + } + } + ] +} +``` + +Notes: + +- `search_type` is response metadata, not a request argument +- when `return_fields` is omitted, RedisVL MCP returns all non-vector fields +- returning the configured vector field is rejected +- `filter` accepts either a raw string or a JSON DSL object + +### `upsert-records` + +Arguments: + +- `records` +- `id_field` +- `skip_embedding_if_present` + +Example request payload: + +```json +{ + "records": [ + { + "doc_id": "doc-42", + "content": "Updated operational guidance for failover handling.", + "category": "operations", + "rating": 5 + } + ], + "id_field": "doc_id" +} +``` + +Example response payload: + +```json +{ + "status": "success", + "keys_upserted": 1, + "keys": ["knowledge:doc-42"] +} +``` + +Notes: + +- this tool is not registered in read-only mode +- records that need embedding must contain `runtime.default_embed_text_field` +- when `skip_embedding_if_present` is `true`, records that already contain the vector field can skip re-embedding + +## Search Examples + +### Read-Only Vector Search + +Use read-only mode when assistants should only retrieve data: + +```bash +uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml --read-only +``` + 
+With a `search.type` of `vector`, callers send only the query, filters, pagination, and field projection: + +```json +{ + "query": "cache invalidation incident", + "limit": 3, + "return_fields": ["title", "content", "category"] +} +``` + +### Raw String Filter + +Pass a raw Redis filter string through unchanged: + +```json +{ + "query": "science", + "filter": "@category:{science}", + "return_fields": ["content", "category"] +} +``` + +### JSON DSL Filter + +The DSL supports logical operators and type-checked field operators: + +```json +{ + "query": "science", + "filter": { + "and": [ + { "field": "category", "op": "eq", "value": "science" }, + { "field": "rating", "op": "gte", "value": 4 } + ] + }, + "return_fields": ["content", "category", "rating"] +} +``` + +### Pagination and Field Projection + +```json +{ + "query": "science", + "limit": 1, + "offset": 1, + "return_fields": ["content", "category"] +} +``` + +### Hybrid Search With `schema_overrides` + +Use `schema_overrides` when Redis inspection cannot recover complete vector attrs, then keep hybrid behavior in config: + +```yaml +schema_overrides: + fields: + - name: embedding + type: vector + attrs: + algorithm: flat + dims: 1536 + datatype: float32 + distance_metric: cosine + +search: + type: hybrid + params: + text_scorer: BM25STD + stopwords: english + vector_search_method: KNN + combination_method: LINEAR + linear_text_weight: 0.3 +``` + +The MCP caller still sends the same request shape: + +```json +{ + "query": "legacy cache invalidation flow", + "filter": { "field": "category", "op": "eq", "value": "release-notes" }, + "return_fields": ["title", "content", "release_version"] +} +``` + +## Upsert Examples + +### Auto-Embed New Records + +If a record does not include the configured vector field, RedisVL MCP embeds `runtime.default_embed_text_field` and writes the result: + +```json +{ + "records": [ + { + "content": "First upserted document", + "category": "science", + "rating": 5 + }, + { + 
"content": "Second upserted document", + "category": "health", + "rating": 4 + } + ] +} +``` + +### Update Existing Records With `id_field` + +```json +{ + "records": [ + { + "doc_id": "doc-1", + "content": "Updated content", + "category": "engineering", + "rating": 5 + } + ], + "id_field": "doc_id" +} +``` + +### Control Re-Embedding With `skip_embedding_if_present` + +```json +{ + "records": [ + { + "doc_id": "doc-2", + "content": "Existing content", + "category": "science", + "rating": 4 + } + ], + "id_field": "doc_id", + "skip_embedding_if_present": false +} +``` + +Set `skip_embedding_if_present` to `false` when you want the server to regenerate embeddings during upsert. In most cases, the caller should omit the vector field and let the server manage embeddings from `runtime.default_embed_text_field`. + +## Troubleshooting + +### Missing MCP Dependencies + +If `rvl mcp` reports missing optional dependencies, install the MCP extra: + +```bash +pip install redisvl[mcp] +``` + +If the configured vectorizer needs a provider SDK, install that provider extra too. + +### Unsupported Python Runtime + +RedisVL MCP requires Python 3.10 or newer even though the core package supports Python 3.9. Use a newer interpreter for the MCP server process. + +### Configured Redis Index Does Not Exist + +The server only binds to an existing index. Create the index first, then point `indexes..redis_name` at that index name. + +### Missing Required Environment Variables + +YAML values support `${VAR}` and `${VAR:-default}` substitution. Missing required variables fail startup before the server registers tools. + +### Vectorizer Dimension Mismatch + +If the vectorizer dims do not match the configured vector field dims, startup fails. Make sure the embedding model and the effective vector field dimensions are aligned. + +### Hybrid Config Requires Native Runtime Support + +Some hybrid params depend on native hybrid support in Redis and redis-py. 
If your environment does not support that path, remove native-only params such as `knn_ef_runtime` or upgrade Redis and redis-py. diff --git a/docs/user_guide/index.md b/docs/user_guide/index.md index 5d2cf6df..c9be86d2 100644 --- a/docs/user_guide/index.md +++ b/docs/user_guide/index.md @@ -39,7 +39,17 @@ Schema → Index → Load → Query **Solve specific problems.** Task-oriented recipes for LLM extensions, querying, embeddings, optimization, and storage. +++ -LLM Caching • Filtering • Vectorizers • Reranking +LLM Caching • Filtering • MCP • Reranking +::: + +:::{grid-item-card} 🧠 MCP Setup +:link: how_to_guides/mcp +:link-type: doc + +**Expose Redis through MCP.** Run the RedisVL MCP server, configure one existing index, and use search or optional upsert tools. + ++++ +stdio transport • One index • Search and upsert ::: :::{grid-item-card} 💻 CLI Reference diff --git a/docs/user_guide/installation.md b/docs/user_guide/installation.md index cfa1bb32..56704379 100644 --- a/docs/user_guide/installation.md +++ b/docs/user_guide/installation.md @@ -31,6 +31,7 @@ $ pip install redisvl[vertexai] # Google Vertex AI embeddings $ pip install redisvl[bedrock] # AWS Bedrock embeddings # Other optional features +$ pip install redisvl[mcp] # RedisVL MCP server support (Python 3.10+) $ pip install redisvl[langcache] # LangCache managed service integration $ pip install redisvl[sql-redis] # SQL query support ``` @@ -44,7 +45,7 @@ $ pip install redisvl\[openai\] You can install multiple optional dependencies at once: ```bash -$ pip install redisvl[openai,cohere,sentence-transformers] +$ pip install redisvl[mcp,openai,cohere,sentence-transformers] ``` To install **all** optional dependencies at once: @@ -53,6 +54,10 @@ To install **all** optional dependencies at once: $ pip install redisvl[all] ``` +```{note} +The core RedisVL package supports Python 3.9+, but the `redisvl[mcp]` extra requires Python 3.10 or newer because the MCP server depends on `fastmcp`. 
+``` + ## Install RedisVL from Source To install RedisVL from source, clone the repository and install the package using `pip`: diff --git a/pyproject.toml b/pyproject.toml index 934ef7f0..55a347f8 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -35,6 +35,10 @@ dependencies = [ ] [project.optional-dependencies] +mcp = [ + "fastmcp>=2.0.0 ; python_version >= '3.10'", + "pydantic-settings>=2.0", +] mistralai = ["mistralai>=1.0.0"] openai = ["openai>=1.1.0"] nltk = ["nltk>=3.8.1,<4"] diff --git a/redisvl/cli/main.py b/redisvl/cli/main.py index 1353192f..dbed65f3 100644 --- a/redisvl/cli/main.py +++ b/redisvl/cli/main.py @@ -14,6 +14,7 @@ def _usage(): "rvl []\n", "Commands:", "\tindex Index manipulation (create, delete, etc.)", + "\tmcp Run the RedisVL MCP server", "\tversion Obtain the version of RedisVL", "\tstats Obtain statistics about an index", ] @@ -42,6 +43,12 @@ def index(self): Index() exit(0) + def mcp(self): + from redisvl.cli.mcp import MCP + + MCP() + exit(0) + def version(self): Version() exit(0) diff --git a/redisvl/cli/mcp.py b/redisvl/cli/mcp.py new file mode 100644 index 00000000..b013b7ff --- /dev/null +++ b/redisvl/cli/mcp.py @@ -0,0 +1,136 @@ +"""CLI entrypoint for the RedisVL MCP server.""" + +import argparse +import asyncio +import inspect +import sys + + +class _MCPArgumentParser(argparse.ArgumentParser): + """ArgumentParser variant that reports usage errors with exit code 1.""" + + def error(self, message): + self.print_usage(sys.stderr) + self.exit(1, "%s: error: %s\n" % (self.prog, message)) + + +class MCP: + """Command handler for `rvl mcp`.""" + + description = "Expose a configured Redis index to MCP clients for search and optional upsert operations." 
+ epilog = ( + "Use this command when wiring RedisVL into an MCP client.\n\n" + "Example:\n" + " uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp_config.yaml" + ) + usage = "\n".join( + [ + "rvl mcp --config [--read-only]\n", + "\n", + ] + ) + + def __init__(self): + """Parse CLI arguments and run the MCP server command.""" + parser = _MCPArgumentParser( + usage=self.usage, + description=self.description, + epilog=self.epilog, + formatter_class=argparse.RawDescriptionHelpFormatter, + ) + parser.add_argument("--config", help="Path to MCP config file", required=True) + parser.add_argument( + "--read-only", + help="Disable the upsert tool", + action="store_true", + dest="read_only", + default=None, + ) + + args = parser.parse_args(sys.argv[2:]) + self._run(args) + raise SystemExit(0) + + def _run(self, args): + """Validate the environment, build the server, and serve stdio requests.""" + try: + self._ensure_supported_python() + settings_cls, server_cls = self._load_mcp_components() + settings = settings_cls.from_env( + config=args.config, + read_only=args.read_only, + ) + server = server_cls(settings) + self._run_awaitable(self._serve(server)) + except KeyboardInterrupt: + raise SystemExit(0) + except Exception as exc: + self._print_error(str(exc)) + raise SystemExit(1) + + @staticmethod + def _ensure_supported_python(): + """Fail fast when the current interpreter cannot support MCP extras.""" + if sys.version_info < (3, 10): + version = "%s.%s.%s" % ( + sys.version_info.major, + sys.version_info.minor, + sys.version_info.micro, + ) + raise RuntimeError( + "RedisVL MCP CLI requires Python 3.10 or newer. " + "Current runtime is Python %s." % version + ) + + @staticmethod + def _load_mcp_components(): + """Import optional MCP dependencies only on the `rvl mcp` code path.""" + try: + from redisvl.mcp import MCPSettings, RedisVLMCPServer + except (ImportError, ModuleNotFoundError) as exc: + raise RuntimeError( + "RedisVL MCP support requires optional dependencies. 
" + "Install them with `pip install redisvl[mcp]`.\n" + "Original error: %s" % exc + ) + + return MCPSettings, RedisVLMCPServer + + @staticmethod + def _run_awaitable(awaitable): + """Bridge the synchronous CLI entrypoint to async server lifecycle code.""" + return asyncio.run(awaitable) + + async def _serve(self, server): + """Run startup, stdio serving, and shutdown on one event loop.""" + started = False + + try: + await server.startup() + started = True + + # Prefer FastMCP's async transport path so startup, serving, and + # shutdown all share the same event loop. + run_async = getattr(server, "run_async", None) + if callable(run_async): + await run_async(transport="stdio") + else: + result = server.run(transport="stdio") + if inspect.isawaitable(result): + await result + finally: + if started: + try: + result = server.shutdown() + if inspect.isawaitable(result): + await result + except RuntimeError as exc: + # KeyboardInterrupt during stdio shutdown can leave FastMCP + # tearing down after the loop is already closing. 
+ if "Event loop is closed" not in str(exc): + raise + + @staticmethod + def _print_error(message): + """Emit user-facing command errors to stderr.""" + print(message, file=sys.stderr) diff --git a/redisvl/mcp/__init__.py b/redisvl/mcp/__init__.py new file mode 100644 index 00000000..f86933e6 --- /dev/null +++ b/redisvl/mcp/__init__.py @@ -0,0 +1,14 @@ +from redisvl.mcp.config import MCPConfig, load_mcp_config +from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError, map_exception +from redisvl.mcp.server import RedisVLMCPServer +from redisvl.mcp.settings import MCPSettings + +__all__ = [ + "MCPConfig", + "MCPErrorCode", + "MCPSettings", + "RedisVLMCPError", + "RedisVLMCPServer", + "load_mcp_config", + "map_exception", +] diff --git a/redisvl/mcp/config.py b/redisvl/mcp/config.py new file mode 100644 index 00000000..af7104f5 --- /dev/null +++ b/redisvl/mcp/config.py @@ -0,0 +1,391 @@ +import os +import re +from copy import deepcopy +from pathlib import Path +from typing import Any, Dict, Literal, Optional + +import yaml +from pydantic import BaseModel, ConfigDict, Field, model_validator + +from redisvl.schema.fields import BaseField +from redisvl.schema.schema import IndexSchema + +_ENV_PATTERN = re.compile(r"\$\{([^}:]+)(?::-([^}]*))?\}") + + +class MCPRuntimeConfig(BaseModel): + """Runtime limits and validated field mappings for MCP requests.""" + + text_field_name: str = Field(..., min_length=1) + vector_field_name: str = Field(..., min_length=1) + default_embed_text_field: str = Field(..., min_length=1) + default_limit: int = 10 + max_limit: int = 100 + max_upsert_records: int = 64 + skip_embedding_if_present: bool = True + startup_timeout_seconds: int = 30 + request_timeout_seconds: int = 60 + max_concurrency: int = 16 + + @model_validator(mode="after") + def _validate_limits(self) -> "MCPRuntimeConfig": + """Validate runtime bounds during config load.""" + if self.default_limit <= 0: + raise ValueError("runtime.default_limit must be greater than 0") + if 
self.max_limit < self.default_limit: + raise ValueError( + "runtime.max_limit must be greater than or equal to runtime.default_limit" + ) + if self.max_upsert_records <= 0: + raise ValueError("runtime.max_upsert_records must be greater than 0") + if self.startup_timeout_seconds <= 0: + raise ValueError("runtime.startup_timeout_seconds must be greater than 0") + if self.request_timeout_seconds <= 0: + raise ValueError("runtime.request_timeout_seconds must be greater than 0") + if self.max_concurrency <= 0: + raise ValueError("runtime.max_concurrency must be greater than 0") + return self + + +class MCPVectorizerConfig(BaseModel): + """Vectorizer constructor contract loaded from YAML.""" + + model_config = ConfigDict(populate_by_name=True, extra="allow") + + class_name: str = Field(alias="class", min_length=1) + model: str = Field(..., min_length=1) + + @property + def extra_kwargs(self) -> Dict[str, Any]: + """Return vectorizer kwargs other than the normalized `class` and `model`.""" + return dict(self.model_extra or {}) + + def to_init_kwargs(self) -> Dict[str, Any]: + """Build kwargs suitable for directly instantiating the vectorizer.""" + return {"model": self.model, **self.extra_kwargs} + + +class MCPServerConfig(BaseModel): + """Server-level bootstrap configuration.""" + + redis_url: str = Field(..., min_length=1) + + +class MCPIndexSearchConfig(BaseModel): + """Configured search mode and query tuning for the bound index. + + The MCP request contract only exposes query text, filtering, pagination, and + field projection. Search mode and query-tuning behavior are owned entirely by + YAML config and validated here. 
+ """ + + type: Literal["vector", "fulltext", "hybrid"] + params: Dict[str, Any] = Field(default_factory=dict) + + @model_validator(mode="after") + def _validate_params(self) -> "MCPIndexSearchConfig": + """Reject params that do not belong to the configured search mode.""" + allowed_params = { + "vector": { + "hybrid_policy", + "batch_size", + "ef_runtime", + "epsilon", + "search_window_size", + "use_search_history", + "search_buffer_capacity", + "normalize_vector_distance", + }, + "fulltext": { + "text_scorer", + "stopwords", + "text_weights", + }, + "hybrid": { + "text_scorer", + "stopwords", + "text_weights", + "vector_search_method", + "knn_ef_runtime", + "range_radius", + "range_epsilon", + "combination_method", + "rrf_window", + "rrf_constant", + "linear_text_weight", + }, + } + invalid_keys = sorted(set(self.params) - allowed_params[self.type]) + if invalid_keys: + raise ValueError( + "search.params contains keys incompatible with " + f"search.type '{self.type}': {', '.join(invalid_keys)}" + ) + + if ( + "linear_text_weight" in self.params + and self.params.get("combination_method") != "LINEAR" + ): + raise ValueError( + "search.params.linear_text_weight requires combination_method to be LINEAR" + ) + return self + + def to_query_params(self) -> Dict[str, Any]: + """Return normalized query kwargs exactly as configured.""" + return dict(self.params) + + def validate_runtime_capabilities( + self, *, supports_native_hybrid_search: bool + ) -> None: + """Fail startup when hybrid config depends on native-only FT.SEARCH params.""" + if ( + self.type == "hybrid" + and not supports_native_hybrid_search + and "knn_ef_runtime" in self.params + ): + raise ValueError( + "search.params.knn_ef_runtime requires native hybrid search support" + ) + + +class MCPSchemaOverrideField(BaseModel): + """Allowed schema override fragment for one already-discovered field.""" + + name: str = Field(..., min_length=1) + type: str = Field(..., min_length=1) + path: Optional[str] = None + 
attrs: Dict[str, Any] = Field(default_factory=dict) + + +class MCPSchemaOverrides(BaseModel): + """Optional field-level schema patches used to fill inspection gaps.""" + + fields: list[MCPSchemaOverrideField] = Field(default_factory=list) + + +class MCPIndexBindingConfig(BaseModel): + """The sole configured v1 index binding.""" + + redis_name: str = Field(..., min_length=1) + vectorizer: MCPVectorizerConfig + search: MCPIndexSearchConfig + runtime: MCPRuntimeConfig + schema_overrides: MCPSchemaOverrides = Field(default_factory=MCPSchemaOverrides) + + +class MCPConfig(BaseModel): + """Validated MCP server configuration loaded from YAML.""" + + server: MCPServerConfig + indexes: Dict[str, MCPIndexBindingConfig] + + @model_validator(mode="after") + def _validate_bindings(self) -> "MCPConfig": + """Validate that there is exactly one configured logical binding.""" + if len(self.indexes) != 1: + raise ValueError( + "indexes must contain exactly one configured index binding" + ) + + binding_id = next(iter(self.indexes)) + if not binding_id.strip(): + raise ValueError("indexes binding id must be non-blank") + return self + + @property + def binding_id(self) -> str: + """Return the single logical binding identifier configured for v1.""" + return next(iter(self.indexes)) + + @property + def binding(self) -> MCPIndexBindingConfig: + """Return the sole configured binding.""" + return self.indexes[self.binding_id] + + @property + def runtime(self) -> MCPRuntimeConfig: + """Expose the sole binding's runtime config for phase 1.""" + return self.binding.runtime + + @property + def vectorizer(self) -> MCPVectorizerConfig: + """Expose the sole binding's vectorizer config for phase 1.""" + return self.binding.vectorizer + + @property + def search(self) -> MCPIndexSearchConfig: + """Expose the sole binding's configured search behavior.""" + return self.binding.search + + @property + def redis_name(self) -> str: + """Return the existing Redis index name that must be inspected at 
startup.""" + return self.binding.redis_name + + def inspected_schema_from_index_info( + self, index_info: Dict[str, Any] + ) -> Dict[str, Any]: + """Build a schema dict from FT.INFO while preserving discovered field identity. + + RedisVL's generic FT.INFO conversion omits vector fields when their attrs are + incomplete on older Redis versions. MCP needs those field identities to survive + so schema overrides can patch the missing attrs during startup. + """ + from redisvl.redis.connection import convert_index_info_to_schema + + schema_dict = convert_index_info_to_schema(index_info) + discovered_fields = { + field["name"]: field + for field in schema_dict.get("fields", []) + if isinstance(field, dict) and "name" in field + } + + storage_type = index_info["index_definition"][1].lower() + for raw_field in index_info.get("attributes", []): + name = raw_field[1] if storage_type == "hash" else raw_field[3] + if name in discovered_fields: + continue + + field = { + "name": name, + "type": str(raw_field[5]).lower(), + } + if storage_type == "json": + field["path"] = raw_field[1] + + # Keep discovered field identity even when FT.INFO omitted attrs. 
+ schema_dict.setdefault("fields", []).append(field) + + return schema_dict + + def merge_schema_overrides( + self, inspected_schema: Dict[str, Any] + ) -> Dict[str, Any]: + """Apply validated schema overrides without allowing identity changes.""" + merged_schema = deepcopy(inspected_schema) + merged_fields = merged_schema.setdefault("fields", []) + discovered_fields = { + field["name"]: field + for field in merged_fields + if isinstance(field, dict) and "name" in field + } + + for override in self.binding.schema_overrides.fields: + discovered = discovered_fields.get(override.name) + if discovered is None: + raise ValueError( + f"schema_overrides.fields '{override.name}' not found in inspected schema" + ) + + discovered_type = str(discovered.get("type", "")).lower() + override_type = override.type.lower() + if discovered_type != override_type: + raise ValueError( + f"schema_overrides.fields '{override.name}' cannot change discovered field type" + ) + + discovered_path = discovered.get("path") + if override.path is not None and override.path != discovered_path: + raise ValueError( + f"schema_overrides.fields '{override.name}' cannot change discovered field path" + ) + + if override.attrs: + merged_attrs = dict(discovered.get("attrs", {})) + merged_attrs.update(override.attrs) + discovered["attrs"] = merged_attrs + + return merged_schema + + def validate_runtime_mapping(self, schema: IndexSchema) -> None: + """Ensure runtime mappings point at explicit fields in the effective schema.""" + field_names = set(schema.field_names) + + if self.runtime.text_field_name not in field_names: + raise ValueError( + f"runtime.text_field_name '{self.runtime.text_field_name}' not found in schema" + ) + + if self.runtime.default_embed_text_field not in field_names: + raise ValueError( + "runtime.default_embed_text_field " + f"'{self.runtime.default_embed_text_field}' not found in schema" + ) + + vector_field = schema.fields.get(self.runtime.vector_field_name) + if vector_field is 
None: + raise ValueError( + f"runtime.vector_field_name '{self.runtime.vector_field_name}' not found in schema" + ) + if vector_field.type != "vector": + raise ValueError( + f"runtime.vector_field_name '{self.runtime.vector_field_name}' must reference a vector field" + ) + + def to_index_schema(self, inspected_schema: Dict[str, Any]) -> IndexSchema: + """Apply overrides to an inspected schema and validate the effective result.""" + merged_schema = self.merge_schema_overrides(inspected_schema) + schema = IndexSchema.model_validate(merged_schema) + self.validate_runtime_mapping(schema) + return schema + + def get_vector_field(self, schema: IndexSchema) -> BaseField: + """Return the effective vector field from a validated schema.""" + return schema.fields[self.runtime.vector_field_name] + + def get_vector_field_dims(self, schema: IndexSchema) -> Optional[int]: + """Return the effective vector dimensions when the field exposes them.""" + attrs = self.get_vector_field(schema).attrs + return getattr(attrs, "dims", None) + + def validate_search( + self, + *, + supports_native_hybrid_search: bool, + ) -> None: + """Validate configured search behavior against current runtime support.""" + self.search.validate_runtime_capabilities( + supports_native_hybrid_search=supports_native_hybrid_search + ) + + +def _substitute_env(value: Any) -> Any: + """Recursively resolve `${VAR}` and `${VAR:-default}` placeholders.""" + if isinstance(value, dict): + return {key: _substitute_env(item) for key, item in value.items()} + if isinstance(value, list): + return [_substitute_env(item) for item in value] + if not isinstance(value, str): + return value + + def replace(match: re.Match[str]) -> str: + name = match.group(1) + default = match.group(2) + env_value = os.environ.get(name) + if env_value is not None: + return env_value + if default is not None: + return default + raise ValueError(f"Missing required environment variable: {name}") + + return _ENV_PATTERN.sub(replace, value) + + +def 
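`_substitute_env` walks the parsed YAML tree and resolves placeholders before validation. A self-contained sketch of the same `${VAR}` / `${VAR:-default}` behavior — the regex here is an assumed equivalent of the module-level `_ENV_PATTERN`, which is defined elsewhere in the file:

```python
import os
import re

# Assumed equivalent of the module-level _ENV_PATTERN.
ENV_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)(?::-([^}]*))?\}")


def substitute_env(value):
    """Recursively resolve ${VAR} and ${VAR:-default} placeholders."""
    if isinstance(value, dict):
        return {key: substitute_env(item) for key, item in value.items()}
    if isinstance(value, list):
        return [substitute_env(item) for item in value]
    if not isinstance(value, str):
        return value

    def replace(match):
        name, default = match.group(1), match.group(2)
        env_value = os.environ.get(name)
        if env_value is not None:
            return env_value
        if default is not None:  # "" is a legal default, so compare to None
            return default
        raise ValueError(f"Missing required environment variable: {name}")

    return ENV_PATTERN.sub(replace, value)
```

Set variables win over `:-` defaults, and an unset variable with no default fails fast rather than producing a half-substituted config.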
load_mcp_config(path: str) -> MCPConfig: + """Load, substitute, and validate the MCP YAML configuration file.""" + config_path = Path(path).expanduser() + if not config_path.exists(): + raise FileNotFoundError(f"MCP config file {path} does not exist") + + try: + with config_path.open("r", encoding="utf-8") as file: + raw_data = yaml.safe_load(file) + except yaml.YAMLError as exc: + raise ValueError(f"Invalid MCP config YAML: {exc}") from exc + + if not isinstance(raw_data, dict): + raise ValueError("Invalid MCP config YAML: root document must be a mapping") + + substituted = _substitute_env(raw_data) + return MCPConfig.model_validate(substituted) diff --git a/redisvl/mcp/errors.py b/redisvl/mcp/errors.py new file mode 100644 index 00000000..6befad3b --- /dev/null +++ b/redisvl/mcp/errors.py @@ -0,0 +1,70 @@ +import asyncio +from enum import Enum +from typing import Any, Dict, Optional + +from pydantic import ValidationError +from redis.exceptions import RedisError + +from redisvl.exceptions import RedisSearchError + + +class MCPErrorCode(str, Enum): + """Stable internal error codes exposed by the MCP framework.""" + + INVALID_REQUEST = "invalid_request" + INVALID_FILTER = "invalid_filter" + DEPENDENCY_MISSING = "dependency_missing" + BACKEND_UNAVAILABLE = "backend_unavailable" + INTERNAL_ERROR = "internal_error" + + +class RedisVLMCPError(Exception): + """Framework-facing exception carrying a stable MCP error contract.""" + + def __init__( + self, + message: str, + *, + code: MCPErrorCode, + retryable: bool, + metadata: Optional[Dict[str, Any]] = None, + ) -> None: + super().__init__(message) + self.code = code + self.retryable = retryable + self.metadata = metadata or {} + + +def map_exception(exc: Exception) -> RedisVLMCPError: + """Map framework exceptions into deterministic MCP-facing exceptions.""" + if isinstance(exc, RedisVLMCPError): + return exc + + if isinstance(exc, (ValidationError, ValueError, FileNotFoundError)): + return RedisVLMCPError( + str(exc), 
+ code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + + if isinstance(exc, ImportError): + return RedisVLMCPError( + str(exc), + code=MCPErrorCode.DEPENDENCY_MISSING, + retryable=False, + ) + + if isinstance( + exc, (TimeoutError, asyncio.TimeoutError, RedisSearchError, RedisError) + ): + return RedisVLMCPError( + str(exc), + code=MCPErrorCode.BACKEND_UNAVAILABLE, + retryable=True, + ) + + return RedisVLMCPError( + str(exc), + code=MCPErrorCode.INTERNAL_ERROR, + retryable=False, + ) diff --git a/redisvl/mcp/filters.py b/redisvl/mcp/filters.py new file mode 100644 index 00000000..af5eef3f --- /dev/null +++ b/redisvl/mcp/filters.py @@ -0,0 +1,236 @@ +from typing import Any, Dict, Iterable, List, Optional, Union + +from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError +from redisvl.query.filter import FilterExpression, Num, Tag, Text +from redisvl.schema import IndexSchema + + +def parse_filter( + value: Optional[Union[str, Dict[str, Any]]], schema: IndexSchema +) -> Optional[Union[str, FilterExpression]]: + """Parse an MCP filter value into a RedisVL filter representation.""" + if value is None: + return None + if isinstance(value, str): + return value + if not isinstance(value, dict): + raise RedisVLMCPError( + "filter must be a string or object", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + return _parse_expression(value, schema) + + +def _parse_expression(value: Dict[str, Any], schema: IndexSchema) -> FilterExpression: + logical_keys = [key for key in ("and", "or", "not") if key in value] + if logical_keys: + if len(logical_keys) != 1 or len(value) != 1: + raise RedisVLMCPError( + "logical filter objects must contain exactly one operator", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + logical_key = logical_keys[0] + if logical_key == "not": + child = value["not"] + if not isinstance(child, dict): + raise RedisVLMCPError( + "not filter must wrap a single object expression", + code=MCPErrorCode.INVALID_FILTER, + 
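`map_exception` gives every tool the same deterministic error taxonomy. A condensed, dependency-free sketch of that precedence — the redis-py and pydantic exception branches are omitted so the sketch stays self-contained:

```python
import asyncio
from enum import Enum


class ErrorCode(str, Enum):
    """Mirrors the MCPErrorCode values used by the mapping."""
    INVALID_REQUEST = "invalid_request"
    DEPENDENCY_MISSING = "dependency_missing"
    BACKEND_UNAVAILABLE = "backend_unavailable"
    INTERNAL_ERROR = "internal_error"


def classify(exc: Exception) -> tuple[ErrorCode, bool]:
    """Return (code, retryable) following the same branch order as map_exception."""
    if isinstance(exc, (ValueError, FileNotFoundError)):
        return ErrorCode.INVALID_REQUEST, False
    if isinstance(exc, ImportError):
        return ErrorCode.DEPENDENCY_MISSING, False
    if isinstance(exc, (TimeoutError, asyncio.TimeoutError)):
        return ErrorCode.BACKEND_UNAVAILABLE, True  # only backend failures retry
    return ErrorCode.INTERNAL_ERROR, False
```

Note that only the backend-unavailable bucket is marked retryable; everything else signals a caller or deployment problem that retrying would not fix.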
retryable=False, + ) + return FilterExpression(f"(-({str(_parse_expression(child, schema))}))") + + children = value[logical_key] + if not isinstance(children, list) or not children: + raise RedisVLMCPError( + f"{logical_key} filter must contain a non-empty array", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + expressions: List[FilterExpression] = [] + for child in children: + if not isinstance(child, dict): + raise RedisVLMCPError( + "logical filter children must be objects", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + expressions.append(_parse_expression(child, schema)) + + combined = expressions[0] + for child in expressions[1:]: + combined = combined & child if logical_key == "and" else combined | child + return combined + + field_name = value.get("field") + op = value.get("op") + if not isinstance(field_name, str) or not field_name.strip(): + raise RedisVLMCPError( + "filter.field must be a non-empty string", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + if not isinstance(op, str) or not op.strip(): + raise RedisVLMCPError( + "filter.op must be a non-empty string", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + field = schema.fields.get(field_name) + if field is None: + raise RedisVLMCPError( + f"Unknown filter field: {field_name}", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + normalized_op = op.lower() + if normalized_op == "exists": + return FilterExpression(f"(-ismissing(@{field_name}))") + + if "value" not in value: + raise RedisVLMCPError( + "filter.value is required for this operator", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + operand = value["value"] + if field.type == "tag": + return _parse_tag_expression(field_name, normalized_op, operand) + if field.type == "text": + return _parse_text_expression(field_name, normalized_op, operand) + if field.type == "numeric": + return _parse_numeric_expression(field_name, normalized_op, operand) + + raise 
RedisVLMCPError( + f"Unsupported filter field type for {field_name}: {field.type}", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + +def _parse_tag_expression(field_name: str, op: str, operand: Any) -> FilterExpression: + field = Tag(field_name) + if op == "eq": + return field == _require_string(operand, field_name, op) + if op == "ne": + return field != _require_string(operand, field_name, op) + if op == "in": + return field == _require_string_list(operand, field_name, op) + if op == "like": + return field % _require_string(operand, field_name, op) + raise RedisVLMCPError( + f"Unsupported operator '{op}' for tag field '{field_name}'", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + +def _parse_text_expression(field_name: str, op: str, operand: Any) -> FilterExpression: + field = Text(field_name) + if op == "eq": + return field == _require_string(operand, field_name, op) + if op == "ne": + return field != _require_string(operand, field_name, op) + if op == "like": + return field % _require_string(operand, field_name, op) + if op == "in": + return _combine_or( + [field == item for item in _require_string_list(operand, field_name, op)] + ) + raise RedisVLMCPError( + f"Unsupported operator '{op}' for text field '{field_name}'", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + +def _parse_numeric_expression( + field_name: str, op: str, operand: Any +) -> FilterExpression: + field = Num(field_name) + if op == "eq": + return field == _require_number(operand, field_name, op) + if op == "ne": + return field != _require_number(operand, field_name, op) + if op == "gt": + return field > _require_number(operand, field_name, op) + if op == "gte": + return field >= _require_number(operand, field_name, op) + if op == "lt": + return field < _require_number(operand, field_name, op) + if op == "lte": + return field <= _require_number(operand, field_name, op) + if op == "in": + return _combine_or( + [field == item for item in 
_require_number_list(operand, field_name, op)] + ) + raise RedisVLMCPError( + f"Unsupported operator '{op}' for numeric field '{field_name}'", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + +def _combine_or(expressions: Iterable[FilterExpression]) -> FilterExpression: + expression_list = list(expressions) + if not expression_list: + raise RedisVLMCPError( + "in operator requires a non-empty array", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + + combined = expression_list[0] + for expression in expression_list[1:]: + combined = combined | expression + return combined + + +def _require_string(value: Any, field_name: str, op: str) -> str: + if not isinstance(value, str) or not value: + raise RedisVLMCPError( + f"filter value for field '{field_name}' and operator '{op}' must be a non-empty string", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + return value + + +def _require_string_list(value: Any, field_name: str, op: str) -> List[str]: + if not isinstance(value, list) or not value: + raise RedisVLMCPError( + f"filter value for field '{field_name}' and operator '{op}' must be a non-empty array", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + strings = [_require_string(item, field_name, op) for item in value] + return strings + + +def _require_number(value: Any, field_name: str, op: str) -> Union[int, float]: + if isinstance(value, bool) or not isinstance(value, (int, float)): + raise RedisVLMCPError( + f"filter value for field '{field_name}' and operator '{op}' must be numeric", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + return value + + +def _require_number_list( + value: Any, field_name: str, op: str +) -> List[Union[int, float]]: + if not isinstance(value, list) or not value: + raise RedisVLMCPError( + f"filter value for field '{field_name}' and operator '{op}' must be a non-empty array", + code=MCPErrorCode.INVALID_FILTER, + retryable=False, + ) + return [_require_number(item, 
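The filter grammar parsed above — logical `and`/`or`/`not` wrappers around `{field, op, value}` leaves — can be illustrated with a toy translator. The Redis query syntax below is simplified (no escaping, only a few operators) and is not the exact output RedisVL's `Tag`/`Num`/`Text` helpers produce:

```python
def to_redis_filter(node: dict, field_types: dict) -> str:
    """Toy translation of the MCP filter object grammar into Redis query syntax."""
    if "and" in node:
        return "(" + " ".join(to_redis_filter(c, field_types) for c in node["and"]) + ")"
    if "or" in node:
        return "(" + " | ".join(to_redis_filter(c, field_types) for c in node["or"]) + ")"
    if "not" in node:
        return f"(-{to_redis_filter(node['not'], field_types)})"

    field, op, value = node["field"], node["op"], node.get("value")
    ftype = field_types[field]
    if ftype == "tag" and op == "eq":
        return f"@{field}:{{{value}}}"
    if ftype == "numeric":
        ranges = {
            "gte": f"[{value} +inf]",
            "lte": f"[-inf {value}]",
            "eq": f"[{value} {value}]",
        }
        return f"@{field}:{ranges[op]}"
    raise ValueError(f"unsupported: {ftype}/{op}")
```

A combined request such as "tag equals scifi AND year at least 2000" then flattens to a single conjunctive expression.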
field_name, op) for item in value] diff --git a/redisvl/mcp/server.py b/redisvl/mcp/server.py new file mode 100644 index 00000000..a2fb2bd6 --- /dev/null +++ b/redisvl/mcp/server.py @@ -0,0 +1,195 @@ +import asyncio +from importlib import import_module +from typing import Any, Awaitable, Optional, Type + +from redis import __version__ as redis_py_version + +from redisvl.exceptions import RedisSearchError +from redisvl.index import AsyncSearchIndex +from redisvl.mcp.config import MCPConfig, load_mcp_config +from redisvl.mcp.settings import MCPSettings +from redisvl.mcp.tools.search import register_search_tool +from redisvl.mcp.tools.upsert import register_upsert_tool +from redisvl.redis.connection import RedisConnectionFactory, is_version_gte +from redisvl.schema import IndexSchema + +try: + from fastmcp import FastMCP +except ImportError: + + class FastMCP: # type: ignore[no-redef] + """Import-safe stand-in used when the optional MCP SDK is unavailable.""" + + def __init__(self, *args, **kwargs): + self.args = args + self.kwargs = kwargs + + +def resolve_vectorizer_class(class_name: str) -> Type[Any]: + """Resolve a vectorizer class from the public RedisVL vectorizer module.""" + vectorize_module = import_module("redisvl.utils.vectorize") + try: + return getattr(vectorize_module, class_name) + except AttributeError as exc: + raise ValueError(f"Unknown vectorizer class: {class_name}") from exc + + +class RedisVLMCPServer(FastMCP): + """MCP server exposing RedisVL capabilities for one existing Redis index.""" + + def __init__(self, settings: MCPSettings): + """Create a server shell with lazy config, index, and vectorizer state.""" + super().__init__("redisvl") + self.mcp_settings = settings + self.config: Optional[MCPConfig] = None + self._index: Optional[AsyncSearchIndex] = None + self._vectorizer: Optional[Any] = None + self._semaphore: Optional[asyncio.Semaphore] = None + self._tools_registered = False + + async def startup(self) -> None: + """Load config, inspect 
the configured index, and initialize dependencies.""" + self.config = load_mcp_config(self.mcp_settings.config) + self._semaphore = asyncio.Semaphore(self.config.runtime.max_concurrency) + timeout = self.config.runtime.startup_timeout_seconds + client = None + + try: + client = await asyncio.wait_for( + RedisConnectionFactory._get_aredis_connection( + redis_url=self.config.server.redis_url + ), + timeout=timeout, + ) + await asyncio.wait_for(client.info("server"), timeout=timeout) + + try: + index_info = await asyncio.wait_for( + AsyncSearchIndex._info(self.config.redis_name, client), + timeout=timeout, + ) + except RedisSearchError as exc: + if self._is_missing_index_error(exc): + raise ValueError( + f"Configured Redis index '{self.config.redis_name}' does not exist" + ) from exc + raise + + inspected_schema = self.config.inspected_schema_from_index_info(index_info) + effective_schema = self.config.to_index_schema(inspected_schema) + self._index = AsyncSearchIndex(schema=effective_schema, redis_client=client) + # The server acquired this client explicitly during startup, so hand + # ownership to the index for a single shutdown path. 
+ self._index._owns_redis_client = True + self.config.validate_search( + supports_native_hybrid_search=await self.supports_native_hybrid_search(), + ) + + self._vectorizer = await asyncio.wait_for( + asyncio.to_thread(self._build_vectorizer), + timeout=timeout, + ) + self._validate_vectorizer_dims(effective_schema) + self._register_tools() + except Exception: + if self._index is not None: + await self.shutdown() + elif client is not None: + await client.aclose() + raise + + async def shutdown(self) -> None: + """Release owned vectorizer and Redis resources.""" + vectorizer = self._vectorizer + self._vectorizer = None + try: + if vectorizer is not None: + aclose = getattr(vectorizer, "aclose", None) + close = getattr(vectorizer, "close", None) + if callable(aclose): + await aclose() + elif callable(close): + close() + finally: + if self._index is not None: + index = self._index + self._index = None + await index.disconnect() + + async def get_index(self) -> AsyncSearchIndex: + """Return the initialized async index or fail if startup has not run.""" + if self._index is None: + raise RuntimeError("MCP server has not been started") + return self._index + + async def get_vectorizer(self) -> Any: + """Return the initialized vectorizer or fail if startup has not run.""" + if self._vectorizer is None: + raise RuntimeError("MCP server has not been started") + return self._vectorizer + + async def run_guarded(self, operation_name: str, awaitable: Awaitable[Any]) -> Any: + """Run a coroutine under the configured concurrency and timeout limits.""" + del operation_name + if self.config is None or self._semaphore is None: + raise RuntimeError("MCP server has not been started") + + async with self._semaphore: + return await asyncio.wait_for( + awaitable, + timeout=self.config.runtime.request_timeout_seconds, + ) + + def _build_vectorizer(self) -> Any: + """Instantiate the configured vectorizer class from validated config.""" + if self.config is None: + raise RuntimeError("MCP 
server config not loaded") + + vectorizer_class = resolve_vectorizer_class(self.config.vectorizer.class_name) + return vectorizer_class(**self.config.vectorizer.to_init_kwargs()) + + def _validate_vectorizer_dims(self, schema: IndexSchema) -> None: + """Fail startup when vectorizer dimensions disagree with schema dimensions.""" + if self.config is None or self._vectorizer is None: + return + + configured_dims = self.config.get_vector_field_dims(schema) + actual_dims = getattr(self._vectorizer, "dims", None) + if ( + configured_dims is not None + and actual_dims is not None + and configured_dims != actual_dims + ): + raise ValueError( + f"Vectorizer dims {actual_dims} do not match configured vector field dims {configured_dims}" + ) + + async def supports_native_hybrid_search(self) -> bool: + """Return whether the current runtime supports Redis native hybrid search.""" + if self._index is None: + raise RuntimeError("MCP server has not been started") + if not is_version_gte(redis_py_version, "7.1.0"): + return False + + client = await self._index._get_client() + info = await client.info("server") + if not is_version_gte(info.get("redis_version", "0.0.0"), "8.4.0"): + return False + + return hasattr(client.ft(self._index.schema.index.name), "hybrid_search") + + def _register_tools(self) -> None: + """Register MCP tools once the server is ready.""" + if self._tools_registered or not hasattr(self, "tool"): + return + + register_search_tool(self) + if not self.mcp_settings.read_only: + register_upsert_tool(self) + self._tools_registered = True + + @staticmethod + def _is_missing_index_error(exc: RedisSearchError) -> bool: + """Detect the Redis search errors that mean the configured index is absent.""" + message = str(exc).lower() + return "unknown index name" in message or "no such index" in message diff --git a/redisvl/mcp/settings.py b/redisvl/mcp/settings.py new file mode 100644 index 00000000..14aca88d --- /dev/null +++ b/redisvl/mcp/settings.py @@ -0,0 +1,41 @@ +from 
typing import Any, Optional, cast + +from pydantic import Field +from pydantic_settings import BaseSettings, SettingsConfigDict + + +class MCPSettings(BaseSettings): + """Environment-backed settings for bootstrapping the MCP server.""" + + model_config = SettingsConfigDict( + env_prefix="REDISVL_MCP_", + extra="ignore", + ) + + config: str = Field(..., min_length=1) + read_only: bool = False + tool_search_description: Optional[str] = None + tool_upsert_description: Optional[str] = None + + @classmethod + def from_env( + cls, + *, + config: Optional[str] = None, + read_only: Optional[bool] = None, + tool_search_description: Optional[str] = None, + tool_upsert_description: Optional[str] = None, + ) -> "MCPSettings": + """Build settings from explicit overrides plus `REDISVL_MCP_*` env vars.""" + overrides: dict[str, object] = {} + if config is not None: + overrides["config"] = config + if read_only is not None: + overrides["read_only"] = read_only + if tool_search_description is not None: + overrides["tool_search_description"] = tool_search_description + if tool_upsert_description is not None: + overrides["tool_upsert_description"] = tool_upsert_description + + # `BaseSettings` fills any missing fields from the configured env prefix. 
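`MCPSettings.from_env` lets explicit keyword overrides win over `REDISVL_MCP_*` environment variables. The precedence can be sketched without `pydantic-settings`; the field list and resolution below are illustrative, not the real settings model:

```python
import os

ENV_PREFIX = "REDISVL_MCP_"


def resolve_settings(**overrides):
    """Explicit overrides win; missing fields fall back to prefixed env vars."""
    resolved = {}
    for key in ("config", "read_only"):
        if overrides.get(key) is not None:
            resolved[key] = overrides[key]  # explicit caller value wins
        else:
            env_value = os.environ.get(ENV_PREFIX + key.upper())
            if env_value is not None:
                resolved[key] = env_value
    return resolved
```

In the real class, `BaseSettings` performs this fallback (plus type coercion and the required-field check on `config`) automatically via `env_prefix`.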
+ return cls(**cast(dict[str, Any], overrides)) diff --git a/redisvl/mcp/tools/__init__.py b/redisvl/mcp/tools/__init__.py new file mode 100644 index 00000000..40e0a59e --- /dev/null +++ b/redisvl/mcp/tools/__init__.py @@ -0,0 +1,4 @@ +from redisvl.mcp.tools.search import search_records +from redisvl.mcp.tools.upsert import upsert_records + +__all__ = ["search_records", "upsert_records"] diff --git a/redisvl/mcp/tools/search.py b/redisvl/mcp/tools/search.py new file mode 100644 index 00000000..ae59c783 --- /dev/null +++ b/redisvl/mcp/tools/search.py @@ -0,0 +1,388 @@ +import asyncio +import inspect +from typing import Any, Optional, Union + +from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError, map_exception +from redisvl.mcp.filters import parse_filter +from redisvl.query import AggregateHybridQuery, HybridQuery, TextQuery, VectorQuery + +DEFAULT_SEARCH_DESCRIPTION = "Search records in the configured Redis index." + +_NATIVE_HYBRID_DEFAULTS = { + "combination_method": "LINEAR", + "linear_text_weight": 0.3, +} + + +def _validate_request( + *, + query: str, + limit: Optional[int], + offset: int, + return_fields: Optional[list[str]], + server: Any, + index: Any, +) -> tuple[int, list[str]]: + """Validate a `search-records` request and resolve default projection. + + The MCP caller can only supply query text, pagination, filters, and return + fields. Search mode and tuning are sourced from config, so this validation + step focuses only on the public request contract. 
+ """ + + runtime = server.config.runtime + + if not isinstance(query, str) or not query.strip(): + raise RedisVLMCPError( + "query must be a non-empty string", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + + effective_limit = runtime.default_limit if limit is None else limit + if not isinstance(effective_limit, int) or effective_limit <= 0: + raise RedisVLMCPError( + "limit must be greater than 0", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + if effective_limit > runtime.max_limit: + raise RedisVLMCPError( + f"limit must be less than or equal to {runtime.max_limit}", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + if not isinstance(offset, int) or offset < 0: + raise RedisVLMCPError( + "offset must be greater than or equal to 0", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + + schema_fields = set(index.schema.field_names) + vector_field_name = runtime.vector_field_name + + if return_fields is None: + fields = [ + field_name + for field_name in index.schema.field_names + if field_name != vector_field_name + ] + else: + if not isinstance(return_fields, list): + raise RedisVLMCPError( + "return_fields must be a list of field names", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + fields = [] + for field_name in return_fields: + if not isinstance(field_name, str) or not field_name: + raise RedisVLMCPError( + "return_fields must contain non-empty strings", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + if field_name not in schema_fields: + raise RedisVLMCPError( + f"Unknown return field '{field_name}'", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + if field_name == vector_field_name: + raise RedisVLMCPError( + f"Vector field '{vector_field_name}' cannot be returned", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + fields.append(field_name) + + return effective_limit, fields + + +def _normalize_record( + result: dict[str, Any], score_field: str, 
score_type: str +) -> dict[str, Any]: + """Convert one RedisVL result into the stable MCP result shape.""" + score = result.get(score_field) + if score is None and score_field == "score": + score = result.get("__score") + if score is None: + raise RedisVLMCPError( + f"Search result missing expected score field '{score_field}'", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + + record = dict(result) + doc_id = record.pop("id", None) + if doc_id is None: + doc_id = record.pop("__key", None) + if doc_id is None: + doc_id = record.pop("key", None) + if doc_id is None: + raise RedisVLMCPError( + "Search result missing id", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + + for field_name in ( + "vector_distance", + "score", + "__score", + "text_score", + "vector_similarity", + "hybrid_score", + ): + record.pop(field_name, None) + + return { + "id": doc_id, + "score": float(score), + "score_type": score_type, + "record": record, + } + + +async def _embed_query(vectorizer: Any, query: str) -> Any: + """Embed the query text, tolerating vectorizers without real async support.""" + aembed = getattr(vectorizer, "aembed", None) + if callable(aembed): + try: + return await aembed(query) + except NotImplementedError: + pass + embed = getattr(vectorizer, "embed") + if inspect.iscoroutinefunction(embed): + return await embed(query) + return await asyncio.to_thread(embed, query) + + +def _get_configured_search(server: Any) -> tuple[str, dict[str, Any]]: + """Return the configured search mode and normalized query params.""" + search_config = server.config.search + return search_config.type, search_config.to_query_params() + + +def _build_native_hybrid_kwargs( + *, + query: str, + embedding: Any, + runtime: Any, + filter_expression: Any, + return_fields: list[str], + num_results: int, + search_params: dict[str, Any], +) -> dict[str, Any]: + """Build native `HybridQuery` kwargs from MCP config-owned hybrid params.""" + params = {**_NATIVE_HYBRID_DEFAULTS, 
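Every search hit, regardless of mode, is normalized into the same four-key shape before it leaves the tool. A condensed sketch of that shaping (error handling elided; the internal score aliases match the ones stripped above):

```python
def normalize_record(result: dict, score_field: str, score_type: str) -> dict:
    """Condensed version of the MCP result shaping: {id, score, score_type, record}."""
    record = dict(result)
    score = record.get(score_field)
    # Document identity may arrive under several keys depending on query mode.
    doc_id = record.pop("id", None) or record.pop("__key", None) or record.pop("key", None)
    # Strip per-mode internal score aliases so only user fields remain.
    for internal in ("vector_distance", "score", "__score",
                     "text_score", "vector_similarity", "hybrid_score"):
        record.pop(internal, None)
    return {"id": doc_id, "score": float(score),
            "score_type": score_type, "record": record}
```

Keeping `score_type` explicit means a client never has to guess whether `score` is a normalized vector distance, a BM25 text score, or a fused hybrid score.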
**search_params} + linear_text_weight = params.pop("linear_text_weight", None) + if linear_text_weight is not None: + params["linear_alpha"] = linear_text_weight + + return { + "text": query, + "text_field_name": runtime.text_field_name, + "vector": embedding, + "vector_field_name": runtime.vector_field_name, + "filter_expression": filter_expression, + "return_fields": ["__key", *return_fields], + "num_results": num_results, + "yield_text_score_as": "text_score", + "yield_vsim_score_as": "vector_similarity", + "yield_combined_score_as": "hybrid_score", + **params, + } + + +def _build_fallback_hybrid_kwargs( + *, + query: str, + embedding: Any, + runtime: Any, + filter_expression: Any, + return_fields: list[str], + num_results: int, + search_params: dict[str, Any], +) -> dict[str, Any]: + """Build aggregate fallback kwargs while preserving MCP fusion semantics.""" + params = { + key: value + for key, value in search_params.items() + if key in {"text_scorer", "stopwords", "text_weights"} + } + linear_text_weight = search_params.get("linear_text_weight", 0.3) + params["alpha"] = 1 - linear_text_weight + + return { + "text": query, + "text_field_name": runtime.text_field_name, + "vector": embedding, + "vector_field_name": runtime.vector_field_name, + "filter_expression": filter_expression, + "return_fields": ["__key", *return_fields], + "num_results": num_results, + **params, + } + + +async def _build_query( + *, + server: Any, + index: Any, + query: str, + limit: int, + offset: int, + filter_value: Optional[Union[str, dict[str, Any]]], + return_fields: list[str], +) -> tuple[Any, str, str, str]: + """Build the RedisVL query object from configured search mode and params. + + Returns the query instance, the raw score field to read from RedisVL + results, the public MCP `score_type`, and the configured `search_type`. 
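The fallback path inverts `linear_text_weight` into RedisVL's `alpha` (the vector-side weight). The intended linear fusion can be written out explicitly; this formula is an assumed reading of the aggregate query's combination, not taken from its implementation:

```python
def linear_fusion(text_score: float, vector_similarity: float,
                  linear_text_weight: float = 0.3) -> float:
    """Linear score fusion: alpha weights the vector side, alpha = 1 - linear_text_weight."""
    alpha = 1 - linear_text_weight
    return alpha * vector_similarity + (1 - alpha) * text_score
```

With the default weight of 0.3, a hit matching only on text contributes at most 0.3 to the fused score, while a perfect vector match contributes up to 0.7 — the same split the native `LINEAR` combination method is configured to use.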
+ """ + runtime = server.config.runtime + search_type, search_params = _get_configured_search(server) + num_results = limit + offset + filter_expression = parse_filter(filter_value, index.schema) + + if search_type == "vector": + vectorizer = await server.get_vectorizer() + embedding = await _embed_query(vectorizer, query) + vector_kwargs = { + "vector": embedding, + "vector_field_name": runtime.vector_field_name, + "filter_expression": filter_expression, + "return_fields": return_fields, + "num_results": num_results, + **search_params, + } + if "normalize_vector_distance" not in vector_kwargs: + vector_kwargs["normalize_vector_distance"] = True + return ( + VectorQuery(**vector_kwargs), + "vector_distance", + "vector_distance_normalized", + search_type, + ) + + if search_type == "fulltext": + return ( + TextQuery( + text=query, + text_field_name=runtime.text_field_name, + filter_expression=filter_expression, + return_fields=return_fields, + num_results=num_results, + **search_params, + ), + "score", + "text_score", + search_type, + ) + + vectorizer = await server.get_vectorizer() + embedding = await _embed_query(vectorizer, query) + if await server.supports_native_hybrid_search(): + native_query = HybridQuery( + **_build_native_hybrid_kwargs( + query=query, + embedding=embedding, + runtime=runtime, + filter_expression=filter_expression, + return_fields=return_fields, + num_results=num_results, + search_params=search_params, + ) + ) + native_query.postprocessing_config.apply(__key="@__key") + return native_query, "hybrid_score", "hybrid_score", search_type + + fallback_query = AggregateHybridQuery( + **_build_fallback_hybrid_kwargs( + query=query, + embedding=embedding, + runtime=runtime, + filter_expression=filter_expression, + return_fields=return_fields, + num_results=num_results, + search_params=search_params, + ) + ) + return fallback_query, "hybrid_score", "hybrid_score", search_type + + +async def search_records( + server: Any, + *, + query: str, + limit: 
Optional[int] = None, + offset: int = 0, + filter: Optional[Union[str, dict[str, Any]]] = None, + return_fields: Optional[list[str]] = None, +) -> dict[str, Any]: + """Execute `search-records` against the configured Redis index binding.""" + try: + index = await server.get_index() + effective_limit, effective_return_fields = _validate_request( + query=query, + limit=limit, + offset=offset, + return_fields=return_fields, + server=server, + index=index, + ) + built_query, score_field, score_type, search_type = await _build_query( + server=server, + index=index, + query=query.strip(), + limit=effective_limit, + offset=offset, + filter_value=filter, + return_fields=effective_return_fields, + ) + raw_results = await server.run_guarded( + "search-records", + index.query(built_query), + ) + sliced_results = raw_results[offset : offset + effective_limit] + return { + "search_type": search_type, + "offset": offset, + "limit": effective_limit, + "results": [ + _normalize_record(result, score_field, score_type) + for result in sliced_results + ], + } + except RedisVLMCPError: + raise + except Exception as exc: + raise map_exception(exc) from exc + + +def register_search_tool(server: Any) -> None: + """Register the MCP `search-records` tool with its config-owned contract.""" + description = ( + server.mcp_settings.tool_search_description or DEFAULT_SEARCH_DESCRIPTION + ) + + async def search_records_tool( + query: str, + limit: Optional[int] = None, + offset: int = 0, + filter: Optional[Union[str, dict[str, Any]]] = None, + return_fields: Optional[list[str]] = None, + ): + """FastMCP wrapper for the `search-records` tool.""" + return await search_records( + server, + query=query, + limit=limit, + offset=offset, + filter=filter, + return_fields=return_fields, + ) + + server.tool(name="search-records", description=description)(search_records_tool) diff --git a/redisvl/mcp/tools/upsert.py b/redisvl/mcp/tools/upsert.py new file mode 100644 index 00000000..61a18883 --- /dev/null 
+++ b/redisvl/mcp/tools/upsert.py @@ -0,0 +1,309 @@ +import asyncio +import inspect +from typing import Any, Dict, List, Optional + +from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError, map_exception +from redisvl.redis.utils import array_to_buffer +from redisvl.schema.schema import StorageType +from redisvl.schema.validation import validate_object + +DEFAULT_UPSERT_DESCRIPTION = "Upsert records in the configured Redis index." + + +def _validate_request( + *, + server: Any, + records: List[Dict[str, Any]], + id_field: Optional[str], + skip_embedding_if_present: Optional[bool], +) -> bool: + """Validate the public upsert request contract and resolve defaults.""" + runtime = server.config.runtime + + if not isinstance(records, list) or not records: + raise RedisVLMCPError( + "records must be a non-empty list", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + if len(records) > runtime.max_upsert_records: + raise RedisVLMCPError( + "records length must be less than or equal to " + f"{runtime.max_upsert_records}", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + if id_field is not None and (not isinstance(id_field, str) or not id_field): + raise RedisVLMCPError( + "id_field must be a non-empty string when provided", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + + effective_skip_embedding = runtime.skip_embedding_if_present + if skip_embedding_if_present is not None: + if not isinstance(skip_embedding_if_present, bool): + raise RedisVLMCPError( + "skip_embedding_if_present must be a boolean when provided", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + effective_skip_embedding = skip_embedding_if_present + + for record in records: + if not isinstance(record, dict): + raise RedisVLMCPError( + "records must contain only objects", + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + if id_field is not None and id_field not in record: + raise RedisVLMCPError( + "id_field '{id_field}' must exist in 
every record".format( + id_field=id_field + ), + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + + return effective_skip_embedding + + +def _record_needs_embedding( + record: Dict[str, Any], + *, + vector_field_name: str, + skip_embedding_if_present: bool, +) -> bool: + """Determine whether a record requires server-side embedding.""" + return ( + not skip_embedding_if_present + or vector_field_name not in record + or record[vector_field_name] is None + ) + + +def _validate_embed_sources( + records: List[Dict[str, Any]], + *, + embed_text_field: str, + vector_field_name: str, + skip_embedding_if_present: bool, +) -> List[str]: + """Collect embed sources for records that require embedding.""" + contents = [] + for record in records: + if not _record_needs_embedding( + record, + vector_field_name=vector_field_name, + skip_embedding_if_present=skip_embedding_if_present, + ): + continue + + content = record.get(embed_text_field) + if not isinstance(content, str) or not content.strip(): + raise RedisVLMCPError( + "records requiring embedding must include a non-empty " + "'{field}' field".format(field=embed_text_field), + code=MCPErrorCode.INVALID_REQUEST, + retryable=False, + ) + contents.append(content) + + return contents + + +async def _embed_one(vectorizer: Any, content: str) -> List[float]: + """Embed one record, falling back from async to sync implementations.""" + aembed = getattr(vectorizer, "aembed", None) + if callable(aembed): + try: + return await aembed(content) + except NotImplementedError: + pass + + embed = getattr(vectorizer, "embed", None) + if embed is None: + raise AttributeError("Configured vectorizer does not support embed()") + if inspect.iscoroutinefunction(embed): + return await embed(content) + return await asyncio.to_thread(embed, content) + + +async def _embed_many(vectorizer: Any, contents: List[str]) -> List[List[float]]: + """Embed multiple records with batch-first fallbacks.""" + if not contents: + return [] + + aembed_many = 
getattr(vectorizer, "aembed_many", None) + if callable(aembed_many): + try: + return await aembed_many(contents) + except NotImplementedError: + pass + + embed_many = getattr(vectorizer, "embed_many", None) + if callable(embed_many): + if inspect.iscoroutinefunction(embed_many): + return await embed_many(contents) + return await asyncio.to_thread(embed_many, contents) + + embeddings = [] + for content in contents: + embeddings.append(await _embed_one(vectorizer, content)) + return embeddings + + +def _vector_dtype(server: Any, index: Any) -> str: + """Resolve the configured vector field datatype as a lowercase string.""" + field = server.config.get_vector_field(index.schema) + datatype = getattr(field.attrs.datatype, "value", field.attrs.datatype) + return str(datatype).lower() + + +def _validation_schema_for_record( + index: Any, + *, + vector_field_name: str, + record: Dict[str, Any], +) -> Any: + """Use a JSON-shaped schema when validating list vectors for HASH storage.""" + if index.schema.index.storage_type == StorageType.HASH and isinstance( + record.get(vector_field_name), list + ): + schema = index.schema.model_copy(deep=True) + schema.index.storage_type = StorageType.JSON + return schema + return index.schema + + +def _validate_record( + record: Dict[str, Any], *, index: Any, vector_field_name: str +) -> None: + """Validate one record against the schema, allowing HASH list vectors.""" + validate_object( + _validation_schema_for_record( + index, + vector_field_name=vector_field_name, + record=record, + ), + record, + ) + + +def _prepare_record_for_storage( + record: Dict[str, Any], + *, + server: Any, + index: Any, +) -> Dict[str, Any]: + """Validate records before serializing HASH vectors for storage.""" + prepared = dict(record) + vector_field_name = server.config.runtime.vector_field_name + _validate_record(prepared, index=index, vector_field_name=vector_field_name) + + vector_value = prepared.get(vector_field_name) + + if index.schema.index.storage_type 
== StorageType.HASH: + if isinstance(vector_value, list): + prepared[vector_field_name] = array_to_buffer( + vector_value, + _vector_dtype(server, index), + ) + return prepared + + +async def upsert_records( + server: Any, + *, + records: List[Dict[str, Any]], + id_field: Optional[str] = None, + skip_embedding_if_present: Optional[bool] = None, +) -> Dict[str, Any]: + """Execute `upsert-records` against the configured Redis index.""" + try: + index = await server.get_index() + effective_skip_embedding = _validate_request( + server=server, + records=records, + id_field=id_field, + skip_embedding_if_present=skip_embedding_if_present, + ) + # Copy caller-provided records before enriching them with embeddings or + # storage-specific serialization so the MCP tool does not mutate inputs. + prepared_records = [record.copy() for record in records] + runtime = server.config.runtime + for record in prepared_records: + _validate_record( + record, + index=index, + vector_field_name=runtime.vector_field_name, + ) + embed_contents = _validate_embed_sources( + prepared_records, + embed_text_field=runtime.default_embed_text_field, + vector_field_name=runtime.vector_field_name, + skip_embedding_if_present=effective_skip_embedding, + ) + + if embed_contents: + vectorizer = await server.get_vectorizer() + embeddings = await _embed_many(vectorizer, embed_contents) + # Tracks position in the compact embeddings list, which only contains + # vectors for records that still need server-side embedding. 
+ embedding_index = 0 + for record in prepared_records: + if _record_needs_embedding( + record, + vector_field_name=runtime.vector_field_name, + skip_embedding_if_present=effective_skip_embedding, + ): + record[runtime.vector_field_name] = embeddings[embedding_index] + embedding_index += 1 + + loadable_records = [ + _prepare_record_for_storage(record, server=server, index=index) + for record in prepared_records + ] + + try: + keys = await server.run_guarded( + "upsert-records", + index.load(loadable_records, id_field=id_field), + ) + except Exception as exc: + mapped = map_exception(exc) + mapped.metadata["partial_write_possible"] = True + raise mapped + + return { + "status": "success", + "keys_upserted": len(keys), + "keys": keys, + } + except RedisVLMCPError: + raise + except Exception as exc: + raise map_exception(exc) + + +def register_upsert_tool(server: Any) -> None: + """Register the MCP upsert tool on a server-like object.""" + description = ( + server.mcp_settings.tool_upsert_description or DEFAULT_UPSERT_DESCRIPTION + ) + + async def upsert_records_tool( + records: List[Dict[str, Any]], + id_field: Optional[str] = None, + skip_embedding_if_present: Optional[bool] = None, + ): + """FastMCP wrapper for the `upsert-records` tool.""" + return await upsert_records( + server, + records=records, + id_field=id_field, + skip_embedding_if_present=skip_embedding_if_present, + ) + + server.tool(name="upsert-records", description=description)(upsert_records_tool) diff --git a/redisvl/query/filter.py b/redisvl/query/filter.py index 0295568f..f30870f2 100644 --- a/redisvl/query/filter.py +++ b/redisvl/query/filter.py @@ -164,7 +164,7 @@ def __eq__(self, other: Union[List[str], str]) -> "FilterExpression": return FilterExpression(str(self)) @check_operator_misuse - def __ne__(self, other) -> "FilterExpression": + def __ne__(self, other: Union[List[str], str]) -> "FilterExpression": """Create a Tag inequality filter expression. 
Args: @@ -298,7 +298,7 @@ def __eq__(self, other) -> "FilterExpression": return FilterExpression(str(self)) @check_operator_misuse - def __ne__(self, other) -> "FilterExpression": + def __ne__(self, other: GeoRadius) -> "FilterExpression": """Create a geographic filter outside of a specified GeoRadius. Args: @@ -349,11 +349,11 @@ class Num(FilterField): SUPPORTED_VAL_TYPES = (int, float, tuple, type(None)) - def __eq__(self, other: int) -> "FilterExpression": + def __eq__(self, other: Union[int, float]) -> "FilterExpression": """Create a Numeric equality filter expression. Args: - other (int): The value to filter on. + other (Union[int, float]): The value to filter on. .. code-block:: python @@ -364,11 +364,11 @@ def __eq__(self, other: int) -> "FilterExpression": self._set_value(other, self.SUPPORTED_VAL_TYPES, FilterOperator.EQ) return FilterExpression(str(self)) - def __ne__(self, other: int) -> "FilterExpression": + def __ne__(self, other: Union[int, float]) -> "FilterExpression": """Create a Numeric inequality filter expression. Args: - other (int): The value to filter on. + other (Union[int, float]): The value to filter on. .. code-block:: python @@ -380,11 +380,11 @@ def __ne__(self, other: int) -> "FilterExpression": self._set_value(other, self.SUPPORTED_VAL_TYPES, FilterOperator.NE) return FilterExpression(str(self)) - def __gt__(self, other: int) -> "FilterExpression": + def __gt__(self, other: Union[int, float]) -> "FilterExpression": """Create a Numeric greater than filter expression. Args: - other (int): The value to filter on. + other (Union[int, float]): The value to filter on. .. code-block:: python @@ -396,11 +396,11 @@ def __gt__(self, other: int) -> "FilterExpression": self._set_value(other, self.SUPPORTED_VAL_TYPES, FilterOperator.GT) return FilterExpression(str(self)) - def __lt__(self, other: int) -> "FilterExpression": + def __lt__(self, other: Union[int, float]) -> "FilterExpression": """Create a Numeric less than filter expression. 
Args: - other (int): The value to filter on. + other (Union[int, float]): The value to filter on. .. code-block:: python @@ -412,11 +412,11 @@ def __lt__(self, other: int) -> "FilterExpression": self._set_value(other, self.SUPPORTED_VAL_TYPES, FilterOperator.LT) return FilterExpression(str(self)) - def __ge__(self, other: int) -> "FilterExpression": + def __ge__(self, other: Union[int, float]) -> "FilterExpression": """Create a Numeric greater than or equal to filter expression. Args: - other (int): The value to filter on. + other (Union[int, float]): The value to filter on. .. code-block:: python @@ -428,11 +428,11 @@ def __ge__(self, other: int) -> "FilterExpression": self._set_value(other, self.SUPPORTED_VAL_TYPES, FilterOperator.GE) return FilterExpression(str(self)) - def __le__(self, other: int) -> "FilterExpression": + def __le__(self, other: Union[int, float]) -> "FilterExpression": """Create a Numeric less than or equal to filter expression. Args: - other (int): The value to filter on. + other (Union[int, float]): The value to filter on. .. code-block:: python @@ -759,7 +759,9 @@ def _convert_to_timestamp(self, value, end_date=False): raise TypeError(f"Unsupported type for timestamp conversion: {type(value)}") - def __eq__(self, other) -> FilterExpression: + def __eq__( + self, other: Union[datetime.datetime, datetime.date, str, int, float] + ) -> FilterExpression: """ Filter for timestamps equal to the specified value. For date objects (without time), this matches the entire day. 
@@ -774,6 +776,7 @@ def __eq__(self, other) -> FilterExpression: # For date objects, match the entire day if isinstance(other, str): other = datetime.datetime.strptime(other, "%Y-%m-%d").date() + assert isinstance(other, datetime.date) # validate for mypy start = datetime.datetime.combine(other, datetime.time.min).astimezone( datetime.timezone.utc ) @@ -786,7 +789,9 @@ def __eq__(self, other) -> FilterExpression: self._set_value(timestamp, self.SUPPORTED_TYPES, FilterOperator.EQ) return FilterExpression(str(self)) - def __ne__(self, other) -> FilterExpression: + def __ne__( + self, other: Union[datetime.datetime, datetime.date, str, int, float] + ) -> FilterExpression: """ Filter for timestamps not equal to the specified value. For date objects (without time), this excludes the entire day. @@ -801,6 +806,7 @@ def __ne__(self, other) -> FilterExpression: # For date objects, exclude the entire day if isinstance(other, str): other = datetime.datetime.strptime(other, "%Y-%m-%d").date() + assert isinstance(other, datetime.date) # validate for mypy start = datetime.datetime.combine(other, datetime.time.min) end = datetime.datetime.combine(other, datetime.time.max) return self.between(start, end) diff --git a/spec/MCP-production-example.md b/spec/MCP-production-example.md new file mode 100644 index 00000000..16e0739b --- /dev/null +++ b/spec/MCP-production-example.md @@ -0,0 +1,194 @@ +--- +name: redisvl-mcp-production-example +description: Companion production-oriented example for the RedisVL MCP server specification. +metadata: + status: draft + audience: RedisVL maintainers and reviewers + objective: Provide a concrete, production-evaluable usage narrative without bloating the normative MCP specification. +--- + +# RedisVL MCP Production Example + +This document is a companion to [MCP.md](./MCP.md). It is intentionally narrative and example-driven. The normative server contract lives in the main spec. 
+ +## Why This Example Exists + +The MCP specification is easier to evaluate when grounded in a realistic deployment. This example uses a Redis Enterprise customer because that is a strong production reference point, but the same RedisVL MCP design is intended to work with Redis Cloud and open-source Redis Stack instances, including local Docker deployments, provided the required index and Search capabilities already exist. + +## User Story + +As a platform team at a company running Redis Enterprise for internal knowledge retrieval, we want to expose our existing Redis vector indexes through MCP so internal AI assistants can perform low-latency, metadata-filtered search over approved enterprise content without copying data into another vector store or hand-recreating index schemas. + +## Scenario + +An enterprise platform team already operates a Redis-backed knowledge index called `internal_knowledge`. The index contains: + +- operational runbooks +- support knowledge base articles +- release notes +- incident summaries + +The team has already standardized on Redis as the serving layer for retrieval. Multiple internal assistants need access to the same retrieval surface: + +- an engineering support copilot in Slack +- a developer portal assistant +- an incident review assistant + +The platform team does not want each assistant team to: + +- reimplement Redis query logic +- duplicate the index into a separate vector database +- manually re-describe the Redis schema in every client integration + +Instead, they publish one RedisVL MCP server configuration with one approved index binding in v1. The MCP server attaches to an existing index, inspects its schema at startup, and exposes a stable tool contract to AI clients. + +This is intentionally simplified for v1 review. In a larger deployment, the same content domains could reasonably be split across multiple Redis indexes, such as separate bindings for runbooks, support KB content, release notes, or incident history. 
That would create a future need for one MCP server to route across multiple configured index bindings while keeping a coherent tool surface for clients. + +## Why MCP Helps + +MCP gives the platform team a standard tool boundary: + +- AI clients can use the same `search-records` contract. +- The Redis index stays the source of truth for field definitions and search behavior. +- The vectorizer remains explicit and reviewable, which matters when embedding model choice is governed separately from index operations. +- Metadata filters remain available to enforce application-level narrowing such as team, region, product, and severity. +- The MCP surface can stay read-only for assistant clients, which avoids exposing direct write access to the internal knowledge index. + +## Deployment Sketch + +1. Redis already hosts the `internal_knowledge` index. +2. The platform team provisions a small stdio MCP process near the client runtime. +3. The MCP server connects to Redis using a normal Redis URL. +4. At startup, the server inspects `internal_knowledge` and reconstructs the schema. +5. The server applies any small override needed for incomplete vector metadata. +6. The configured vectorizer embeds user queries for vector or hybrid search. +7. Internal assistants call the MCP tool instead of talking to Redis directly. + +This pattern works across: + +- Redis Enterprise in a self-managed production environment +- Redis Cloud instances used by product teams +- open-source Redis Stack, including Docker-based local and CI environments + +The behavioral contract stays the same. The operational controls around networking, auth, and tenancy vary by deployment. 
+ +## Example MCP Config + +```yaml +server: + redis_url: ${REDIS_URL} + +indexes: + knowledge: + redis_name: internal_knowledge + + vectorizer: + class: OpenAITextVectorizer + model: text-embedding-3-small + api_config: + api_key: ${OPENAI_API_KEY} + + schema_overrides: + fields: + - name: embedding + type: vector + attrs: + dims: 1536 + datatype: float32 + + search: + type: hybrid + params: + text_scorer: BM25STD + stopwords: english + vector_search_method: KNN + combination_method: LINEAR + linear_text_weight: 0.3 + + runtime: + text_field_name: content + vector_field_name: embedding + default_embed_text_field: content + default_limit: 8 + max_limit: 25 + skip_embedding_if_present: true + startup_timeout_seconds: 30 + request_timeout_seconds: 45 + max_concurrency: 16 +``` + +Run the server in read-only mode with the CLI flag or environment variable instead of YAML: + +```bash +uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp_config.yaml --read-only +``` + +Why this is realistic: + +- The index already exists and is discovered automatically. +- The v1 config still targets one bound index, but the surrounding YAML shape can grow to multiple bindings later. +- The vectorizer is still configured manually. +- `schema_overrides` is available if Redis inspection does not fully reconstruct vector attrs. +- Runtime field mappings stay explicit so the MCP server does not guess among multiple text-like fields. +- Assistant clients are intentionally limited to read-only retrieval against the internal knowledge index. 
+ +## Example Search Calls + +### Vector search for incident guidance + +Request: + +```json +{ + "query": "How do we mitigate elevated cache miss rate after a regional failover?", + "limit": 5, + "filter": { + "and": [ + { "field": "team", "op": "eq", "value": "platform" }, + { "field": "severity", "op": "in", "value": ["sev1", "sev2"] }, + { "field": "region", "op": "eq", "value": "eu-central" } + ] + }, + "return_fields": ["title", "content", "source_type", "last_reviewed_at"] +} +``` + +Why the enterprise customer cares: + +- Semantic retrieval finds similar operational incidents even when the exact wording differs. +- Filters keep the result set scoped to the right team, severity band, and region. + +### Hybrid search for release-note lookup + +Request: + +```json +{ + "query": "deprecation of legacy cache invalidation flow", + "limit": 3, + "filter": { + "field": "product", + "op": "eq", + "value": "developer-platform" + }, + "return_fields": ["title", "content", "release_version"] +} +``` + +Why the enterprise customer cares: + +- Hybrid search combines exact phrase hits with semantic similarity. +- The same MCP request works whether the server uses native Redis hybrid search or the `AggregateHybridQuery` fallback. +- The assistant can ground answers in specific release-note entries instead of relying on model memory. + +## Evaluation Checklist For Reviewers + +This example should make the value of the MCP design easy to evaluate: + +- The customer already has Redis indexes and wants to reuse them. +- The server discovers index structure instead of forcing duplicate schema definition. +- The vectorizer is still explicit, which keeps embedding behavior auditable. +- The same pattern applies across Enterprise, Cloud, and OSS deployments. +- The assistant-facing MCP surface can remain read-only even if the underlying index is maintained by separate ingestion systems. 
+- The scenario also illustrates why future multi-index support may matter as teams split distinct content domains into separate Redis indexes. +- The MCP layer standardizes how multiple assistants consume the same Redis retrieval system. diff --git a/spec/MCP.md b/spec/MCP.md new file mode 100644 index 00000000..e2fb9320 --- /dev/null +++ b/spec/MCP.md @@ -0,0 +1,768 @@ +--- +name: redisvl-mcp-server-spec +description: Implementation specification for a RedisVL MCP server with deterministic, agent-friendly contracts for development and testing. +metadata: + status: draft + audience: RedisVL maintainers and coding agents + objective: Define a deterministic, testable MCP server contract so agents can implement safely without relying on implicit behavior. +--- + +# RedisVL MCP Server Specification + +## Overview + +This specification defines a Model Context Protocol (MCP) server for RedisVL that allows MCP clients to search and upsert data in an existing Redis index. + +Search behavior is owned by server configuration. MCP clients provide query text, filtering, pagination, and field projection, but do not choose the search mode or runtime tuning parameters. + +The MCP design targets indexes hosted on open-source Redis Stack, Redis Cloud, or Redis Enterprise, provided the required Search capabilities are available for the configured tool behavior. + +The server is designed for stdio transport first and must be runnable via: + +```bash +uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp_config.yaml +``` + +For a production-oriented usage narrative and end-to-end example, see [MCP-production-example.md](./MCP-production-example.md). + +### Goals + +1. Expose configured RedisVL search capabilities (`vector`, `fulltext`, `hybrid`) through stable MCP tools without requiring MCP clients to configure retrieval strategy. +2. Support controlled write access via an upsert tool. +3. 
Automatically reconstruct the index schema from an existing Redis index instead of requiring a full manual schema definition. +4. Keep the vectorizer configuration explicit and user-defined. +5. Provide deterministic contracts for tool inputs, outputs, and errors. +6. Align implementation with existing RedisVL architecture and CLI patterns. + +### Non-Goals (v1) + +1. Multi-index routing in a single server process. +2. Remote transports (SSE/HTTP). +3. Index creation or schema provisioning from MCP config. +4. Delete/count/info tools (future scope). +5. Automatic vectorizer selection from Redis metadata. + +--- + +## Compatibility Matrix + +These are hard compatibility expectations for v1. + +| Component | Requirement | Notes | +|----------|-------------|-------| +| Core RedisVL package | Python `>=3.9.2,<3.15` | Match current project constraints | +| MCP feature | Python `>=3.10,<3.15` | `redisvl[mcp]` may have a stricter floor than the core package | +| RedisVL | current repo version | Server lives inside this package | +| redis-py | `>=5.0,<7.2` | Already required by project | +| FastMCP server SDK | `fastmcp>=2.0.0` | Standalone FastMCP package used for server implementation | +| Redis server | Redis Stack / Redis with Search module | Required for all search modes | +| Hybrid search | Prefer native implementation on Redis `>=8.4.0` with redis-py `>=7.1.0`; otherwise fall back to `AggregateHybridQuery` | Hybrid search remains available across both paths | + +Notes: +- This spec standardizes on the standalone `fastmcp` package for server implementation. It does not assume the official `mcp` package is on a 2.x line. +- Client SDK examples may still use whichever client-side MCP package their ecosystem requires. 
+- Native hybrid support is preferred when available because it aligns with current Redis runtime capabilities, but lack of native support is not a blocker for `indexes.<id>.search.type="hybrid"` when the configured search params remain compatible with the aggregate fallback. + +--- + +## Architecture + +### Module Structure + +```text +redisvl/ +├── mcp/ +│ ├── __init__.py +│ ├── server.py # RedisVLMCPServer +│ ├── settings.py # MCPSettings +│ ├── config.py # Config models + loader + validation +│ ├── errors.py # MCP error mapping helpers +│ ├── filters.py # Filter parser (DSL + raw string handling) +│ └── tools/ +│ ├── __init__.py +│ ├── search.py # search-records +│ └── upsert.py # upsert-records +└── cli/ + ├── main.py # Add `mcp` command dispatch + └── mcp.py # MCP command handler class +``` + +### Dependency Groups + +Add optional extras for explicit install intent. + +```toml +[project.optional-dependencies] +mcp = [ + "fastmcp>=2.0.0", + "pydantic-settings>=2.0", +] +``` + +Notes: +- `fulltext` and both hybrid implementations (`HybridQuery` and `AggregateHybridQuery`) rely on the same query-time stopword handling. If `nltk` is not installed and stopwords are enabled, the server must return a structured dependency error. +- Provider vectorizer dependencies remain provider-specific (`openai`, `cohere`, `vertexai`, etc.). + +--- + +## Configuration + +Configuration is composed from environment + YAML: + +1. `MCPSettings` from env/CLI. +2. YAML file referenced by `config` setting. +3. Env substitution inside YAML with strict validation. + +The normal v1 path is inspection-first: the YAML identifies a single configured index binding, the server discovers that Redis index's schema at startup, and optional overrides patch only discovery gaps. The YAML shape is future-friendly for multi-index support even though v1 allows exactly one configured index.
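The env-substitution step in the composition list above can be sketched with a small resolver for the `${VAR}` / `${VAR:-default}` patterns this spec defines. `resolve_env` is a hypothetical helper for illustration, not the actual config loader:

```python
# Hypothetical sketch of strict ${VAR} / ${VAR:-default} substitution.
# A required variable that is unset fails fast, mirroring startup behavior.
import re

_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)(?::-([^}]*))?\}")


def resolve_env(value: str, env: dict) -> str:
    def replace(match: re.Match) -> str:
        name, default = match.group(1), match.group(2)
        if name in env:
            return env[name]
        if default is not None:
            return default
        raise ValueError(f"required environment variable is unset: {name}")

    return _PATTERN.sub(replace, value)


env = {"REDIS_URL": "redis://localhost:6379"}
print(resolve_env("${REDIS_URL}", env))            # redis://localhost:6379
print(resolve_env("${REDIS_TLS:-disabled}", env))  # disabled
```

Taking the environment as a plain dict (rather than reading `os.environ` directly) keeps the resolver deterministic and easy to test.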
+ +### Environment Variables + +| Variable | Type | Default | Description | +|----------|------|---------|-------------| +| `REDISVL_MCP_CONFIG` | str | required | Path to MCP YAML config | +| `REDISVL_MCP_READ_ONLY` | bool | `false` | If true, do not register upsert tool | +| `REDISVL_MCP_TOOL_SEARCH_DESCRIPTION` | str | default text | MCP tool description override | +| `REDISVL_MCP_TOOL_UPSERT_DESCRIPTION` | str | default text | MCP tool description override | + +### YAML Schema (Normative) + +```yaml +server: + redis_url: redis://localhost:6379 + +indexes: + knowledge: + redis_name: knowledge + + vectorizer: + class: OpenAITextVectorizer + model: text-embedding-3-small + # kwargs passed to vectorizer constructor + # for providers using api_config, pass as nested object: + # api_config: + # api_key: ${OPENAI_API_KEY} + + schema_overrides: + fields: + - name: embedding + type: vector + attrs: + dims: 1536 + datatype: float32 + + search: + type: hybrid + params: + text_scorer: BM25STD + stopwords: english + vector_search_method: KNN + combination_method: LINEAR + linear_text_weight: 0.3 + knn_ef_runtime: 150 + + runtime: + # required explicit field mapping for tool behavior + text_field_name: content + vector_field_name: embedding + default_embed_text_field: content + + # request constraints + default_limit: 10 + max_limit: 100 + max_upsert_records: 64 + + # default overwrite behavior for existing vectors + skip_embedding_if_present: true + + # timeouts + startup_timeout_seconds: 30 + request_timeout_seconds: 60 + + # server-side concurrency guard + max_concurrency: 16 +``` + +### Search Configuration (Normative) + +`indexes.<id>.search` defines the retrieval strategy for the sole bound index in v1. Tool callers must not override this configuration.
+ +Required fields: + +- `type`: `vector` | `fulltext` | `hybrid` +- `params`: optional object whose allowed keys depend on `type` + +Allowed `params` by `type`: + +- `vector` + - `hybrid_policy` + - `batch_size` + - `ef_runtime` + - `epsilon` + - `search_window_size` + - `use_search_history` + - `search_buffer_capacity` + - `normalize_vector_distance` +- `fulltext` + - `text_scorer` + - `stopwords` + - `text_weights` +- `hybrid` + - `text_scorer` + - `stopwords` + - `text_weights` + - `vector_search_method` + - `knn_ef_runtime` + - `range_radius` + - `range_epsilon` + - `combination_method` + - `rrf_window` + - `rrf_constant` + - `linear_text_weight` + +Normalization rules: + +1. `linear_text_weight` is the MCP config's stable meaning for linear hybrid fusion and always represents the text-side weight. +2. When building native `HybridQuery`, the server must pass `linear_text_weight` through as `linear_alpha`. +3. When building `AggregateHybridQuery`, the server must translate `linear_text_weight` to `alpha = 1 - linear_text_weight` so the config meaning does not change across implementations. +4. `linear_text_weight` is only valid when `combination_method` is `LINEAR`. +5. Hybrid configs using FT.SEARCH-only runtime params (`knn_ef_runtime`) must fail startup if the environment only supports the aggregate fallback path. + +### Schema Discovery and Override Rules + +1. `server.redis_url` is required. +2. `indexes` is required and must contain exactly one configured binding in v1. +3. The `indexes` mapping key is the logical binding id. It is stable for future routing and does not need to equal the Redis index name. +4. `indexes.<id>.redis_name` is required and must refer to an existing Redis index. +5. The server must reconstruct the base schema from Redis metadata, preferably via existing RedisVL inspection primitives built on `FT.INFO`. +6. `indexes.<id>.vectorizer` remains fully manual and is never inferred from Redis index metadata in v1. +7.
`indexes..schema_overrides` is optional and exists only to supplement incomplete inspection data. +8. `indexes..search.type` is required and is authoritative for query construction. +9. `indexes..search.params` is optional but, when present, may only contain keys valid for the configured `search.type`. +10. Tool requests implicitly target the sole configured index binding and its configured search behavior in v1. No `index`, `search_type`, or search-tuning request parameters are exposed. +11. Tool callers may control only query text, filtering, pagination, and returned fields for `search-records`. +12. Discovered index identity is authoritative: + - `indexes..redis_name` + - storage type + - field identity (`name`, `type`, and `path` when applicable) +13. Overrides may: + - add missing attrs for a discovered field + - replace discovered attrs for a discovered field when needed for compatibility +14. Overrides must not: + - redefine index identity + - add entirely new fields that do not exist in the inspected index + - change a discovered field's `name`, `type`, or `path` +15. Override conflicts must fail startup with a config error. + +### Env Substitution Rules + +Supported patterns in YAML values: +- `${VAR}`: required variable. Fail startup if unset. +- `${VAR:-default}`: optional variable with fallback. + +Unresolved required vars must fail startup with config error. + +### Config Validation Rules + +Server startup must fail fast if: +1. Config file missing/unreadable. +2. YAML invalid. +3. `server.redis_url` missing or blank. +4. `indexes` missing, empty, or containing more than one entry. +5. The configured binding id is blank. +6. `indexes..redis_name` missing or blank. +7. `indexes..search.type` missing or not one of `vector`, `fulltext`, `hybrid`. +8. `indexes..search.params` contains keys that are incompatible with the configured `search.type`. +9. `indexes..search.params.linear_text_weight` is present without `combination_method: LINEAR`. +10. 
A hybrid config relies on FT.SEARCH-only runtime params and the environment only supports the aggregate fallback path. +11. The referenced Redis index does not exist. +12. Schema inspection fails and no valid `indexes..schema_overrides` resolve the issue. +13. `indexes..runtime.text_field_name` not in the effective schema. +14. `indexes..runtime.vector_field_name` not in the effective schema or not vector type. +15. `indexes..runtime.default_embed_text_field` not in the effective schema. +16. `default_limit <= 0` or `max_limit < default_limit`. +17. `max_upsert_records <= 0`. + +--- + +## Lifecycle and Resource Management + +### Startup Sequence (Normative) + +On server startup: + +1. Load settings and config. +2. Resolve the sole configured index binding from `indexes`. +3. Create or obtain an async Redis client using `server.redis_url`. +4. Validate Redis connectivity by performing a lightweight call (`info` or equivalent search operation). +5. Inspect the existing index named by `indexes..redis_name`, preferably via `AsyncSearchIndex.from_existing(...)` or an equivalent `FT.INFO`-backed flow. +6. Convert the inspected index metadata into an `IndexSchema`. +7. Apply any validated `indexes..schema_overrides` to produce the effective schema. +8. Instantiate `AsyncSearchIndex` from the effective schema. +9. Validate `indexes..search` against the effective schema and current runtime capabilities. +10. Instantiate the configured `indexes..vectorizer`. +11. Validate vectorizer dimensions against the effective vector field dims when available. +12. Register tools (omit upsert in read-only mode). + +If vector field attributes cannot be reconstructed from Redis metadata on the target Redis version, startup must fail with an actionable error unless `indexes..schema_overrides` provides the missing attrs. + +### Shutdown Sequence + +On shutdown, disconnect Redis client owned by `AsyncSearchIndex` and release vectorizer resources if applicable. 
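The `${VAR}` / `${VAR:-default}` substitution rules from the config section above can be implemented with a small regex helper that fails fast on unresolved required variables. This is a sketch, not the shipped implementation; the function name is an assumption.

```python
import os
import re

# Matches ${VAR} and ${VAR:-default}
_ENV_PATTERN = re.compile(
    r"\$\{(?P<name>[A-Za-z_][A-Za-z0-9_]*)(?::-(?P<default>[^}]*))?\}"
)

def substitute_env(value: str, env=None) -> str:
    """Expand ${VAR} and ${VAR:-default}; raise on unset required vars."""
    env = os.environ if env is None else env

    def _replace(match: re.Match) -> str:
        name, default = match.group("name"), match.group("default")
        if name in env:
            return env[name]
        if default is not None:
            return default
        raise ValueError(
            f"config error: required environment variable {name!r} is unset"
        )

    return _ENV_PATTERN.sub(_replace, value)
```

Applied during config loading, this makes a missing `${OPENAI_API_KEY}` a startup-time config error rather than a runtime surprise.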
+ +### Concurrency Guard + +Tool executions are bounded by an async semaphore (`runtime.max_concurrency`). Requests exceeding capacity wait, then may timeout according to `request_timeout_seconds`. + +--- + +## Filter Contract (Normative) + +`search-records.filter` follows RedisVL convention and accepts either: +- `string`: raw RedisVL/Redis Search filter string (passed through to query filter). +- `object`: JSON DSL described below. + +### Operators + +- Logical: `and`, `or`, `not` +- Comparison: `eq`, `ne`, `gt`, `gte`, `lt`, `lte`, `in`, `like` +- Utility: `exists` + +### Atomic Expression Shape + +```json +{ "field": "category", "op": "eq", "value": "science" } +``` + +### Composite Shape + +```json +{ + "and": [ + { "field": "category", "op": "eq", "value": "science" }, + { + "or": [ + { "field": "rating", "op": "gte", "value": 4.5 }, + { "field": "category", "op": "eq", "value": "featured" } + ] + } + ] +} +``` + +### Parsing Rules + +1. Unknown `op` fails with `invalid_filter`. +2. Unknown `field` fails with `invalid_filter`. +3. Type mismatches fail with `invalid_filter`. +4. Empty logical arrays fail with `invalid_filter`. +5. Object DSL parser translates to `redisvl.query.filter.FilterExpression`. +6. String filter is treated as raw filter expression and passed through. + +--- + +## Tools + +## Tool: `search-records` + +Search records using the configured search behavior for the bound index. 
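The JSON filter DSL accepted by `search-records.filter` (described above) can be sketched as a small compiler. This is a toy: the production parser targets `redisvl.query.filter.FilterExpression` objects rather than raw strings, only a few operators are shown, and the `not` shape (a single-child list) is an assumption since the spec leaves it open.

```python
def compile_filter(node, schema: dict) -> str:
    """Toy compiler from the JSON filter DSL to a Redis Search filter string.

    `schema` maps field name -> "tag" | "numeric" | "text"; real field typing
    comes from the effective index schema, not a caller-supplied dict.
    """
    if isinstance(node, str):  # raw filter string: passed through (rule 6)
        return node
    for op in ("and", "or", "not"):
        if op in node:
            children = node[op]
            if not children:
                raise ValueError("invalid_filter: empty logical array")
            parts = [compile_filter(child, schema) for child in children]
            if op == "and":
                return "(" + " ".join(parts) + ")"
            if op == "or":
                return "(" + " | ".join(parts) + ")"
            return "(-" + parts[0] + ")"  # assumed single-child "not"
    field, op, value = node["field"], node["op"], node.get("value")
    if field not in schema:
        raise ValueError(f"invalid_filter: unknown field {field!r}")
    kind = schema[field]
    if op == "eq" and kind == "tag":
        return f"@{field}:{{{value}}}"
    if op == "gte" and kind == "numeric":
        return f"@{field}:[{value} +inf]"
    if op == "lte" and kind == "numeric":
        return f"@{field}:[-inf {value}]"
    raise ValueError(f"invalid_filter: unsupported op {op!r} for field type {kind!r}")
```

Note how every failure mode maps onto the single `invalid_filter` error code, matching parsing rules 1-4.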
+
+### Request Contract
+
+| Parameter | Type | Required | Default | Constraints |
+|----------|------|----------|---------|-------------|
+| `query` | str | yes | - | non-empty |
+| `limit` | int | no | `runtime.default_limit` | `1..runtime.max_limit` |
+| `offset` | int | no | `0` | `>=0` |
+| `filter` | string \| object | no | `null` | Raw RedisVL filter string or DSL object |
+| `return_fields` | list[str] | no | all non-vector fields | Unknown fields rejected |
+
+### Response Contract
+
+```json
+{
+  "search_type": "vector",
+  "offset": 0,
+  "limit": 10,
+  "results": [
+    {
+      "id": "doc:123",
+      "score": 0.93,
+      "score_type": "vector_distance_normalized",
+      "record": {
+        "content": "The document text...",
+        "category": "science"
+      }
+    }
+  ]
+}
+```
+
+### Search Semantics
+
+- `search_type` in the response is informational metadata derived from `indexes..search.type`.
+- `search-records` must reject deprecated client-side search-mode or tuning inputs with `invalid_request`.
+- `vector`: embeds `query` with the configured vectorizer and builds `VectorQuery` using `indexes..search.params`.
+- `fulltext`: builds `TextQuery` using `indexes..search.params`.
+- `hybrid`: embeds `query` and selects the query implementation by runtime capability:
+  - use native `HybridQuery` when Redis `>=8.4.0` and redis-py `>=7.1.0` are available
+  - otherwise fall back to `AggregateHybridQuery`
+- The MCP request/response contract for `hybrid` is identical across both implementation paths because config normalization hides class-specific fusion semantics from tool callers.
+- In v1, `filter` is applied uniformly to the hybrid query rather than allowing separate text-side and vector-side filters. This is intentional to keep the API simple; future versions may expose finer-grained hybrid filtering controls.
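The fusion-weight normalization rules from the search configuration section can be captured in one small helper. This sketch is illustrative; per the normalization rules above, the constructor kwarg is `linear_alpha` for native `HybridQuery` and `alpha` (vector-side weight) for `AggregateHybridQuery`.

```python
def fusion_kwargs(linear_text_weight: float, native_hybrid: bool) -> dict:
    """Translate the config-stable text-side weight into the kwarg each
    hybrid implementation expects, so its meaning never changes."""
    if not 0.0 <= linear_text_weight <= 1.0:
        raise ValueError("linear_text_weight must be in [0, 1]")
    if native_hybrid:
        # Native HybridQuery already uses text-weight semantics.
        return {"linear_alpha": linear_text_weight}
    # AggregateHybridQuery's alpha weights the vector side, so invert.
    return {"alpha": 1.0 - linear_text_weight}
```

Centralizing the translation here is what lets the MCP contract stay identical across the native and aggregate paths.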
+ +### Errors + +| Code | Meaning | Retryable | +|------|---------|-----------| +| `invalid_request` | bad query params | no | +| `invalid_filter` | filter parse/type failure | no | +| `dependency_missing` | missing optional lib/provider SDK | no | +| `backend_unavailable` | Redis unavailable/timeout | yes | +| `internal_error` | unexpected failure | maybe | + +--- + +## Tool: `upsert-records` + +Upsert records with automatic embedding. + +Not registered when read-only mode is enabled. + +### Request Contract + +| Parameter | Type | Required | Default | Constraints | +|----------|------|----------|---------|-------------| +| `records` | list[object] | yes | - | non-empty and `len(records) <= runtime.max_upsert_records` | +| `id_field` | str | no | `null` | if set, must exist in every record | +| `skip_embedding_if_present` | bool | no | `runtime.skip_embedding_if_present` | if false, always re-embed | + +### Response Contract + +```json +{ + "status": "success", + "keys_upserted": 3, + "keys": ["doc:abc123", "doc:def456", "doc:ghi789"] +} +``` + +### Upsert Semantics + +1. Validate input records before writing. +2. Use `runtime.default_embed_text_field` for records that require embedding. +3. Respect `skip_embedding_if_present` (default true): only generate embeddings for records missing configured vector field. +4. Populate configured vector field. +5. Call `AsyncSearchIndex.load`. + +### Error Semantics + +- Validation failures return `invalid_request`. +- Provider errors return `dependency_missing` or `internal_error` with actionable message. +- Redis write failures return `backend_unavailable`. +- On write failure, response must include `partial_write_possible: true` (conservative signal). + +--- + +## Server Implementation + +### Core Class Contract + +```python +class RedisVLMCPServer(FastMCP): + settings: MCPSettings + config: MCPConfig + + async def startup(self) -> None: ... + async def shutdown(self) -> None: ... 
+ + async def get_index(self) -> AsyncSearchIndex: ... + async def get_vectorizer(self): ... +``` + +Tool implementations must always call `await server.get_index()` and `await server.get_vectorizer()`; never read uninitialized attributes directly. + +### Field Mapping Requirements + +For the sole configured binding in v1, the server owns these validated values: +- `text_field_name` +- `vector_field_name` +- `default_embed_text_field` +- `search.type` +- `search.params` + +Schema discovery is automatic in v1. Field mapping is not. Search construction is configuration-owned. Runtime field mappings remain explicit so the server does not guess among multiple valid text or vector fields, and MCP callers do not choose retrieval mode or tuning. + +--- + +## CLI Integration + +Current RedisVL CLI is command-dispatch based (not argparse subparsers), so MCP integration must follow existing pattern. + +### User Commands + +```bash +rvl mcp --config path/to/mcp_config.yaml +rvl mcp --config path/to/mcp_config.yaml --read-only +``` + +### Required CLI Changes + +1. Add `mcp` command to usage/help in `redisvl/cli/main.py`. +2. Add `RedisVlCLI.mcp()` method that dispatches to new `MCP` handler class. +3. Implement `redisvl/cli/mcp.py` similar to existing command modules. +4. Gracefully report missing optional deps (`pip install redisvl[mcp]`). +5. Clearly report when the current Python runtime is unsupported for the MCP extra. + +--- + +## Client Configuration Examples + +### Claude Desktop + +```json +{ + "mcpServers": { + "redisvl": { + "command": "uvx", + "args": ["--from", "redisvl[mcp]", "rvl", "mcp", "--config", "/path/to/mcp_config.yaml"], + "env": { + "OPENAI_API_KEY": "sk-..." 
+      }
+    }
+  }
+}
+```
+
+### OpenAI Agents SDK (Python)
+
+```python
+from agents import Agent
+from agents.mcp import MCPServerStdio
+
+async def main():
+    async with MCPServerStdio(
+        params={
+            "command": "uvx",
+            "args": ["--from", "redisvl[mcp]", "rvl", "mcp", "--config", "mcp_config.yaml"],
+        },
+    ) as server:
+        agent = Agent(
+            name="search-agent",
+            instructions="Search and maintain Redis-backed knowledge using the server-configured retrieval strategy.",
+            mcp_servers=[server],
+        )
+```
+
+### Google ADK (Python)
+
+```python
+from google.adk.agents import LlmAgent
+from google.adk.tools.mcp_tool import McpToolset
+from google.adk.tools.mcp_tool.mcp_session_manager import StdioConnectionParams
+from mcp import StdioServerParameters
+
+root_agent = LlmAgent(
+    model="gemini-2.0-flash",
+    name="redis_search_agent",
+    instruction="Search and maintain Redis-backed knowledge using the server-configured retrieval strategy.",
+    tools=[
+        McpToolset(
+            connection_params=StdioConnectionParams(
+                server_params=StdioServerParameters(
+                    command="uvx",
+                    args=["--from", "redisvl[mcp]", "rvl", "mcp", "--config", "/path/to/mcp_config.yaml"],
+                    env={
+                        "OPENAI_API_KEY": "sk-..."  # Or other vectorizer API key
+                    }
+                ),
+            ),
+            # Optional: filter to specific tools
+            # tool_filter=["search-records"]
+        )
+    ],
+)
+```
+
+### n8n
+
+n8n can consume external MCP servers via its MCP Client Tool node. Configure the RedisVL MCP server as an external MCP tool source:
+
+1. **Using SSE transport** (if supported in future versions):
+   ```json
+   {
+     "mcpServers": {
+       "redisvl": {
+         "url": "http://localhost:9000/sse"
+       }
+     }
+   }
+   ```
+
+2. **Using stdio transport** (via n8n's Execute Command node as a workaround):
+   Configure a workflow that spawns the MCP server process:
+   ```bash
+   uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp_config.yaml
+   ```
+
+Note: Full n8n MCP client support depends on n8n's MCP implementation.
Refer to [n8n MCP documentation](https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/) for current capabilities. + +--- + +## Observability and Security + +### Logging + +- Use structured logs with operation name, latency, and error code. +- Never log secrets (API keys, auth headers, full DSNs with credentials). +- Log config path but not raw config values for sensitive keys. + +### Timeouts + +- Startup timeout: `runtime.startup_timeout_seconds` +- Tool request timeout: `runtime.request_timeout_seconds` + +### Secret Handling + +- Support env-injected secrets via `${VAR}` substitution. +- Fail fast for required missing vars. + +--- + +## Testing Strategy + +### Unit Tests (`tests/unit/test_mcp/`) + +- `test_settings.py` + - env parsing and overrides + - read-only behavior +- `test_config.py` + - YAML validation + - env substitution success/failure + - schema inspection merge and override validation + - field mapping validation + - `indexes..search` validation by type + - normalized hybrid fusion validation +- `test_filters.py` + - DSL parsing, invalid operators, type mismatches +- `test_errors.py` + - internal exception -> MCP error code mapping + +### Integration Tests (`tests/integration/test_mcp/`) + +- `test_server_startup.py` + - startup success path against the sole configured index binding + - missing index failure + - vector field inspection gap resolved by `indexes..schema_overrides` + - conflicting override failure + - hybrid config with FT.SEARCH-only params rejected when only aggregate fallback is available +- `test_search_tool.py` + - configured `vector` / `fulltext` / `hybrid` success paths + - request without `search_type` succeeds + - deprecated client-side search-mode or tuning params rejected with `invalid_request` + - response reports configured `search_type` + - native hybrid path on Redis `>=8.4.0` + - aggregate hybrid fallback path on older supported runtimes when config is compatible + - pagination and field 
projection + - filter behavior +- `test_upsert_tool.py` + - insert/update success + - id_field validation failures + - read-only mode excludes tool + +### Deterministic Verification Commands + +```bash +uv run python -m pytest tests/unit/test_mcp -q +uv run python -m pytest tests/integration/test_mcp -q +``` + +--- + +## Implementation Plan and DoD + +### Phase 1: Framework + +Deliverables: +1. `redisvl/mcp/` scaffolding. +2. Config/settings models with strict validation. +3. Inspection-first startup/shutdown lifecycle. +4. Error mapping helpers. + +DoD: +1. Server boots successfully with valid config against the sole configured index binding. +2. Server fails fast with actionable config errors. +3. Unit tests for config/settings pass. + +### Phase 2: Search Tool + +Deliverables: +1. `search-records` request/response contract. +2. Filter parser (JSON DSL + raw string pass-through). +3. Config-owned search construction and hybrid query selection between native and aggregate implementations. + +DoD: +1. All search modes tested. +2. Invalid filter returns `invalid_filter`. +3. Deprecated client-side search-mode and tuning inputs return `invalid_request`. +4. `hybrid` uses native execution when available and `AggregateHybridQuery` otherwise, without changing the MCP contract or the meaning of `linear_text_weight`. + +### Phase 3: Upsert Tool + +Deliverables: +1. `upsert-records` implementation. +2. Record pre-validation. +3. Read-only exclusion. + +DoD: +1. Upsert works end-to-end. +2. Invalid records fail before writes. +3. Read-only mode verified. + +### Phase 4: CLI and Packaging + +Deliverables: +1. `rvl mcp` command via current CLI pattern. +2. Optional dependency group updates. +3. User-facing error messages for missing extras and unsupported Python runtime. + +DoD: +1. `uvx --from redisvl[mcp] rvl mcp --config ...` runs successfully. +2. CLI help includes `mcp` command. + +### Phase 5: Documentation + +Deliverables: +1. Config reference and examples. +2. 
Client setup examples. +3. Troubleshooting guide with common errors and fixes. + +DoD: +1. Docs reflect normative contracts in this spec. +2. Client-facing examples do not imply MCP callers choose retrieval mode. + +--- + +## Risks and Mitigations + +1. Runtime mismatch for hybrid search. + - Native hybrid requires newer Redis and redis-py capabilities, while older supported environments may still need the aggregate fallback path. + - Mitigation: explicitly detect runtime capability, reject incompatible hybrid configs at startup, and otherwise select native `HybridQuery` or `AggregateHybridQuery` deterministically. +2. Dependency drift across provider vectorizers. + - Mitigation: dependency matrix and startup validation. +3. Search behavior drift caused by client-selected tuning. + - Mitigation: keep search mode and query construction params in config, not in the MCP request surface. +4. Hidden partial writes during failures. + - Mitigation: conservative `partial_write_possible` signaling. +5. Incomplete schema reconstruction on older Redis versions. + - `FT.INFO` may not return enough vector metadata on some older Redis versions to fully reconstruct vector field attrs. + - Mitigation: fail fast with an actionable error and support targeted `indexes..schema_overrides` for missing attrs. +6. Hybrid fusion semantics differ between `HybridQuery` and `AggregateHybridQuery`. + - Native `HybridQuery` uses text-weight semantics while `AggregateHybridQuery` exposes vector-weight semantics. + - Mitigation: normalize on `linear_text_weight` in MCP config and translate internally per execution path. +7. Security and deployment limitations (v1 scope). + - This implementation is stdio-first and not production-hardened by itself. It does not include: + - Authentication/authorization mechanisms. + - Remote transports (SSE/HTTP) that would enable multi-tenant or networked deployments. + - Rate limiting or request validation beyond basic input constraints. 
+ - Mitigation: Document clearly that v1 can be used against Redis Enterprise, Redis Cloud, or OSS Redis deployments, but production use requires the operator to supply the surrounding controls for auth, process isolation, and network boundaries. Users wanting built-in remote transport and auth should wait for future RedisVL MCP versions. + - For production deployments requiring authentication, users can: + - Deploy behind an authenticating proxy. + - Use environment-based secrets for Redis and vectorizer credentials. + - Restrict network access to the MCP server process. diff --git a/tests/integration/test_mcp/test_search_tool.py b/tests/integration/test_mcp/test_search_tool.py new file mode 100644 index 00000000..a5eaf8f3 --- /dev/null +++ b/tests/integration/test_mcp/test_search_tool.py @@ -0,0 +1,307 @@ +from pathlib import Path + +import pytest +import yaml + +from redisvl.index import AsyncSearchIndex +from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError +from redisvl.mcp.server import RedisVLMCPServer +from redisvl.mcp.settings import MCPSettings +from redisvl.mcp.tools.search import search_records +from redisvl.redis.connection import is_version_gte +from redisvl.redis.utils import array_to_buffer +from redisvl.schema import IndexSchema +from tests.conftest import get_redis_version_async, skip_if_redis_version_below_async + + +class FakeVectorizer: + def __init__(self, model: str, dims: int = 3, **kwargs): + self.model = model + self.dims = dims + self.kwargs = kwargs + + def embed(self, content: str = "", **kwargs): + del content, kwargs + return [0.1, 0.1, 0.5] + + +@pytest.fixture +async def searchable_index(async_client, worker_id): + schema = IndexSchema.from_dict( + { + "index": { + "name": f"mcp-search-{worker_id}", + "prefix": f"mcp-search:{worker_id}", + "storage_type": "hash", + }, + "fields": [ + {"name": "content", "type": "text"}, + {"name": "category", "type": "tag"}, + {"name": "rating", "type": "numeric"}, + { + "name": "embedding", 
+ "type": "vector", + "attrs": { + "algorithm": "flat", + "dims": 3, + "distance_metric": "cosine", + "datatype": "float32", + }, + }, + ], + } + ) + index = AsyncSearchIndex(schema=schema, redis_client=async_client) + await index.create(overwrite=True, drop=True) + + def preprocess(record: dict) -> dict: + return { + **record, + "embedding": array_to_buffer(record["embedding"], "float32"), + } + + await index.load( + [ + { + "id": f"doc:{worker_id}:1", + "content": "science article about planets", + "category": "science", + "rating": 5, + "embedding": [0.1, 0.1, 0.5], + }, + { + "id": f"doc:{worker_id}:2", + "content": "medical science and health", + "category": "health", + "rating": 4, + "embedding": [0.1, 0.1, 0.4], + }, + { + "id": f"doc:{worker_id}:3", + "content": "sports update and scores", + "category": "sports", + "rating": 3, + "embedding": [-0.2, 0.1, 0.0], + }, + ], + preprocess=preprocess, + ) + + yield index + + await index.delete(drop=True) + + +@pytest.fixture +def mcp_config_path(tmp_path: Path, redis_url: str): + def factory(redis_name: str, search: dict) -> str: + config = { + "server": {"redis_url": redis_url}, + "indexes": { + "knowledge": { + "redis_name": redis_name, + "vectorizer": { + "class": "FakeVectorizer", + "model": "fake-model", + "dims": 3, + }, + "search": search, + "runtime": { + "text_field_name": "content", + "vector_field_name": "embedding", + "default_embed_text_field": "content", + "default_limit": 2, + "max_limit": 5, + }, + } + }, + } + config_path = tmp_path / f"{redis_name}-{search['type']}.yaml" + config_path.write_text(yaml.safe_dump(config), encoding="utf-8") + return str(config_path) + + return factory + + +@pytest.fixture +async def started_server(monkeypatch, searchable_index, mcp_config_path): + monkeypatch.setattr( + "redisvl.mcp.server.resolve_vectorizer_class", + lambda class_name: FakeVectorizer, + ) + + async def factory(search: dict) -> RedisVLMCPServer: + server = RedisVLMCPServer( + MCPSettings( + 
config=mcp_config_path(searchable_index.schema.index.name, search) + ) + ) + await server.startup() + return server + + servers = [] + + async def started(search: dict) -> RedisVLMCPServer: + server = await factory(search) + servers.append(server) + return server + + yield started + + for server in servers: + await server.shutdown() + + +@pytest.mark.asyncio +async def test_search_records_vector_success_with_pagination_and_projection( + started_server, +): + server = await started_server( + { + "type": "vector", + "params": {"normalize_vector_distance": True}, + } + ) + + response = await search_records( + server, + query="science", + limit=1, + offset=1, + return_fields=["content", "category"], + ) + + assert response["search_type"] == "vector" + assert response["offset"] == 1 + assert response["limit"] == 1 + assert len(response["results"]) == 1 + assert response["results"][0]["score_type"] == "vector_distance_normalized" + assert set(response["results"][0]["record"]) == {"content", "category"} + + +@pytest.mark.asyncio +async def test_search_records_fulltext_success(started_server): + server = await started_server( + { + "type": "fulltext", + "params": { + "text_scorer": "BM25STD.NORM", + "stopwords": None, + }, + } + ) + + response = await search_records( + server, + query="science", + return_fields=["content", "category"], + ) + + assert response["search_type"] == "fulltext" + assert response["results"] + assert response["results"][0]["score_type"] == "text_score" + assert response["results"][0]["score"] is not None + assert "science" in response["results"][0]["record"]["content"] + + +@pytest.mark.asyncio +async def test_search_records_respects_raw_string_filter(started_server): + server = await started_server({"type": "vector"}) + + response = await search_records( + server, + query="science", + filter="@category:{science}", + return_fields=["content", "category"], + ) + + assert response["results"] + assert all( + result["record"]["category"] == "science" 
for result in response["results"] + ) + + +@pytest.mark.asyncio +async def test_search_records_respects_dsl_filter(started_server): + server = await started_server({"type": "vector"}) + + response = await search_records( + server, + query="science", + filter={"field": "rating", "op": "gte", "value": 4.5}, + return_fields=["content", "category", "rating"], + ) + + assert response["results"] + assert all( + float(result["record"]["rating"]) >= 4.5 for result in response["results"] + ) + + +@pytest.mark.asyncio +async def test_search_records_invalid_filter_returns_invalid_filter(started_server): + server = await started_server({"type": "vector"}) + + with pytest.raises(RedisVLMCPError) as exc_info: + await search_records( + server, + query="science", + filter={"field": "missing", "op": "eq", "value": "science"}, + ) + + assert exc_info.value.code == MCPErrorCode.INVALID_FILTER + + +@pytest.mark.asyncio +async def test_search_records_native_hybrid_success(started_server, async_client): + await skip_if_redis_version_below_async(async_client, "8.4.0") + server = await started_server( + { + "type": "hybrid", + "params": { + "combination_method": "LINEAR", + "linear_text_weight": 0.3, + "stopwords": None, + }, + } + ) + + response = await search_records( + server, + query="science", + return_fields=["content", "category"], + ) + + assert response["search_type"] == "hybrid" + assert response["results"] + assert response["results"][0]["score_type"] == "hybrid_score" + assert response["results"][0]["score"] is not None + + +@pytest.mark.asyncio +async def test_search_records_fallback_hybrid_success(started_server, async_client): + redis_version = await get_redis_version_async(async_client) + if is_version_gte(redis_version, "8.4.0"): + pytest.skip(f"Redis version {redis_version} uses native hybrid search") + + server = await started_server( + { + "type": "hybrid", + "params": { + "combination_method": "LINEAR", + "linear_text_weight": 0.3, + "stopwords": None, + }, + } + ) + 
+ response = await search_records( + server, + query="science", + return_fields=["content", "category"], + ) + + assert response["search_type"] == "hybrid" + assert response["results"] + assert response["results"][0]["score_type"] == "hybrid_score" + assert response["results"][0]["score"] is not None diff --git a/tests/integration/test_mcp/test_server_startup.py b/tests/integration/test_mcp/test_server_startup.py new file mode 100644 index 00000000..953aa6df --- /dev/null +++ b/tests/integration/test_mcp/test_server_startup.py @@ -0,0 +1,390 @@ +from pathlib import Path +from typing import Optional + +import pytest +import yaml + +from redisvl.index import AsyncSearchIndex +from redisvl.mcp.server import RedisVLMCPServer +from redisvl.mcp.settings import MCPSettings +from redisvl.redis.connection import is_version_gte +from redisvl.schema import IndexSchema +from tests.conftest import get_redis_version_async + + +class FakeVectorizer: + def __init__(self, model: str, dims: int = 3, **kwargs): + self.model = model + self.dims = dims + self.kwargs = kwargs + + +class FailingAsyncCloseVectorizer(FakeVectorizer): + async def aclose(self): + raise RuntimeError("vectorizer close failed") + + +@pytest.fixture +async def existing_index(async_client, worker_id): + created_indexes = [] + + async def factory( + *, + index_name: str, + storage_type: str = "hash", + vector_path: Optional[str] = None, + ) -> AsyncSearchIndex: + fields = [{"name": "content", "type": "text"}] + vector_field = { + "name": "embedding", + "type": "vector", + "attrs": { + "algorithm": "flat", + "dims": 3, + "distance_metric": "cosine", + "datatype": "float32", + }, + } + if storage_type == "json": + fields[0]["path"] = "$.content" + vector_field["path"] = vector_path or "$.embedding" + + fields.append(vector_field) + schema = IndexSchema.from_dict( + { + "index": { + "name": f"{index_name}-{worker_id}", + "prefix": f"{index_name}:{worker_id}", + "storage_type": storage_type, + }, + "fields": fields, + 
+            }
+        )
+        index = AsyncSearchIndex(schema=schema, redis_client=async_client)
+        await index.create(overwrite=True, drop=True)
+        created_indexes.append(index)
+        return index
+
+    yield factory
+
+    for index in created_indexes:
+        try:
+            await index.delete(drop=True)
+        except Exception:
+            pass
+
+
+@pytest.fixture
+def mcp_config_path(tmp_path: Path, redis_url: str):
+    def factory(
+        *,
+        redis_name: str,
+        vector_dims: int = 3,
+        schema_overrides: Optional[dict] = None,
+        runtime_overrides: Optional[dict] = None,
+        search: Optional[dict] = None,
+    ) -> str:
+        runtime = {
+            "text_field_name": "content",
+            "vector_field_name": "embedding",
+            "default_embed_text_field": "content",
+        }
+        if runtime_overrides:
+            runtime.update(runtime_overrides)
+
+        config = {
+            "server": {"redis_url": redis_url},
+            "indexes": {
+                "knowledge": {
+                    "redis_name": redis_name,
+                    "vectorizer": {
+                        "class": "FakeVectorizer",
+                        "model": "fake-model",
+                        "dims": vector_dims,
+                    },
+                    "search": search or {"type": "vector"},
+                    "runtime": runtime,
+                }
+            },
+        }
+        if schema_overrides is not None:
+            config["indexes"]["knowledge"]["schema_overrides"] = schema_overrides
+
+        config_path = tmp_path / f"{redis_name}.yaml"
+        config_path.write_text(yaml.safe_dump(config), encoding="utf-8")
+        return str(config_path)
+
+    return factory
+
+
+@pytest.mark.asyncio
+async def test_server_startup_success(monkeypatch, existing_index, mcp_config_path):
+    index = await existing_index(index_name="mcp-startup")
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FakeVectorizer,
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(config=mcp_config_path(redis_name=index.name))
+    )
+
+    await server.startup()
+
+    started_index = await server.get_index()
+    vectorizer = await server.get_vectorizer()
+
+    assert await started_index.exists() is True
+    assert started_index.schema.index.name == index.name
+    assert vectorizer.dims == 3
+
+    await server.shutdown()
+
+
+@pytest.mark.asyncio
+async def test_server_fails_when_hybrid_config_requires_native_runtime(
+    monkeypatch, existing_index, mcp_config_path, async_client
+):
+    redis_version = await get_redis_version_async(async_client)
+    if is_version_gte(redis_version, "8.4.0"):
+        pytest.skip(f"Redis version {redis_version} supports native hybrid search")
+
+    index = await existing_index(index_name="mcp-native-required")
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FakeVectorizer,
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(
+            config=mcp_config_path(
+                redis_name=index.name,
+                search={
+                    "type": "hybrid",
+                    "params": {
+                        "vector_search_method": "KNN",
+                        "knn_ef_runtime": 150,
+                    },
+                },
+            )
+        )
+    )
+
+    with pytest.raises(ValueError, match="knn_ef_runtime"):
+        await server.startup()
+
+
+@pytest.mark.asyncio
+async def test_server_fails_when_configured_index_is_missing(
+    monkeypatch, mcp_config_path, worker_id
+):
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FakeVectorizer,
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(config=mcp_config_path(redis_name=f"missing-{worker_id}"))
+    )
+
+    with pytest.raises(ValueError, match="does not exist"):
+        await server.startup()
+
+
+@pytest.mark.asyncio
+async def test_server_uses_schema_overrides_when_inspection_is_incomplete(
+    monkeypatch, existing_index, mcp_config_path
+):
+    index = await existing_index(index_name="mcp-overrides")
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FakeVectorizer,
+    )
+    original_info = AsyncSearchIndex._info
+
+    async def incomplete_info(name, redis_client):
+        info = await original_info(name, redis_client)
+        for field in info["attributes"]:
+            if "VECTOR" in field:
+                del field[6:]
+        return info
+
+    monkeypatch.setattr(
+        "redisvl.mcp.server.AsyncSearchIndex._info",
+        staticmethod(incomplete_info),
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(
+            config=mcp_config_path(
+                redis_name=index.name,
+                schema_overrides={
+                    "fields": [
+                        {
+                            "name": "embedding",
+                            "type": "vector",
+                            "attrs": {
+                                "algorithm": "flat",
+                                "dims": 3,
+                                "datatype": "float32",
+                                "distance_metric": "cosine",
+                            },
+                        }
+                    ]
+                },
+            )
+        )
+    )
+
+    await server.startup()
+
+    started_index = await server.get_index()
+    assert started_index.schema.fields["embedding"].attrs.dims == 3
+
+    await server.shutdown()
+
+
+@pytest.mark.asyncio
+async def test_server_fails_on_conflicting_schema_override(
+    monkeypatch, existing_index, mcp_config_path
+):
+    index = await existing_index(
+        index_name="mcp-conflict",
+        storage_type="json",
+        vector_path="$.embedding",
+    )
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FakeVectorizer,
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(
+            config=mcp_config_path(
+                redis_name=index.name,
+                schema_overrides={
+                    "fields": [
+                        {
+                            "name": "embedding",
+                            "type": "vector",
+                            "path": "$.other_embedding",
+                        }
+                    ]
+                },
+            )
+        )
+    )
+
+    with pytest.raises(ValueError, match="cannot change discovered field path"):
+        await server.startup()
+
+
+@pytest.mark.asyncio
+async def test_server_fails_fast_on_vector_dimension_mismatch(
+    monkeypatch, existing_index, mcp_config_path
+):
+    index = await existing_index(index_name="mcp-dims")
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FakeVectorizer,
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(config=mcp_config_path(redis_name=index.name, vector_dims=8))
+    )
+
+    with pytest.raises(ValueError, match="Vectorizer dims"):
+        await server.startup()
+
+
+@pytest.mark.asyncio
+async def test_server_startup_failure_disconnects_index(
+    monkeypatch, existing_index, mcp_config_path
+):
+    index = await existing_index(index_name="mcp-startup-failure")
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FakeVectorizer,
+    )
+    original_disconnect = AsyncSearchIndex.disconnect
+    disconnect_called = False
+
+    async def tracked_disconnect(self):
+        nonlocal disconnect_called
+        disconnect_called = True
+        await original_disconnect(self)
+
+    monkeypatch.setattr(
+        "redisvl.mcp.server.AsyncSearchIndex.disconnect",
+        tracked_disconnect,
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(config=mcp_config_path(redis_name=index.name, vector_dims=8))
+    )
+
+    with pytest.raises(ValueError, match="Vectorizer dims"):
+        await server.startup()
+
+    assert disconnect_called is True
+
+
+@pytest.mark.asyncio
+async def test_server_shutdown_disconnects_owned_client(
+    monkeypatch, existing_index, mcp_config_path
+):
+    index = await existing_index(index_name="mcp-shutdown")
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FakeVectorizer,
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(config=mcp_config_path(redis_name=index.name))
+    )
+
+    await server.startup()
+    started_index = await server.get_index()
+
+    assert started_index.client is not None
+
+    await server.shutdown()
+
+    assert started_index.client is None
+
+
+@pytest.mark.asyncio
+async def test_server_get_index_fails_after_shutdown(
+    monkeypatch, existing_index, mcp_config_path
+):
+    index = await existing_index(index_name="mcp-get-index-after-shutdown")
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FakeVectorizer,
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(config=mcp_config_path(redis_name=index.name))
+    )
+
+    await server.startup()
+    await server.shutdown()
+
+    with pytest.raises(RuntimeError, match="has not been started"):
+        await server.get_index()
+
+
+@pytest.mark.asyncio
+async def test_server_shutdown_disconnects_index_when_vectorizer_close_fails(
+    monkeypatch, existing_index, mcp_config_path
+):
+    index = await existing_index(index_name="mcp-shutdown-failure")
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: FailingAsyncCloseVectorizer,
+    )
+    server = RedisVLMCPServer(
+        MCPSettings(config=mcp_config_path(redis_name=index.name))
+    )
+
+    await server.startup()
+    started_index = await server.get_index()
+
+    with pytest.raises(RuntimeError, match="vectorizer close failed"):
+        await server.shutdown()
+
+    assert started_index.client is None
+
+    with pytest.raises(RuntimeError, match="has not been started"):
+        await server.get_vectorizer()
diff --git a/tests/integration/test_mcp/test_upsert_tool.py b/tests/integration/test_mcp/test_upsert_tool.py
new file mode 100644
index 00000000..819a0584
--- /dev/null
+++ b/tests/integration/test_mcp/test_upsert_tool.py
@@ -0,0 +1,340 @@
+from pathlib import Path
+from typing import Any, Dict, List, Optional
+
+import pytest
+import yaml
+
+from redisvl.index import AsyncSearchIndex
+from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError
+from redisvl.mcp.server import RedisVLMCPServer
+from redisvl.mcp.settings import MCPSettings
+from redisvl.mcp.tools.upsert import upsert_records
+from redisvl.schema import IndexSchema
+
+
+class RecordingVectorizer:
+    def __init__(self, model: str, dims: int = 3, **kwargs: Any) -> None:
+        self.model = model
+        self.dims = dims
+        self.kwargs = kwargs
+        self.aembed_many_inputs: List[List[str]] = []
+        self.embed_many_inputs: List[List[str]] = []
+        self.aembed_inputs: List[str] = []
+        self.embed_inputs: List[str] = []
+
+    @staticmethod
+    def _vector_for(text: str) -> List[float]:
+        base = float(len(text))
+        return [base, base + 0.1, base + 0.2]
+
+    async def aembed(self, content: str = "", **kwargs: Any) -> List[float]:
+        del kwargs
+        self.aembed_inputs.append(content)
+        return self._vector_for(content)
+
+    def embed(self, content: str = "", **kwargs: Any) -> List[float]:
+        del kwargs
+        self.embed_inputs.append(content)
+        return self._vector_for(content)
+
+    async def aembed_many(
+        self,
+        contents: Optional[List[str]] = None,
+        texts: Optional[List[str]] = None,
+        **kwargs: Any,
+    ) -> List[List[float]]:
+        del kwargs
+        items = contents or texts or []
+        self.aembed_many_inputs.append(list(items))
+        return [self._vector_for(text) for text in items]
+
+    def embed_many(
+        self,
+        contents: Optional[List[str]] = None,
+        texts: Optional[List[str]] = None,
+        **kwargs: Any,
+    ) -> List[List[float]]:
+        del kwargs
+        items = contents or texts or []
+        self.embed_many_inputs.append(list(items))
+        return [self._vector_for(text) for text in items]
+
+
+@pytest.fixture
+async def upsertable_index(async_client, worker_id):
+    schema = IndexSchema.from_dict(
+        {
+            "index": {
+                "name": f"mcp-upsert-{worker_id}",
+                "prefix": f"mcp-upsert:{worker_id}",
+                "storage_type": "hash",
+            },
+            "fields": [
+                {"name": "content", "type": "text"},
+                {"name": "category", "type": "tag"},
+                {"name": "rating", "type": "numeric"},
+                {
+                    "name": "embedding",
+                    "type": "vector",
+                    "attrs": {
+                        "algorithm": "flat",
+                        "dims": 3,
+                        "distance_metric": "cosine",
+                        "datatype": "float32",
+                    },
+                },
+            ],
+        }
+    )
+    index = AsyncSearchIndex(schema=schema, redis_client=async_client)
+    await index.create(overwrite=True, drop=True)
+
+    yield index
+
+    await index.delete(drop=True)
+
+
+@pytest.fixture
+def mcp_config_path(tmp_path: Path, redis_url: str):
+    def factory(
+        *,
+        redis_name: str,
+        read_only: bool = False,
+        runtime_overrides: Optional[Dict[str, Any]] = None,
+    ) -> str:
+        runtime = {
+            "text_field_name": "content",
+            "vector_field_name": "embedding",
+            "default_embed_text_field": "content",
+            "default_limit": 2,
+            "max_limit": 5,
+            "max_upsert_records": 64,
+            "skip_embedding_if_present": True,
+        }
+        if runtime_overrides:
+            runtime.update(runtime_overrides)
+
+        config = {
+            "server": {"redis_url": redis_url},
+            "indexes": {
+                "knowledge": {
+                    "redis_name": redis_name,
+                    "vectorizer": {
+                        "class": "RecordingVectorizer",
+                        "model": "fake-model",
+                        "dims": 3,
+                    },
+                    "search": {"type": "vector"},
+                    "runtime": runtime,
+                }
+            },
+        }
+        config_path = tmp_path / (
+            f"{redis_name}-{'readonly' if read_only else 'readwrite'}.yaml"
+        )
+        config_path.write_text(yaml.safe_dump(config), encoding="utf-8")
+        return str(config_path)
+
+    return factory
+
+
+@pytest.fixture
+async def started_server(monkeypatch, upsertable_index, mcp_config_path):
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: RecordingVectorizer,
+    )
+
+    servers: List[RedisVLMCPServer] = []
+
+    async def factory(
+        *,
+        read_only: bool = False,
+        runtime_overrides: Optional[Dict[str, Any]] = None,
+    ) -> RedisVLMCPServer:
+        server = RedisVLMCPServer(
+            MCPSettings(
+                config=mcp_config_path(
+                    redis_name=upsertable_index.schema.index.name,
+                    read_only=read_only,
+                    runtime_overrides=runtime_overrides,
+                )
+            )
+        )
+        await server.startup()
+        servers.append(server)
+        return server
+
+    yield factory
+
+    for server in servers:
+        await server.shutdown()
+
+
+def _record_id_from_key(key: str) -> str:
+    return key.rsplit(":", 1)[-1]
+
+
+@pytest.mark.asyncio
+async def test_upsert_records_inserts_rows_into_hash_index(
+    started_server, upsertable_index
+):
+    server = await started_server()
+
+    records = [
+        {"content": "first upserted document", "category": "science", "rating": 5},
+        {"content": "second upserted document", "category": "health", "rating": 4},
+    ]
+
+    response = await upsert_records(server, records=records)
+
+    assert response["status"] == "success"
+    assert response["keys_upserted"] == 2
+    assert len(response["keys"]) == 2
+
+    vectorizer = await server.get_vectorizer()
+    assert vectorizer.aembed_many_inputs == [
+        ["first upserted document", "second upserted document"]
+    ]
+
+    stored = await upsertable_index.fetch(_record_id_from_key(response["keys"][0]))
+    assert stored is not None
+    assert stored["content"] == "first upserted document"
+    assert stored["category"] == "science"
+
+
+@pytest.mark.asyncio
+async def test_upsert_records_updates_existing_row_with_id_field(
+    started_server, upsertable_index
+):
+    server = await started_server()
+
+    first_response = await upsert_records(
+        server,
+        records=[
+            {
+                "doc_id": "doc-1",
+                "content": "original content",
+                "category": "science",
+                "rating": 3,
+            }
+        ],
+        id_field="doc_id",
+    )
+
+    second_response = await upsert_records(
+        server,
+        records=[
+            {
+                "doc_id": "doc-1",
+                "content": "updated content",
+                "category": "engineering",
+                "rating": 5,
+            }
+        ],
+        id_field="doc_id",
+    )
+
+    assert first_response["keys"] == second_response["keys"]
+    assert second_response["keys_upserted"] == 1
+
+    stored = await upsertable_index.fetch(
+        _record_id_from_key(second_response["keys"][0])
+    )
+    assert stored is not None
+    assert stored["content"] == "updated content"
+    assert stored["category"] == "engineering"
+    assert int(stored["rating"]) == 5
+
+
+@pytest.mark.asyncio
+async def test_upsert_records_rejects_invalid_records_before_write(
+    monkeypatch, started_server
+):
+    server = await started_server()
+
+    called = False
+
+    async def fail_load(*args: Any, **kwargs: Any) -> Any:
+        del args, kwargs
+        nonlocal called
+        called = True
+        raise AssertionError("load should not be called for invalid records")
+
+    monkeypatch.setattr(
+        "redisvl.index.index.AsyncSearchIndex.load",
+        fail_load,
+    )
+
+    with pytest.raises(RedisVLMCPError) as exc_info:
+        await upsert_records(
+            server,
+            records=[{"category": "science"}],
+        )
+
+    assert exc_info.value.code == MCPErrorCode.INVALID_REQUEST
+    assert called is False
+
+
+@pytest.mark.asyncio
+async def test_read_only_mode_excludes_upsert_tool(
+    monkeypatch, upsertable_index, mcp_config_path
+):
+    monkeypatch.setattr(
+        "redisvl.mcp.server.resolve_vectorizer_class",
+        lambda class_name: RecordingVectorizer,
+    )
+    monkeypatch.setattr(
+        "redisvl.mcp.server.register_search_tool",
+        lambda server: None,
+    )
+
+    def fake_tool(*args: Any, **kwargs: Any):
+        del args, kwargs
+
+        def decorator(func: Any) -> Any:
+            return func
+
+        return decorator
+
+    monkeypatch.setattr(RedisVLMCPServer, "tool", fake_tool, raising=False)
+
+    called: List[bool] = []
+
+    def fake_register_upsert_tool(server: Any) -> None:
+        called.append(server.mcp_settings.read_only)
+
+    monkeypatch.setattr(
+        "redisvl.mcp.server.register_upsert_tool",
+        fake_register_upsert_tool,
+        raising=False,
+    )
+
+    writeable_server = RedisVLMCPServer(
+        MCPSettings(
+            config=mcp_config_path(
+                redis_name=upsertable_index.schema.index.name,
+            )
+        )
+    )
+    await writeable_server.startup()
+    try:
+        assert called == [False]
+    finally:
+        await writeable_server.shutdown()
+
+    read_only_server = RedisVLMCPServer(
+        MCPSettings(
+            config=mcp_config_path(
+                redis_name=upsertable_index.schema.index.name,
+                read_only=True,
+            ),
+            read_only=True,
+        )
+    )
+
+    await read_only_server.startup()
+    try:
+        assert called == [False]
+    finally:
+        await read_only_server.shutdown()
diff --git a/tests/test_imports.py b/tests/test_imports.py
index 4e3aa9b7..b8a0a90a 100644
--- a/tests/test_imports.py
+++ b/tests/test_imports.py
@@ -16,6 +16,10 @@
 import traceback
 from typing import Iterable
 
+# The MCP package depends on optional MCP extras, so import-sanity runs without
+# those extras should skip it rather than fail noisily.
+EXCLUDED_MODULE_PREFIXES = ("redisvl.mcp",)
+
 
 def iter_modules(package_name: str) -> Iterable[str]:
     """Iterate over all modules in a package, including subpackages."""
@@ -34,6 +38,9 @@ def sanity_check_imports(package_name: str) -> int:
     failures = []
 
     for fullname in iter_modules(package_name):
+        if fullname.startswith(EXCLUDED_MODULE_PREFIXES):
+            print(f"[SKIP] {fullname}")
+            continue
         try:
             importlib.import_module(fullname)
             print(f"[ OK ] {fullname}")
diff --git a/tests/unit/test_cli_mcp.py b/tests/unit/test_cli_mcp.py
new file mode 100644
index 00000000..c20b91b4
--- /dev/null
+++ b/tests/unit/test_cli_mcp.py
@@ -0,0 +1,301 @@
+import builtins
+import importlib
+import sys
+import types
+from collections import namedtuple
+
+import pytest
+
+from redisvl.cli.main import RedisVlCLI, _usage
+
+
+def _import_cli_mcp():
+    sys.modules.pop("redisvl.cli.mcp", None)
+    return importlib.import_module("redisvl.cli.mcp")
+
+
+def _make_version_info(major, minor, micro=0):
+    version_info = namedtuple(
+        "VersionInfo", ["major", "minor", "micro", "releaselevel", "serial"]
+    )
+    return version_info(major, minor, micro, "final", 0)
+
+
+def _install_fake_redisvl_mcp(monkeypatch, settings_factory, server_factory):
+    fake_module = types.ModuleType("redisvl.mcp")
+    fake_module.MCPSettings = settings_factory
+    fake_module.RedisVLMCPServer = server_factory
+    monkeypatch.setitem(sys.modules, "redisvl.mcp", fake_module)
+    return fake_module
+
+
+def test_usage_includes_mcp():
+    assert "mcp" in _usage()
+
+
+def test_cli_dispatches_mcp_command_lazily(monkeypatch):
+    calls = []
+    fake_module = types.ModuleType("redisvl.cli.mcp")
+
+    class FakeMCP(object):
+        def __init__(self):
+            calls.append(list(sys.argv))
+
+    fake_module.MCP = FakeMCP
+    monkeypatch.setitem(sys.modules, "redisvl.cli.mcp", fake_module)
+    monkeypatch.setattr(sys, "argv", ["rvl", "mcp", "--config", "/tmp/mcp.yaml"])
+
+    cli = RedisVlCLI.__new__(RedisVlCLI)
+
+    with pytest.raises(SystemExit) as exc_info:
+        RedisVlCLI.mcp(cli)
+
+    assert exc_info.value.code == 0
+    assert calls == [["rvl", "mcp", "--config", "/tmp/mcp.yaml"]]
+
+
+def test_mcp_command_rejects_unsupported_python(monkeypatch, capsys):
+    monkeypatch.delitem(sys.modules, "redisvl.mcp", raising=False)
+    monkeypatch.delitem(sys.modules, "redisvl.cli.mcp", raising=False)
+    monkeypatch.setattr(sys, "version_info", _make_version_info(3, 9, 18))
+    original_import = builtins.__import__
+
+    def missing_mcp_import(name, globals=None, locals=None, fromlist=(), level=0):
+        if name == "redisvl.mcp" or name.startswith("redisvl.mcp."):
+            raise ModuleNotFoundError(name)
+        return original_import(name, globals, locals, fromlist, level)
+
+    monkeypatch.setattr(builtins, "__import__", missing_mcp_import)
+
+    module = _import_cli_mcp()
+    monkeypatch.setattr(sys, "argv", ["rvl", "mcp", "--config", "/tmp/mcp.yaml"])
+
+    with pytest.raises(SystemExit) as exc_info:
+        module.MCP()
+
+    out = capsys.readouterr()
+
+    assert exc_info.value.code == 1
+    assert "3.10" in out.err or "3.10" in out.out
+
+
+def test_mcp_command_reports_missing_optional_dependencies(monkeypatch, capsys):
+    monkeypatch.delitem(sys.modules, "redisvl.mcp", raising=False)
+    monkeypatch.delitem(sys.modules, "redisvl.cli.mcp", raising=False)
+    monkeypatch.setattr(sys, "version_info", _make_version_info(3, 11, 0))
+
+    original_import = builtins.__import__
+
+    def missing_mcp_import(name, globals=None, locals=None, fromlist=(), level=0):
+        if name == "redisvl.mcp" or name.startswith("redisvl.mcp."):
+            raise ModuleNotFoundError(name)
+        return original_import(name, globals, locals, fromlist, level)
+
+    monkeypatch.setattr(builtins, "__import__", missing_mcp_import)
+
+    module = _import_cli_mcp()
+    monkeypatch.setattr(sys, "argv", ["rvl", "mcp", "--config", "/tmp/mcp.yaml"])
+
+    with pytest.raises(SystemExit) as exc_info:
+        module.MCP()
+
+    out = capsys.readouterr()
+
+    assert exc_info.value.code == 1
+    assert "redisvl[mcp]" in out.err or "redisvl[mcp]" in out.out
+
+
+def test_mcp_help_includes_description_and_example(monkeypatch, capsys):
+    monkeypatch.delitem(sys.modules, "redisvl.cli.mcp", raising=False)
+    monkeypatch.setattr(sys, "argv", ["rvl", "mcp", "--help"])
+
+    module = _import_cli_mcp()
+
+    with pytest.raises(SystemExit) as exc_info:
+        module.MCP()
+
+    out = capsys.readouterr()
+
+    assert exc_info.value.code == 0
+    assert "Expose a configured Redis index to MCP clients" in out.out
+    assert "Use this command when wiring RedisVL into an MCP client" in out.out
+    assert (
+        "uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp_config.yaml" in out.out
+    )
+
+
+def test_mcp_command_preserves_env_read_only_when_flag_is_omitted(monkeypatch):
+    monkeypatch.delitem(sys.modules, "redisvl.cli.mcp", raising=False)
+    monkeypatch.delitem(sys.modules, "redisvl.mcp", raising=False)
+    monkeypatch.setattr(sys, "version_info", _make_version_info(3, 11, 0))
+    monkeypatch.setattr(sys, "argv", ["rvl", "mcp", "--config", "/tmp/mcp.yaml"])
+
+    calls = []
+
+    class FakeSettings(object):
+        @classmethod
+        def from_env(cls, config=None, read_only=None):
+            calls.append(("settings", config, read_only))
+            return cls()
+
+    class FakeServer(object):
+        def __init__(self, settings):
+            self.settings = settings
+
+        async def startup(self):
+            calls.append(("startup",))
+
+        async def run(self, transport="stdio"):
+            calls.append(("run", transport))
+
+        async def shutdown(self):
+            calls.append(("shutdown",))
+
+    _install_fake_redisvl_mcp(monkeypatch, FakeSettings, FakeServer)
+    module = _import_cli_mcp()
+
+    with pytest.raises(SystemExit) as exc_info:
+        module.MCP()
+
+    assert exc_info.value.code == 0
+    assert calls == [
+        ("settings", "/tmp/mcp.yaml", None),
+        ("startup",),
+        ("run", "stdio"),
+        ("shutdown",),
+    ]
+
+
+def test_mcp_command_runs_startup_then_stdio_then_shutdown(monkeypatch):
+    monkeypatch.delitem(sys.modules, "redisvl.cli.mcp", raising=False)
+    monkeypatch.delitem(sys.modules, "redisvl.mcp", raising=False)
+    monkeypatch.setattr(sys, "version_info", _make_version_info(3, 11, 0))
+    monkeypatch.setattr(
+        sys, "argv", ["rvl", "mcp", "--config", "/tmp/mcp.yaml", "--read-only"]
+    )
+
+    calls = []
+
+    class FakeSettings(object):
+        def __init__(self, config, read_only=False):
+            self.config = config
+            self.read_only = read_only
+
+        @classmethod
+        def from_env(cls, config=None, read_only=None):
+            calls.append(("settings", config, read_only))
+            return cls(config=config, read_only=read_only)
+
+    class FakeServer(object):
+        def __init__(self, settings):
+            self.settings = settings
+
+        async def startup(self):
+            calls.append(("startup", self.settings.config, self.settings.read_only))
+
+        async def run(self, transport="stdio"):
+            calls.append(("run", transport))
+
+        async def shutdown(self):
+            calls.append(("shutdown",))
+
+    _install_fake_redisvl_mcp(monkeypatch, FakeSettings, FakeServer)
+    module = _import_cli_mcp()
+
+    with pytest.raises(SystemExit) as exc_info:
+        module.MCP()
+
+    assert exc_info.value.code == 0
+    assert calls == [
+        ("settings", "/tmp/mcp.yaml", True),
+        ("startup", "/tmp/mcp.yaml", True),
+        ("run", "stdio"),
+        ("shutdown",),
+    ]
+
+
+def test_mcp_command_reports_startup_failures(monkeypatch, capsys):
+    monkeypatch.delitem(sys.modules, "redisvl.cli.mcp", raising=False)
+    monkeypatch.delitem(sys.modules, "redisvl.mcp", raising=False)
+    monkeypatch.setattr(sys, "version_info", _make_version_info(3, 11, 0))
+    monkeypatch.setattr(sys, "argv", ["rvl", "mcp", "--config", "/tmp/mcp.yaml"])
+
+    calls = []
+
+    class FakeSettings(object):
+        @classmethod
+        def from_env(cls, config=None, read_only=None):
+            calls.append(("settings", config, read_only))
+            return cls()
+
+    class FakeServer(object):
+        def __init__(self, settings):
+            self.settings = settings
+
+        async def startup(self):
+            calls.append(("startup",))
+            raise RuntimeError("boom")
+
+        async def run(self, transport="stdio"):
+            calls.append(("run", transport))
+
+        async def shutdown(self):
+            calls.append(("shutdown",))
+
+    _install_fake_redisvl_mcp(monkeypatch, FakeSettings, FakeServer)
+    module = _import_cli_mcp()
+
+    with pytest.raises(SystemExit) as exc_info:
+        module.MCP()
+
+    out = capsys.readouterr()
+
+    assert exc_info.value.code == 1
+    assert calls == [("settings", "/tmp/mcp.yaml", None), ("startup",)]
+    assert "boom" in out.err or "boom" in out.out
+
+
+def test_mcp_command_shuts_down_when_run_fails(monkeypatch, capsys):
+    monkeypatch.delitem(sys.modules, "redisvl.cli.mcp", raising=False)
+    monkeypatch.delitem(sys.modules, "redisvl.mcp", raising=False)
+    monkeypatch.setattr(sys, "version_info", _make_version_info(3, 11, 0))
+    monkeypatch.setattr(sys, "argv", ["rvl", "mcp", "--config", "/tmp/mcp.yaml"])
+
+    calls = []
+
+    class FakeSettings(object):
+        @classmethod
+        def from_env(cls, config=None, read_only=None):
+            calls.append(("settings", config, read_only))
+            return cls()
+
+    class FakeServer(object):
+        def __init__(self, settings):
+            self.settings = settings
+
+        async def startup(self):
+            calls.append(("startup",))
+
+        async def run(self, transport="stdio"):
+            calls.append(("run", transport))
+            raise RuntimeError("run failed")
+
+        async def shutdown(self):
+            calls.append(("shutdown",))
+
+    _install_fake_redisvl_mcp(monkeypatch, FakeSettings, FakeServer)
+    module = _import_cli_mcp()
+
+    with pytest.raises(SystemExit) as exc_info:
+        module.MCP()
+
+    out = capsys.readouterr()
+
+    assert exc_info.value.code == 1
+    assert calls == [
+        ("settings", "/tmp/mcp.yaml", None),
+        ("startup",),
+        ("run", "stdio"),
+        ("shutdown",),
+    ]
+    assert "run failed" in out.err or "run failed" in out.out
diff --git a/tests/unit/test_mcp/conftest.py b/tests/unit/test_mcp/conftest.py
new file mode 100644
index 00000000..f5d7e2bc
--- /dev/null
+++ b/tests/unit/test_mcp/conftest.py
@@ -0,0 +1,8 @@
+import pytest
+
+
+@pytest.fixture(scope="session", autouse=True)
+def redis_container():
+    # Shadow the repo-wide autouse Redis container fixture so MCP unit tests stay
+    # pure-unit and do not require Docker; Redis coverage lives in integration tests.
+    yield None
diff --git a/tests/unit/test_mcp/test_config.py b/tests/unit/test_mcp/test_config.py
new file mode 100644
index 00000000..a524a52a
--- /dev/null
+++ b/tests/unit/test_mcp/test_config.py
@@ -0,0 +1,359 @@
+from copy import deepcopy
+from pathlib import Path
+
+import pytest
+import yaml
+
+from redisvl.mcp.config import MCPConfig, load_mcp_config
+from redisvl.schema import IndexSchema
+
+
+def _valid_config() -> dict:
+    return {
+        "server": {"redis_url": "redis://localhost:6379"},
+        "indexes": {
+            "knowledge": {
+                "redis_name": "docs-index",
+                "vectorizer": {"class": "FakeVectorizer", "model": "test-model"},
+                "search": {"type": "vector"},
+                "runtime": {
+                    "text_field_name": "content",
+                    "vector_field_name": "embedding",
+                    "default_embed_text_field": "content",
+                },
+            }
+        },
+    }
+
+
+def _inspected_schema() -> dict:
+    return {
+        "index": {
+            "name": "docs-index",
+            "prefix": "doc",
+            "storage_type": "hash",
+        },
+        "fields": [
+            {"name": "content", "type": "text"},
+            {
+                "name": "embedding",
+                "type": "vector",
+                "attrs": {
+                    "algorithm": "flat",
+                    "dims": 3,
+                    "distance_metric": "cosine",
+                    "datatype": "float32",
+                },
+            },
+        ],
+    }
+
+
+def test_load_mcp_config_file_not_found():
+    with pytest.raises(FileNotFoundError):
+        load_mcp_config("/tmp/does-not-exist.yaml")
+
+
+def test_load_mcp_config_invalid_yaml(tmp_path: Path):
+    config_path = tmp_path / "mcp.yaml"
+    config_path.write_text("server: [", encoding="utf-8")
+
+    with pytest.raises(ValueError, match="Invalid MCP config YAML"):
+        load_mcp_config(str(config_path))
+
+
+def test_load_mcp_config_env_substitution(tmp_path: Path, monkeypatch):
+    config_path = tmp_path / "mcp.yaml"
+    config_path.write_text(
+        """
+server:
+  redis_url: ${REDIS_URL:-redis://localhost:6379}
+indexes:
+  knowledge:
+    redis_name: docs-index
+    vectorizer:
+      class: FakeVectorizer
+      model: ${VECTOR_MODEL:-test-model}
+      api_config:
+        api_key: ${OPENAI_API_KEY}
+    search:
+      type: vector
+    runtime:
+      text_field_name: content
+      vector_field_name: embedding
+      default_embed_text_field: content
+""".strip(),
+        encoding="utf-8",
+    )
+    monkeypatch.setenv("OPENAI_API_KEY", "secret")
+
+    config = load_mcp_config(str(config_path))
+
+    assert config.server.redis_url == "redis://localhost:6379"
+    assert config.binding_id == "knowledge"
+    assert config.redis_name == "docs-index"
+    assert config.vectorizer.class_name == "FakeVectorizer"
+    assert config.vectorizer.model == "test-model"
+    assert config.vectorizer.extra_kwargs == {"api_config": {"api_key": "secret"}}
+
+
+def test_load_mcp_config_required_env_missing(tmp_path: Path, monkeypatch):
+    config_path = tmp_path / "mcp.yaml"
+    config_path.write_text(
+        """
+server:
+  redis_url: redis://localhost:6379
+indexes:
+  knowledge:
+    redis_name: docs-index
+    vectorizer:
+      class: FakeVectorizer
+      model: ${VECTOR_MODEL}
+    search:
+      type: vector
+    runtime:
+      text_field_name: content
+      vector_field_name: embedding
+      default_embed_text_field: content
+""".strip(),
+        encoding="utf-8",
+    )
+    monkeypatch.delenv("VECTOR_MODEL", raising=False)
+
+    with pytest.raises(ValueError, match="Missing required environment variable"):
+        load_mcp_config(str(config_path))
+
+
+def test_mcp_config_requires_server_redis_url():
+    config = _valid_config()
+    config["server"]["redis_url"] = ""
+
+    with pytest.raises(ValueError, match="redis_url"):
+        MCPConfig.model_validate(config)
+
+
+@pytest.mark.parametrize(
+    "indexes",
+    [
+        {},
+        {
+            "knowledge": deepcopy(_valid_config()["indexes"]["knowledge"]),
+            "other": deepcopy(_valid_config()["indexes"]["knowledge"]),
+        },
+    ],
+)
+def test_mcp_config_validates_index_count(indexes):
+    config = _valid_config()
+    config["indexes"] = indexes
+
+    with pytest.raises(ValueError, match="exactly one configured index binding"):
+        MCPConfig.model_validate(config)
+
+
+def test_mcp_config_rejects_blank_binding_id():
+    config = _valid_config()
+    config["indexes"] = {"": deepcopy(config["indexes"]["knowledge"])}
+
+    with pytest.raises(ValueError, match="binding id"):
+        MCPConfig.model_validate(config)
+
+
+def test_mcp_config_rejects_blank_redis_name():
+    config = _valid_config()
+    config["indexes"]["knowledge"]["redis_name"] = ""
+
+    with pytest.raises(ValueError, match="redis_name"):
+        MCPConfig.model_validate(config)
+
+
+def test_mcp_config_binding_helpers():
+    config = MCPConfig.model_validate(_valid_config())
+
+    assert config.binding_id == "knowledge"
+    assert config.binding.redis_name == "docs-index"
+    assert config.binding.search.type == "vector"
+    assert config.runtime.default_embed_text_field == "content"
+    assert config.vectorizer.class_name == "FakeVectorizer"
+    assert config.redis_name == "docs-index"
+
+
+def test_mcp_config_merges_schema_overrides_into_inspection_result():
+    config_dict = _valid_config()
+    config_dict["indexes"]["knowledge"]["schema_overrides"] = {
+        "fields": [
+            {
+                "name": "embedding",
+                "type": "vector",
+                "attrs": {
+                    "dims": 1536,
+                    "datatype": "float32",
+                    "distance_metric": "cosine",
+                },
+            }
+        ]
+    }
+    inspected = _inspected_schema()
+    inspected["fields"][1]["attrs"] = {"algorithm": "flat"}
+    config = MCPConfig.model_validate(config_dict)
+
+    schema = config.to_index_schema(inspected)
+
+    assert isinstance(schema, IndexSchema)
+    assert schema.index.name == "docs-index"
+    assert schema.fields["embedding"].attrs.dims == 1536
+    assert str(schema.fields["embedding"].attrs.algorithm).lower().endswith("flat")
+
+
+def test_mcp_config_rejects_override_for_unknown_field():
+    config_dict = _valid_config()
+    config_dict["indexes"]["knowledge"]["schema_overrides"] = {
+        "fields": [{"name": "missing", "type": "text"}]
+    }
+    config = MCPConfig.model_validate(config_dict)
+
+    with pytest.raises(ValueError, match="schema_overrides.fields.*missing"):
+        config.to_index_schema(_inspected_schema())
+
+
+def test_mcp_config_rejects_override_type_conflict():
+    config_dict = _valid_config()
+    config_dict["indexes"]["knowledge"]["schema_overrides"] = {
+        "fields": [{"name": "embedding", "type": "text"}]
+    }
+    config = MCPConfig.model_validate(config_dict)
+
+    with pytest.raises(ValueError, match="cannot change discovered field type"):
+        config.to_index_schema(_inspected_schema())
+
+
+def test_mcp_config_rejects_override_path_conflict():
+    config_dict = _valid_config()
+    config_dict["indexes"]["knowledge"]["schema_overrides"] = {
+        "fields": [{"name": "content", "type": "text", "path": "$.body"}]
+    }
+    inspected = {
+        "index": {
+            "name": "docs-index",
+            "prefix": "doc",
+            "storage_type": "json",
+        },
+        "fields": [
+            {"name": "content", "type": "text", "path": "$.content"},
+            {
+                "name": "embedding",
+                "type": "vector",
+                "path": "$.embedding",
+                "attrs": {
+                    "algorithm": "flat",
+                    "dims": 3,
+                    "distance_metric": "cosine",
+                    "datatype": "float32",
+                },
+            },
+        ],
+    }
+    config = MCPConfig.model_validate(config_dict)
+
+    with pytest.raises(ValueError, match="cannot change discovered field path"):
+        config.to_index_schema(inspected)
+
+
+def test_mcp_config_validates_runtime_mapping_against_effective_schema():
+    config_dict = _valid_config()
+    config_dict["indexes"]["knowledge"]["runtime"]["vector_field_name"] = "content"
+    config = MCPConfig.model_validate(config_dict)
+
+    with pytest.raises(ValueError, match="runtime.vector_field_name"):
+        config.to_index_schema(_inspected_schema())
+
+
+def test_load_mcp_config_requires_exactly_one_binding(tmp_path: Path):
+    config_path = tmp_path / "mcp.yaml"
+    config_path.write_text(
+        yaml.safe_dump(
+            {
+                "server": {"redis_url": "redis://localhost:6379"},
+                "indexes": {},
+            }
+        ),
+        encoding="utf-8",
+    )
+
+    with pytest.raises(ValueError, match="exactly one configured index binding"):
+        load_mcp_config(str(config_path))
+
+
+@pytest.mark.parametrize("search_type", ["vector", "fulltext", "hybrid"])
+def test_mcp_config_accepts_search_types(search_type):
+    config = _valid_config()
+    config["indexes"]["knowledge"]["search"] = {"type": search_type}
+
+    loaded = MCPConfig.model_validate(config)
+
+    assert loaded.binding.search.type == search_type
+    assert loaded.binding.search.params == {}
+
+
+def test_mcp_config_requires_search_type():
+    config = _valid_config()
+    del config["indexes"]["knowledge"]["search"]["type"]
+
+    with pytest.raises(ValueError, match="type"):
+        MCPConfig.model_validate(config)
+
+
+def test_mcp_config_rejects_invalid_search_type():
+    config = _valid_config()
+    config["indexes"]["knowledge"]["search"] = {"type": "semantic"}
+
+    with pytest.raises(ValueError, match="vector|fulltext|hybrid"):
+        MCPConfig.model_validate(config)
+
+
+@pytest.mark.parametrize(
+    ("search_type", "params"),
+    [
+        ("vector", {"text_scorer": "BM25STD"}),
+        ("fulltext", {"normalize_vector_distance": True}),
+        ("hybrid", {"normalize_vector_distance": True}),
+    ],
+)
+def test_mcp_config_rejects_invalid_search_params(search_type, params):
+    config = _valid_config()
+    config["indexes"]["knowledge"]["search"] = {
+        "type": search_type,
+        "params": params,
+    }
+
+    with pytest.raises(ValueError, match="search.params"):
+        MCPConfig.model_validate(config)
+
+
+def test_mcp_config_rejects_linear_text_weight_without_linear_combination():
+    config = _valid_config()
+    config["indexes"]["knowledge"]["search"] = {
+        "type": "hybrid",
+        "params": {
+            "combination_method": "RRF",
+            "linear_text_weight": 0.3,
+        },
+    }
+
+    with pytest.raises(ValueError, match="linear_text_weight"):
+        MCPConfig.model_validate(config)
+
+
+def test_mcp_config_normalizes_hybrid_linear_text_weight():
+    config = _valid_config()
+    config["indexes"]["knowledge"]["search"] = {
+        "type": "hybrid",
+        "params": {
+            "combination_method": "LINEAR",
+            "linear_text_weight": 0.3,
+        },
+    }
+
+    loaded = MCPConfig.model_validate(config)
+
+    assert loaded.binding.search.type == "hybrid"
+    assert loaded.binding.search.params["linear_text_weight"] == 0.3
diff --git a/tests/unit/test_mcp/test_errors.py b/tests/unit/test_mcp/test_errors.py
new file mode 100644
index 00000000..ddd28622
--- /dev/null
+++ b/tests/unit/test_mcp/test_errors.py
@@ -0,0 +1,78 @@
+from pydantic import BaseModel, ValidationError
+from redis.exceptions import ConnectionError as RedisConnectionError
+
+from redisvl.exceptions import RedisSearchError
+from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError, map_exception
+
+
+class SampleModel(BaseModel):
+    value: int
+
+
+def test_validation_errors_map_to_invalid_request():
+    try:
+        SampleModel.model_validate({"value": "bad"})
+    except ValidationError as exc:
+        mapped = map_exception(exc)
+
+    assert mapped.code == MCPErrorCode.INVALID_REQUEST
+    assert mapped.retryable is False
+
+
+def test_import_error_maps_to_dependency_missing():
+    mapped = map_exception(ImportError("missing package"))
+
+    assert mapped.code == MCPErrorCode.DEPENDENCY_MISSING
+    assert mapped.retryable is False
+
+
+def test_filter_error_is_preserved():
+    original = RedisVLMCPError(
+        "bad filter",
+        code=MCPErrorCode.INVALID_FILTER,
+        retryable=False,
+    )
+
+    mapped = map_exception(original)
+
+    assert mapped is original
+
+
+def test_redis_errors_map_to_backend_unavailable():
+    mapped = map_exception(RedisSearchError("redis unavailable"))
+
+    assert mapped.code == MCPErrorCode.BACKEND_UNAVAILABLE
+    assert mapped.retryable is True
+
+
+def test_redis_connection_errors_map_to_backend_unavailable():
+    mapped = map_exception(RedisConnectionError("boom"))
+
+    assert mapped.code == MCPErrorCode.BACKEND_UNAVAILABLE
+    assert mapped.retryable is True
+
+
+def test_timeout_error_maps_to_backend_unavailable():
+    mapped = map_exception(TimeoutError("timed out"))
+
+    assert mapped.code == MCPErrorCode.BACKEND_UNAVAILABLE
+    assert mapped.retryable is True
+
+
+def test_unknown_errors_map_to_internal_error():
+    mapped = map_exception(RuntimeError("unexpected"))
+
+    assert mapped.code == MCPErrorCode.INTERNAL_ERROR
+    assert mapped.retryable is False
+
+
+def test_existing_framework_error_is_preserved():
+    original = RedisVLMCPError(
+        "already mapped",
+        code=MCPErrorCode.INVALID_REQUEST,
+        retryable=False,
+    )
+
+    mapped = map_exception(original)
+
+    assert mapped is original
diff --git a/tests/unit/test_mcp/test_filters.py b/tests/unit/test_mcp/test_filters.py
new file mode 100644
index 00000000..4fb43b6a
--- /dev/null
+++ b/tests/unit/test_mcp/test_filters.py
@@ -0,0 +1,136 @@
+import pytest
+
+from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError
+from redisvl.mcp.filters import parse_filter
+from redisvl.query.filter import FilterExpression
+from redisvl.schema import IndexSchema
+
+
+def _schema() -> IndexSchema:
+    return IndexSchema.from_dict(
+        {
+            "index": {
+                "name": "docs-index",
+                "prefix": "doc",
+                "storage_type": "hash",
+            },
+            "fields": [
+                {"name": "content", "type": "text"},
+                {"name": "category", "type": "tag"},
+                {"name": "rating", "type": "numeric"},
+                {
+                    "name": "embedding",
+                    "type": "vector",
+                    "attrs": {
+                        "algorithm": "flat",
+                        "dims": 3,
+                        "distance_metric": "cosine",
+                        "datatype": "float32",
+                    },
+                },
+            ],
+        }
+    )
+
+
+def _render_filter(value):
+    if isinstance(value, FilterExpression):
+        return str(value)
+    return value
+
+
+def test_parse_filter_passes_through_raw_string():
+    raw = "@category:{science} @rating:[4 +inf]"
+
+    parsed = parse_filter(raw, _schema())
+
+    assert parsed == raw
+
+
+def test_parse_filter_builds_atomic_expression():
+    parsed = parse_filter(
+        {"field": "category", "op": "eq", "value": "science"},
+        _schema(),
+    )
+
+    assert isinstance(parsed, FilterExpression)
+    assert str(parsed) == "@category:{science}"
+
+
+def test_parse_filter_builds_nested_logical_expression():
+    parsed = parse_filter(
+        {
+            "and": [
+                {"field": "category", "op": "eq", "value": "science"},
+                {
+                    "or": [
+                        {"field": "rating", "op": "gte", "value": 4.5},
+                        {"field": "content", "op": "like", "value": "quant*"},
+                    ]
+                },
+            ]
+        },
+        _schema(),
+    )
+
+    assert isinstance(parsed,
FilterExpression) + assert ( + str(parsed) == "(@category:{science} (@rating:[4.5 +inf] | @content:(quant*)))" + ) + + +def test_parse_filter_builds_not_expression(): + parsed = parse_filter( + { + "not": {"field": "category", "op": "eq", "value": "science"}, + }, + _schema(), + ) + + assert _render_filter(parsed) == "(-(@category:{science}))" + + +def test_parse_filter_builds_exists_expression(): + parsed = parse_filter( + {"field": "content", "op": "exists"}, + _schema(), + ) + + assert _render_filter(parsed) == "(-ismissing(@content))" + + +def test_parse_filter_rejects_unknown_field(): + with pytest.raises(RedisVLMCPError) as exc_info: + parse_filter({"field": "missing", "op": "eq", "value": "science"}, _schema()) + + assert exc_info.value.code == MCPErrorCode.INVALID_FILTER + + +def test_parse_filter_rejects_unknown_operator(): + with pytest.raises(RedisVLMCPError) as exc_info: + parse_filter( + {"field": "category", "op": "contains", "value": "science"}, _schema() + ) + + assert exc_info.value.code == MCPErrorCode.INVALID_FILTER + + +def test_parse_filter_rejects_type_mismatch(): + with pytest.raises(RedisVLMCPError) as exc_info: + parse_filter({"field": "rating", "op": "gte", "value": "high"}, _schema()) + + assert exc_info.value.code == MCPErrorCode.INVALID_FILTER + + +def test_parse_filter_rejects_empty_logical_array(): + with pytest.raises(RedisVLMCPError) as exc_info: + parse_filter({"and": []}, _schema()) + + assert exc_info.value.code == MCPErrorCode.INVALID_FILTER + + +def test_parse_filter_rejects_malformed_payload(): + with pytest.raises(RedisVLMCPError) as exc_info: + parse_filter({"field": "category", "value": "science"}, _schema()) + + assert exc_info.value.code == MCPErrorCode.INVALID_FILTER diff --git a/tests/unit/test_mcp/test_search_tool_unit.py b/tests/unit/test_mcp/test_search_tool_unit.py new file mode 100644 index 00000000..0afc37fa --- /dev/null +++ b/tests/unit/test_mcp/test_search_tool_unit.py @@ -0,0 +1,415 @@ +from types import 
SimpleNamespace +from typing import Optional + +import pytest + +from redisvl.mcp.config import MCPConfig +from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError +from redisvl.mcp.tools.search import _embed_query, register_search_tool, search_records +from redisvl.schema import IndexSchema + + +def _schema() -> IndexSchema: + return IndexSchema.from_dict( + { + "index": { + "name": "docs-index", + "prefix": "doc", + "storage_type": "hash", + }, + "fields": [ + {"name": "content", "type": "text"}, + {"name": "category", "type": "tag"}, + {"name": "rating", "type": "numeric"}, + { + "name": "embedding", + "type": "vector", + "attrs": { + "algorithm": "flat", + "dims": 3, + "distance_metric": "cosine", + "datatype": "float32", + }, + }, + ], + } + ) + + +def _config_with_search(search_type: str, params: Optional[dict] = None) -> MCPConfig: + return MCPConfig.model_validate( + { + "server": {"redis_url": "redis://localhost:6379"}, + "indexes": { + "knowledge": { + "redis_name": "docs-index", + "vectorizer": {"class": "FakeVectorizer", "model": "test-model"}, + "search": {"type": search_type, "params": params or {}}, + "runtime": { + "text_field_name": "content", + "vector_field_name": "embedding", + "default_embed_text_field": "content", + "default_limit": 2, + "max_limit": 5, + }, + } + }, + } + ) + + +class FakeVectorizer: + async def embed(self, text: str): + return [0.1, 0.2, 0.3] + + +class FakeIndex: + def __init__(self): + self.schema = _schema() + self.query_calls = [] + + async def query(self, query): + self.query_calls.append(query) + return [] + + +class FakeServer: + def __init__( + self, + *, + search_type: str = "vector", + search_params: Optional[dict] = None, + ): + self.config = _config_with_search(search_type, search_params) + self.mcp_settings = SimpleNamespace(tool_search_description=None) + self.index = FakeIndex() + self.vectorizer = FakeVectorizer() + self.registered_tools = [] + self.native_hybrid_supported = False + + async def 
get_index(self): + return self.index + + async def get_vectorizer(self): + return self.vectorizer + + async def run_guarded(self, operation_name, awaitable): + return await awaitable + + async def supports_native_hybrid_search(self): + return self.native_hybrid_supported + + def tool(self, name=None, description=None, **kwargs): + def decorator(fn): + self.registered_tools.append( + { + "name": name, + "description": description, + "fn": fn, + } + ) + return fn + + return decorator + + +class FakeQuery: + def __init__(self, **kwargs): + self.kwargs = kwargs + + +@pytest.mark.asyncio +async def test_embed_query_falls_back_to_sync_embed_when_aembed_is_not_implemented(): + class FallbackVectorizer: + async def aembed(self, text: str): + raise NotImplementedError + + def embed(self, text: str): + return [0.4, 0.5, 0.6] + + embedding = await _embed_query(FallbackVectorizer(), "science") + + assert embedding == [0.4, 0.5, 0.6] + + +@pytest.mark.asyncio +async def test_search_records_rejects_blank_query(): + server = FakeServer() + + with pytest.raises(RedisVLMCPError) as exc_info: + await search_records(server, query=" ") + + assert exc_info.value.code == MCPErrorCode.INVALID_REQUEST + + +@pytest.mark.asyncio +async def test_search_records_rejects_invalid_limit_and_offset(): + server = FakeServer() + + with pytest.raises(RedisVLMCPError) as limit_exc: + await search_records(server, query="science", limit=0) + + with pytest.raises(RedisVLMCPError) as offset_exc: + await search_records(server, query="science", offset=-1) + + assert limit_exc.value.code == MCPErrorCode.INVALID_REQUEST + assert offset_exc.value.code == MCPErrorCode.INVALID_REQUEST + + +@pytest.mark.asyncio +async def test_search_records_rejects_unknown_or_vector_return_fields(): + server = FakeServer() + + with pytest.raises(RedisVLMCPError) as unknown_exc: + await search_records(server, query="science", return_fields=["missing"]) + + with pytest.raises(RedisVLMCPError) as vector_exc: + await 
search_records(server, query="science", return_fields=["embedding"]) + + assert unknown_exc.value.code == MCPErrorCode.INVALID_REQUEST + assert vector_exc.value.code == MCPErrorCode.INVALID_REQUEST + + +@pytest.mark.asyncio +async def test_search_records_builds_vector_query_and_normalizes_results(monkeypatch): + server = FakeServer( + search_type="vector", + search_params={"normalize_vector_distance": False, "ef_runtime": 42}, + ) + built_queries = [] + + class FakeVectorQuery(FakeQuery): + def __init__(self, **kwargs): + built_queries.append(kwargs) + super().__init__(**kwargs) + + async def fake_query(query): + server.index.query_calls.append(query) + return [ + { + "id": "doc:1", + "content": "science doc", + "category": "science", + "vector_distance": "0.93", + } + ] + + monkeypatch.setattr("redisvl.mcp.tools.search.VectorQuery", FakeVectorQuery) + server.index.query = fake_query + + response = await search_records(server, query="science") + + assert built_queries[0]["vector"] == [0.1, 0.2, 0.3] + assert built_queries[0]["vector_field_name"] == "embedding" + assert built_queries[0]["return_fields"] == ["content", "category", "rating"] + assert built_queries[0]["num_results"] == 2 + assert built_queries[0]["normalize_vector_distance"] is False + assert built_queries[0]["ef_runtime"] == 42 + assert response == { + "search_type": "vector", + "offset": 0, + "limit": 2, + "results": [ + { + "id": "doc:1", + "score": 0.93, + "score_type": "vector_distance_normalized", + "record": { + "content": "science doc", + "category": "science", + }, + } + ], + } + + +@pytest.mark.asyncio +async def test_search_records_builds_fulltext_query(monkeypatch): + server = FakeServer( + search_type="fulltext", + search_params={ + "text_scorer": "BM25STD.NORM", + "stopwords": None, + "text_weights": {"medical": 2.5}, + }, + ) + built_queries = [] + + class FakeTextQuery(FakeQuery): + def __init__(self, **kwargs): + built_queries.append(kwargs) + super().__init__(**kwargs) + + async def 
fake_query(query): + server.index.query_calls.append(query) + return [ + { + "id": "doc:2", + "content": "medical science", + "category": "health", + "__score": "1.5", + } + ] + + monkeypatch.setattr("redisvl.mcp.tools.search.TextQuery", FakeTextQuery) + server.index.query = fake_query + + response = await search_records( + server, + query="medical science", + limit=1, + return_fields=["content", "category"], + ) + + assert built_queries[0]["text"] == "medical science" + assert built_queries[0]["text_field_name"] == "content" + assert built_queries[0]["num_results"] == 1 + assert built_queries[0]["text_scorer"] == "BM25STD.NORM" + assert built_queries[0]["stopwords"] is None + assert built_queries[0]["text_weights"] == {"medical": 2.5} + assert response["search_type"] == "fulltext" + assert response["results"][0]["score"] == 1.5 + assert response["results"][0]["score_type"] == "text_score" + + +@pytest.mark.asyncio +async def test_search_records_builds_hybrid_query_for_native_runtime(monkeypatch): + server = FakeServer( + search_type="hybrid", + search_params={ + "text_scorer": "TFIDF", + "stopwords": None, + "text_weights": {"hybrid": 2.0}, + "vector_search_method": "KNN", + "knn_ef_runtime": 77, + "combination_method": "LINEAR", + "linear_text_weight": 0.2, + }, + ) + server.native_hybrid_supported = True + built_queries = [] + + class FakePostProcessingConfig: + def __init__(self): + self.apply_calls = [] + + def apply(self, **kwargs): + self.apply_calls.append(kwargs) + + class FakeHybridQuery(FakeQuery): + def __init__(self, **kwargs): + self.postprocessing_config = FakePostProcessingConfig() + built_queries.append(("native", kwargs, self.postprocessing_config)) + super().__init__(**kwargs) + + class FakeAggregateHybridQuery(FakeQuery): + def __init__(self, **kwargs): + built_queries.append(("fallback", kwargs)) + super().__init__(**kwargs) + + async def fake_query(query): + server.index.query_calls.append(query) + return [ + { + "id": "doc:3", + "content": 
"hybrid doc", + "hybrid_score": "2.5", + } + ] + + monkeypatch.setattr("redisvl.mcp.tools.search.HybridQuery", FakeHybridQuery) + monkeypatch.setattr( + "redisvl.mcp.tools.search.AggregateHybridQuery", FakeAggregateHybridQuery + ) + server.index.query = fake_query + + response = await search_records(server, query="hybrid") + + assert built_queries[0][0] == "native" + assert built_queries[0][1]["vector"] == [0.1, 0.2, 0.3] + assert built_queries[0][1]["text_scorer"] == "TFIDF" + assert built_queries[0][1]["stopwords"] is None + assert built_queries[0][1]["text_weights"] == {"hybrid": 2.0} + assert built_queries[0][1]["vector_search_method"] == "KNN" + assert built_queries[0][1]["knn_ef_runtime"] == 77 + assert built_queries[0][1]["combination_method"] == "LINEAR" + assert built_queries[0][1]["linear_alpha"] == 0.2 + assert built_queries[0][2].apply_calls == [{"__key": "@__key"}] + assert response["search_type"] == "hybrid" + assert response["results"][0]["score_type"] == "hybrid_score" + assert response["results"][0]["score"] == 2.5 + + +@pytest.mark.asyncio +async def test_search_records_builds_hybrid_query_for_fallback_runtime(monkeypatch): + server = FakeServer( + search_type="hybrid", + search_params={ + "text_scorer": "TFIDF", + "stopwords": None, + "text_weights": {"hybrid": 2.0}, + "combination_method": "LINEAR", + "linear_text_weight": 0.2, + }, + ) + built_queries = [] + + class FakeHybridQuery(FakeQuery): + def __init__(self, **kwargs): + built_queries.append(("native", kwargs)) + super().__init__(**kwargs) + + class FakeAggregateHybridQuery(FakeQuery): + def __init__(self, **kwargs): + built_queries.append(("fallback", kwargs)) + super().__init__(**kwargs) + + async def fake_query(query): + server.index.query_calls.append(query) + return [ + { + "id": "doc:4", + "content": "fallback hybrid", + "hybrid_score": "0.7", + } + ] + + monkeypatch.setattr("redisvl.mcp.tools.search.HybridQuery", FakeHybridQuery) + monkeypatch.setattr( + 
"redisvl.mcp.tools.search.AggregateHybridQuery", FakeAggregateHybridQuery + ) + server.index.query = fake_query + + response = await search_records(server, query="hybrid") + + assert built_queries[0][0] == "fallback" + assert built_queries[0][1]["text_scorer"] == "TFIDF" + assert built_queries[0][1]["stopwords"] is None + assert built_queries[0][1]["text_weights"] == {"hybrid": 2.0} + assert built_queries[0][1]["alpha"] == pytest.approx(0.8) + assert built_queries[0][1]["return_fields"] == [ + "__key", + "content", + "category", + "rating", + ] + assert response["search_type"] == "hybrid" + assert response["results"][0]["score"] == 0.7 + + +def test_register_search_tool_uses_default_and_override_descriptions(): + default_server = FakeServer() + register_search_tool(default_server) + + assert default_server.registered_tools[0]["name"] == "search-records" + assert "Search records" in default_server.registered_tools[0]["description"] + assert "query" in default_server.registered_tools[0]["fn"].__annotations__ + assert "search_type" not in default_server.registered_tools[0]["fn"].__annotations__ + + custom_server = FakeServer() + custom_server.mcp_settings.tool_search_description = "Custom search description" + register_search_tool(custom_server) + + assert ( + custom_server.registered_tools[0]["description"] == "Custom search description" + ) diff --git a/tests/unit/test_mcp/test_settings.py b/tests/unit/test_mcp/test_settings.py new file mode 100644 index 00000000..cf4b8800 --- /dev/null +++ b/tests/unit/test_mcp/test_settings.py @@ -0,0 +1,45 @@ +from pydantic_settings import BaseSettings + +from redisvl.mcp.settings import MCPSettings + + +def test_settings_reads_env_defaults(monkeypatch): + monkeypatch.setenv("REDISVL_MCP_CONFIG", "/tmp/mcp.yaml") + monkeypatch.setenv("REDISVL_MCP_READ_ONLY", "true") + monkeypatch.setenv("REDISVL_MCP_TOOL_SEARCH_DESCRIPTION", "search docs") + monkeypatch.setenv("REDISVL_MCP_TOOL_UPSERT_DESCRIPTION", "upsert docs") + + settings = 
MCPSettings() + + assert settings.config == "/tmp/mcp.yaml" + assert settings.read_only is True + assert settings.tool_search_description == "search docs" + assert settings.tool_upsert_description == "upsert docs" + + +def test_settings_explicit_values_override_env(monkeypatch): + monkeypatch.setenv("REDISVL_MCP_CONFIG", "/tmp/from-env.yaml") + monkeypatch.setenv("REDISVL_MCP_READ_ONLY", "true") + + settings = MCPSettings.from_env( + config="/tmp/from-arg.yaml", + read_only=False, + ) + + assert settings.config == "/tmp/from-arg.yaml" + assert settings.read_only is False + + +def test_settings_defaults_optional_descriptions(monkeypatch): + monkeypatch.delenv("REDISVL_MCP_TOOL_SEARCH_DESCRIPTION", raising=False) + monkeypatch.delenv("REDISVL_MCP_TOOL_UPSERT_DESCRIPTION", raising=False) + monkeypatch.setenv("REDISVL_MCP_CONFIG", "/tmp/mcp.yaml") + + settings = MCPSettings.from_env() + + assert settings.tool_search_description is None + assert settings.tool_upsert_description is None + + +def test_settings_uses_pydantic_base_settings(): + assert issubclass(MCPSettings, BaseSettings) diff --git a/tests/unit/test_mcp/test_upsert_tool_unit.py b/tests/unit/test_mcp/test_upsert_tool_unit.py new file mode 100644 index 00000000..8bb59ce0 --- /dev/null +++ b/tests/unit/test_mcp/test_upsert_tool_unit.py @@ -0,0 +1,359 @@ +from types import SimpleNamespace +from typing import Any, List, Optional + +import pytest +from redis.exceptions import RedisError + +from redisvl.mcp.config import MCPConfig +from redisvl.mcp.errors import MCPErrorCode, RedisVLMCPError +from redisvl.mcp.tools.upsert import register_upsert_tool, upsert_records +from redisvl.redis.utils import array_to_buffer +from redisvl.schema import IndexSchema + + +def _schema(storage_type: str = "hash") -> IndexSchema: + return IndexSchema.from_dict( + { + "index": { + "name": "docs-index", + "prefix": "doc", + "storage_type": storage_type, + }, + "fields": [ + {"name": "content", "type": "text"}, + {"name": "category", 
"type": "tag"}, + { + "name": "embedding", + "type": "vector", + "attrs": { + "algorithm": "flat", + "dims": 3, + "distance_metric": "cosine", + "datatype": "float32", + }, + }, + ], + } + ) + + +def _config( + storage_type: str = "hash", + *, + max_upsert_records: int = 5, + skip_embedding_if_present: bool = True, +) -> MCPConfig: + return MCPConfig.model_validate( + { + "server": {"redis_url": "redis://localhost:6379"}, + "indexes": { + "knowledge": { + "redis_name": "docs-index", + "vectorizer": {"class": "FakeVectorizer", "model": "test-model"}, + "search": {"type": "vector"}, + "runtime": { + "text_field_name": "content", + "vector_field_name": "embedding", + "default_embed_text_field": "content", + "default_limit": 2, + "max_limit": 5, + "max_upsert_records": max_upsert_records, + "skip_embedding_if_present": skip_embedding_if_present, + }, + } + }, + } + ) + + +class FakeVectorizer: + def __init__(self): + self.aembed_many_calls = [] + self.embed_many_calls = [] + self.aembed_calls = [] + self.embed_calls = [] + + async def aembed_many(self, contents: List[str], **kwargs): + self.aembed_many_calls.append((contents, kwargs)) + return [ + [float(index), float(index), float(index)] + for index, _ in enumerate(contents, start=1) + ] + + def embed_many(self, contents: List[str], **kwargs): + self.embed_many_calls.append((contents, kwargs)) + return [[9.0, 9.0, 9.0] for _ in contents] + + async def aembed(self, content: str, **kwargs): + self.aembed_calls.append((content, kwargs)) + return [8.0, 8.0, 8.0] + + def embed(self, content: str, **kwargs): + self.embed_calls.append((content, kwargs)) + return [7.0, 7.0, 7.0] + + +class FallbackBatchVectorizer(FakeVectorizer): + async def aembed_many(self, contents: List[str], **kwargs): + raise NotImplementedError + + +class FakeIndex: + def __init__(self, storage_type: str = "hash"): + self.schema = _schema(storage_type) + self.load_calls = [] + self.keys_to_return = ["doc:1"] + self.load_exception = None + + async def 
load(self, data, id_field=None, **kwargs): + materialized = list(data) + self.load_calls.append( + { + "data": materialized, + "id_field": id_field, + "kwargs": kwargs, + } + ) + if self.load_exception is not None: + raise self.load_exception + return self.keys_to_return + + +class FakeServer: + def __init__( + self, + *, + storage_type: str = "hash", + max_upsert_records: int = 5, + skip_embedding_if_present: bool = True, + vectorizer: Optional[FakeVectorizer] = None, + ): + self.config = _config( + storage_type, + max_upsert_records=max_upsert_records, + skip_embedding_if_present=skip_embedding_if_present, + ) + self.mcp_settings = SimpleNamespace(tool_upsert_description=None) + self.index = FakeIndex(storage_type) + self.vectorizer = vectorizer or FakeVectorizer() + self.registered_tools = [] + + async def get_index(self): + return self.index + + async def get_vectorizer(self): + return self.vectorizer + + async def run_guarded(self, operation_name: str, awaitable: Any): + return await awaitable + + def tool(self, name=None, description=None, **kwargs): + def decorator(fn): + self.registered_tools.append( + { + "name": name, + "description": description, + "fn": fn, + } + ) + return fn + + return decorator + + +@pytest.mark.asyncio +async def test_upsert_records_generates_missing_vectors_and_serializes_hash_vectors(): + server = FakeServer(storage_type="hash") + server.index.keys_to_return = ["doc:alpha", "doc:beta"] + + response = await upsert_records( + server, + records=[ + {"id": "alpha", "content": "alpha doc", "category": "science"}, + {"id": "beta", "content": "beta doc", "category": "health"}, + ], + id_field="id", + ) + + assert response == { + "status": "success", + "keys_upserted": 2, + "keys": ["doc:alpha", "doc:beta"], + } + assert server.vectorizer.aembed_many_calls == [(["alpha doc", "beta doc"], {})] + assert len(server.index.load_calls) == 1 + loaded_records = server.index.load_calls[0]["data"] + assert loaded_records[0]["embedding"] == 
array_to_buffer([1.0, 1.0, 1.0], "float32") + assert loaded_records[1]["embedding"] == array_to_buffer([2.0, 2.0, 2.0], "float32") + assert server.index.load_calls[0]["id_field"] == "id" + + +@pytest.mark.asyncio +async def test_upsert_records_preserves_supplied_vectors_when_skip_embedding_if_present(): + server = FakeServer(storage_type="hash", skip_embedding_if_present=True) + + existing_vector = [0.1, 0.2, 0.3] + await upsert_records( + server, + records=[ + {"id": "alpha", "content": "alpha doc", "embedding": existing_vector}, + {"id": "beta", "content": "beta doc"}, + ], + id_field="id", + ) + + loaded_records = server.index.load_calls[0]["data"] + assert loaded_records[0]["embedding"] == array_to_buffer(existing_vector, "float32") + assert loaded_records[1]["embedding"] == array_to_buffer([1.0, 1.0, 1.0], "float32") + assert server.vectorizer.aembed_many_calls == [(["beta doc"], {})] + + +@pytest.mark.asyncio +async def test_upsert_records_rejects_invalid_hash_vector_dimensions_before_serializing(): + server = FakeServer(storage_type="hash", skip_embedding_if_present=True) + + with pytest.raises( + RedisVLMCPError, match="must have 3 dimensions, got 2" + ) as exc_info: + await upsert_records( + server, + records=[{"id": "alpha", "content": "alpha doc", "embedding": [0.1, 0.2]}], + id_field="id", + ) + + assert exc_info.value.code == MCPErrorCode.INVALID_REQUEST + assert server.index.load_calls == [] + assert server.vectorizer.aembed_many_calls == [] + + +@pytest.mark.asyncio +async def test_upsert_records_overwrites_supplied_vectors_when_skip_embedding_if_present_false(): + server = FakeServer(storage_type="hash", skip_embedding_if_present=True) + + await upsert_records( + server, + records=[{"id": "alpha", "content": "alpha doc", "embedding": [0.1, 0.2, 0.3]}], + id_field="id", + skip_embedding_if_present=False, + ) + + loaded_record = server.index.load_calls[0]["data"][0] + assert loaded_record["embedding"] == array_to_buffer([1.0, 1.0, 1.0], "float32") + 
assert server.vectorizer.aembed_many_calls == [(["alpha doc"], {})] + + +@pytest.mark.asyncio +async def test_upsert_records_uses_batch_fallback_when_aembed_many_is_not_implemented(): + server = FakeServer(vectorizer=FallbackBatchVectorizer()) + + await upsert_records( + server, + records=[{"content": "alpha doc"}], + ) + + loaded_record = server.index.load_calls[0]["data"][0] + assert loaded_record["embedding"] == array_to_buffer([9.0, 9.0, 9.0], "float32") + assert server.vectorizer.embed_many_calls == [(["alpha doc"], {})] + + +@pytest.mark.asyncio +@pytest.mark.parametrize( + ("records", "id_field", "message"), + [ + ([], None, "records must be a non-empty list"), + ("bad", None, "records must be a non-empty list"), + ([1], None, "records must contain only objects"), + ([{"content": "alpha"}], "id", "id_field 'id' must exist"), + ], +) +async def test_upsert_records_rejects_invalid_request_shapes( + records, id_field, message +): + server = FakeServer() + + with pytest.raises(RedisVLMCPError, match=message) as exc_info: + await upsert_records(server, records=records, id_field=id_field) + + assert exc_info.value.code == MCPErrorCode.INVALID_REQUEST + + +@pytest.mark.asyncio +async def test_upsert_records_rejects_batches_above_runtime_limit(): + server = FakeServer(max_upsert_records=1) + + with pytest.raises( + RedisVLMCPError, match="must be less than or equal to 1" + ) as exc_info: + await upsert_records( + server, + records=[{"content": "alpha"}, {"content": "beta"}], + ) + + assert exc_info.value.code == MCPErrorCode.INVALID_REQUEST + + +@pytest.mark.asyncio +async def test_upsert_records_requires_configured_embed_source_when_embedding_needed(): + server = FakeServer() + + with pytest.raises(RedisVLMCPError, match="content") as exc_info: + await upsert_records( + server, + records=[{"category": "science"}], + ) + + assert exc_info.value.code == MCPErrorCode.INVALID_REQUEST + + +@pytest.mark.asyncio +async def 
test_upsert_records_validates_non_vector_fields_before_embedding(): + server = FakeServer() + + with pytest.raises(RedisVLMCPError, match="category") as exc_info: + await upsert_records( + server, + records=[{"content": "alpha doc", "category": ["science"]}], + ) + + assert exc_info.value.code == MCPErrorCode.INVALID_REQUEST + assert server.vectorizer.aembed_many_calls == [] + assert server.index.load_calls == [] + + +@pytest.mark.asyncio +async def test_upsert_records_surfaces_partial_write_possible_on_backend_failures(): + server = FakeServer() + server.index.load_exception = RedisError("boom") + + with pytest.raises(RedisVLMCPError) as exc_info: + await upsert_records(server, records=[{"content": "alpha doc"}]) + + assert exc_info.value.code == MCPErrorCode.BACKEND_UNAVAILABLE + assert exc_info.value.metadata["partial_write_possible"] is True + + +def test_register_upsert_tool_uses_default_and_override_descriptions(): + default_server = FakeServer() + register_upsert_tool(default_server) + + assert default_server.registered_tools[0]["name"] == "upsert-records" + assert "Upsert records" in default_server.registered_tools[0]["description"] + + custom_server = FakeServer() + custom_server.mcp_settings.tool_upsert_description = "Custom upsert description" + register_upsert_tool(custom_server) + + assert ( + custom_server.registered_tools[0]["description"] == "Custom upsert description" + ) + + +@pytest.mark.asyncio +async def test_registered_upsert_tool_rejects_deprecated_embed_text_field_argument(): + server = FakeServer() + register_upsert_tool(server) + + tool_fn = server.registered_tools[0]["fn"] + + with pytest.raises(TypeError): + await tool_fn(records=[{"content": "alpha doc"}], embed_text_field="content") diff --git a/uv.lock b/uv.lock index d7ff8293..8b3cc650 100644 --- a/uv.lock +++ b/uv.lock @@ -22,6 +22,18 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/8d/3f/95338030883d8c8b91223b4e21744b04d11b161a3ef117295d8241f50ab4/accessible_pygments-0.0.5-py3-none-any.whl", hash = "sha256:88ae3211e68a1d0b011504b2ffc1691feafce124b845bd072ab6f9f66f34d4b7", size = 1395903, upload-time = "2024-05-10T11:23:08.421Z" }, ] +[[package]] +name = "aiofile" +version = "3.9.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "caio", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/67/e2/d7cb819de8df6b5c1968a2756c3cb4122d4fa2b8fc768b53b7c9e5edb646/aiofile-3.9.0.tar.gz", hash = "sha256:e5ad718bb148b265b6df1b3752c4d1d83024b93da9bd599df74b9d9ffcf7919b", size = 17943, upload-time = "2024-10-08T10:39:35.846Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/50/25/da1f0b4dd970e52bf5a36c204c107e11a0c6d3ed195eba0bfbc664c312b2/aiofile-3.9.0-py3-none-any.whl", hash = "sha256:ce2f6c1571538cbdfa0143b04e16b208ecb0e9cb4148e528af8a640ed51cc8aa", size = 19539, upload-time = "2024-10-08T10:39:32.955Z" }, +] + [[package]] name = "aiohappyeyeballs" version = "2.6.1" @@ -271,6 +283,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/3a/2a/7cc015f5b9f5db42b7d48157e23356022889fc354a2813c15934b7cb5c0e/attrs-25.4.0-py3-none-any.whl", hash = "sha256:adcf7e2a1fb3b36ac48d97835bb6d8ade15b8dcce26aba8bf1d14847b57a3373", size = 67615, upload-time = "2025-10-06T13:54:43.17Z" }, ] +[[package]] +name = "authlib" +version = "1.6.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cryptography", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/af/98/00d3dd826d46959ad8e32af2dbb2398868fd9fd0683c26e56d0789bd0e68/authlib-1.6.9.tar.gz", hash = "sha256:d8f2421e7e5980cc1ddb4e32d3f5fa659cfaf60d8eaf3281ebed192e4ab74f04", size = 165134, upload-time = "2026-03-02T07:44:01.998Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/53/23/b65f568ed0c22f1efacb744d2db1a33c8068f384b8c9b482b52ebdbc3ef6/authlib-1.6.9-py2.py3-none-any.whl", hash = "sha256:f08b4c14e08f0861dc18a32357b33fbcfd2ea86cfe3fe149484b4d764c4a0ac3", size = 244197, upload-time = "2026-03-02T07:44:00.307Z" }, +] + [[package]] name = "babel" version = "2.17.0" @@ -280,6 +304,24 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/b7/b8/3fe70c75fe32afc4bb507f75563d39bc5642255d1d94f1f23604725780bf/babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2", size = 10182537, upload-time = "2025-02-01T15:17:37.39Z" }, ] +[[package]] +name = "backports-tarfile" +version = "1.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/86/72/cd9b395f25e290e633655a100af28cb253e4393396264a98bd5f5951d50f/backports_tarfile-1.2.0.tar.gz", hash = "sha256:d75e02c268746e1b8144c278978b6e98e85de6ad16f8e4b0844a154557eca991", size = 86406, upload-time = "2024-05-28T17:01:54.731Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b9/fa/123043af240e49752f1c4bd24da5053b6bd00cad78c2be53c0d1e8b975bc/backports.tarfile-1.2.0-py3-none-any.whl", hash = "sha256:77e284d754527b01fb1e6fa8a1afe577858ebe4e9dad8919e34c862cb399bc34", size = 30181, upload-time = "2024-05-28T17:01:53.112Z" }, +] + +[[package]] +name = "beartype" +version = "0.22.9" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/c7/94/1009e248bbfbab11397abca7193bea6626806be9a327d399810d523a07cb/beartype-0.22.9.tar.gz", hash = "sha256:8f82b54aa723a2848a56008d18875f91c1db02c32ef6a62319a002e3e25a975f", size = 1608866, upload-time = "2025-12-13T06:50:30.72Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/71/cc/18245721fa7747065ab478316c7fea7c74777d07f37ae60db2e84f8172e8/beartype-0.22.9-py3-none-any.whl", hash = 
"sha256:d16c9bbc61ea14637596c5f6fbff2ee99cbe3573e46a716401734ef50c3060c2", size = 1333658, upload-time = "2025-12-13T06:50:28.266Z" }, +] + [[package]] name = "beautifulsoup4" version = "4.14.2" @@ -389,6 +431,35 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/96/c5/1e741d26306c42e2bf6ab740b2202872727e0f606033c9dd713f8b93f5a8/cachetools-6.2.1-py3-none-any.whl", hash = "sha256:09868944b6dde876dfd44e1d47e18484541eaf12f26f29b7af91b26cc892d701", size = 11280, upload-time = "2025-10-12T14:55:28.382Z" }, ] +[[package]] +name = "caio" +version = "0.9.25" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/92/88/b8527e1b00c1811db339a1df8bd1ae49d146fcea9d6a5c40e3a80aaeb38d/caio-0.9.25.tar.gz", hash = "sha256:16498e7f81d1d0f5a4c0ad3f2540e65fe25691376e0a5bd367f558067113ed10", size = 26781, upload-time = "2025-12-26T15:21:36.501Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6a/80/ea4ead0c5d52a9828692e7df20f0eafe8d26e671ce4883a0a146bb91049e/caio-0.9.25-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ca6c8ecda611478b6016cb94d23fd3eb7124852b985bdec7ecaad9f3116b9619", size = 36836, upload-time = "2025-12-26T15:22:04.662Z" }, + { url = "https://files.pythonhosted.org/packages/17/b9/36715c97c873649d1029001578f901b50250916295e3dddf20c865438865/caio-0.9.25-cp310-cp310-manylinux2010_x86_64.manylinux2014_x86_64.manylinux_2_12_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:db9b5681e4af8176159f0d6598e73b2279bb661e718c7ac23342c550bd78c241", size = 79695, upload-time = "2025-12-26T15:22:18.818Z" }, + { url = "https://files.pythonhosted.org/packages/0b/ab/07080ecb1adb55a02cbd8ec0126aa8e43af343ffabb6a71125b42670e9a1/caio-0.9.25-cp310-cp310-manylinux_2_34_aarch64.whl", hash = "sha256:bf61d7d0c4fd10ffdd98ca47f7e8db4d7408e74649ffaf4bef40b029ada3c21b", size = 79457, upload-time = "2026-03-04T22:08:16.024Z" }, + { url = 
"https://files.pythonhosted.org/packages/88/95/dd55757bb671eb4c376e006c04e83beb413486821f517792ea603ef216e9/caio-0.9.25-cp310-cp310-manylinux_2_34_x86_64.whl", hash = "sha256:ab52e5b643f8bbd64a0605d9412796cd3464cb8ca88593b13e95a0f0b10508ae", size = 77705, upload-time = "2026-03-04T22:08:17.202Z" }, + { url = "https://files.pythonhosted.org/packages/ec/90/543f556fcfcfa270713eef906b6352ab048e1e557afec12925c991dc93c2/caio-0.9.25-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d6956d9e4a27021c8bd6c9677f3a59eb1d820cc32d0343cea7961a03b1371965", size = 36839, upload-time = "2025-12-26T15:21:40.267Z" }, + { url = "https://files.pythonhosted.org/packages/51/3b/36f3e8ec38dafe8de4831decd2e44c69303d2a3892d16ceda42afed44e1b/caio-0.9.25-cp311-cp311-manylinux2010_x86_64.manylinux2014_x86_64.manylinux_2_12_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:bf84bfa039f25ad91f4f52944452a5f6f405e8afab4d445450978cd6241d1478", size = 80255, upload-time = "2025-12-26T15:22:20.271Z" }, + { url = "https://files.pythonhosted.org/packages/df/ce/65e64867d928e6aff1b4f0e12dba0ef6d5bf412c240dc1df9d421ac10573/caio-0.9.25-cp311-cp311-manylinux_2_34_aarch64.whl", hash = "sha256:ae3d62587332bce600f861a8de6256b1014d6485cfd25d68c15caf1611dd1f7c", size = 80052, upload-time = "2026-03-04T22:08:20.402Z" }, + { url = "https://files.pythonhosted.org/packages/46/90/e278863c47e14ec58309aa2e38a45882fbe67b4cc29ec9bc8f65852d3e45/caio-0.9.25-cp311-cp311-manylinux_2_34_x86_64.whl", hash = "sha256:fc220b8533dcf0f238a6b1a4a937f92024c71e7b10b5a2dfc1c73604a25709bc", size = 78273, upload-time = "2026-03-04T22:08:21.368Z" }, + { url = "https://files.pythonhosted.org/packages/d3/25/79c98ebe12df31548ba4eaf44db11b7cad6b3e7b4203718335620939083c/caio-0.9.25-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:fb7ff95af4c31ad3f03179149aab61097a71fd85e05f89b4786de0359dffd044", size = 36983, upload-time = "2025-12-26T15:21:36.075Z" }, + { url = 
"https://files.pythonhosted.org/packages/a3/2b/21288691f16d479945968a0a4f2856818c1c5be56881d51d4dac9b255d26/caio-0.9.25-cp312-cp312-manylinux2010_x86_64.manylinux2014_x86_64.manylinux_2_12_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:97084e4e30dfa598449d874c4d8e0c8d5ea17d2f752ef5e48e150ff9d240cd64", size = 82012, upload-time = "2025-12-26T15:22:20.983Z" }, + { url = "https://files.pythonhosted.org/packages/03/c4/8a1b580875303500a9c12b9e0af58cb82e47f5bcf888c2457742a138273c/caio-0.9.25-cp312-cp312-manylinux_2_34_aarch64.whl", hash = "sha256:4fa69eba47e0f041b9d4f336e2ad40740681c43e686b18b191b6c5f4c5544bfb", size = 81502, upload-time = "2026-03-04T22:08:22.381Z" }, + { url = "https://files.pythonhosted.org/packages/d1/1c/0fe770b8ffc8362c48134d1592d653a81a3d8748d764bec33864db36319d/caio-0.9.25-cp312-cp312-manylinux_2_34_x86_64.whl", hash = "sha256:6bebf6f079f1341d19f7386db9b8b1f07e8cc15ae13bfdaff573371ba0575d69", size = 80200, upload-time = "2026-03-04T22:08:23.382Z" }, + { url = "https://files.pythonhosted.org/packages/31/57/5e6ff127e6f62c9f15d989560435c642144aa4210882f9494204bc892305/caio-0.9.25-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:d6c2a3411af97762a2b03840c3cec2f7f728921ff8adda53d7ea2315a8563451", size = 36979, upload-time = "2025-12-26T15:21:35.484Z" }, + { url = "https://files.pythonhosted.org/packages/a3/9f/f21af50e72117eb528c422d4276cbac11fb941b1b812b182e0a9c70d19c5/caio-0.9.25-cp313-cp313-manylinux2010_x86_64.manylinux2014_x86_64.manylinux_2_12_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:0998210a4d5cd5cb565b32ccfe4e53d67303f868a76f212e002a8554692870e6", size = 81900, upload-time = "2025-12-26T15:22:21.919Z" }, + { url = "https://files.pythonhosted.org/packages/9c/12/c39ae2a4037cb10ad5eb3578eb4d5f8c1a2575c62bba675f3406b7ef0824/caio-0.9.25-cp313-cp313-manylinux_2_34_aarch64.whl", hash = "sha256:1a177d4777141b96f175fe2c37a3d96dec7911ed9ad5f02bac38aaa1c936611f", size = 81523, upload-time = "2026-03-04T22:08:25.187Z" }, + { url = 
"https://files.pythonhosted.org/packages/22/59/f8f2e950eb4f1a5a3883e198dca514b9d475415cb6cd7b78b9213a0dd45a/caio-0.9.25-cp313-cp313-manylinux_2_34_x86_64.whl", hash = "sha256:9ed3cfb28c0e99fec5e208c934e5c157d0866aa9c32aa4dc5e9b6034af6286b7", size = 80243, upload-time = "2026-03-04T22:08:26.449Z" }, + { url = "https://files.pythonhosted.org/packages/69/ca/a08fdc7efdcc24e6a6131a93c85be1f204d41c58f474c42b0670af8c016b/caio-0.9.25-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:fab6078b9348e883c80a5e14b382e6ad6aabbc4429ca034e76e730cf464269db", size = 36978, upload-time = "2025-12-26T15:21:41.055Z" }, + { url = "https://files.pythonhosted.org/packages/5e/6c/d4d24f65e690213c097174d26eda6831f45f4734d9d036d81790a27e7b78/caio-0.9.25-cp314-cp314-manylinux2010_x86_64.manylinux2014_x86_64.manylinux_2_12_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:44a6b58e52d488c75cfaa5ecaa404b2b41cc965e6c417e03251e868ecd5b6d77", size = 81832, upload-time = "2025-12-26T15:22:22.757Z" }, + { url = "https://files.pythonhosted.org/packages/87/a4/e534cf7d2d0e8d880e25dd61e8d921ffcfe15bd696734589826f5a2df727/caio-0.9.25-cp314-cp314-manylinux_2_34_aarch64.whl", hash = "sha256:628a630eb7fb22381dd8e3c8ab7f59e854b9c806639811fc3f4310c6bd711d79", size = 81565, upload-time = "2026-03-04T22:08:27.483Z" }, + { url = "https://files.pythonhosted.org/packages/3f/ed/bf81aeac1d290017e5e5ac3e880fd56ee15e50a6d0353986799d1bc5cfd5/caio-0.9.25-cp314-cp314-manylinux_2_34_x86_64.whl", hash = "sha256:0ba16aa605ccb174665357fc729cf500679c2d94d5f1458a6f0d5ca48f2060a7", size = 80071, upload-time = "2026-03-04T22:08:28.751Z" }, + { url = "https://files.pythonhosted.org/packages/86/93/1f76c8d1bafe3b0614e06b2195784a3765bbf7b0a067661af9e2dd47fc33/caio-0.9.25-py3-none-any.whl", hash = "sha256:06c0bb02d6b929119b1cfbe1ca403c768b2013a369e2db46bfa2a5761cf82e40", size = 19087, upload-time = "2025-12-26T15:22:00.221Z" }, +] + [[package]] name = "certifi" version = "2025.10.5" @@ -973,6 +1044,23 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/0d/c3/e90f4a4feae6410f914f8ebac129b9ae7a8c92eb60a638012dde42030a9d/cryptography-46.0.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:6b5063083824e5509fdba180721d55909ffacccc8adbec85268b48439423d78c", size = 3438528, upload-time = "2025-10-15T23:18:26.227Z" }, ] +[[package]] +name = "cyclopts" +version = "4.10.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs", marker = "python_full_version >= '3.10'" }, + { name = "docstring-parser", marker = "python_full_version >= '3.10'" }, + { name = "rich", marker = "python_full_version >= '3.10'" }, + { name = "rich-rst", marker = "python_full_version >= '3.10'" }, + { name = "tomli", marker = "python_full_version == '3.10.*'" }, + { name = "typing-extensions", marker = "python_full_version == '3.10.*'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6c/c4/2ce2ca1451487dc7d59f09334c3fa1182c46cfcf0a2d5f19f9b26d53ac74/cyclopts-4.10.1.tar.gz", hash = "sha256:ad4e4bb90576412d32276b14a76f55d43353753d16217f2c3cd5bdceba7f15a0", size = 166623, upload-time = "2026-03-23T14:43:01.098Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/8a/0b/2261922126b2e50c601fe22d7ff5194e0a4d50e654836260c0665e24d862/cyclopts-4.10.1-py3-none-any.whl", hash = "sha256:35f37257139380a386d9fe4475e1e7c87ca7795765ef4f31abba579fcfcb6ecd", size = 204331, upload-time = "2026-03-23T14:43:02.625Z" }, +] + [[package]] name = "debugpy" version = "1.8.17" @@ -1051,6 +1139,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/12/b3/231ffd4ab1fc9d679809f356cebee130ac7daa00d6d6f3206dd4fd137e9e/distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2", size = 20277, upload-time = "2023-12-24T09:54:30.421Z" }, ] +[[package]] +name = "dnspython" +version = "2.8.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/8c/8b/57666417c0f90f08bcafa776861060426765fdb422eb10212086fb811d26/dnspython-2.8.0.tar.gz", hash = "sha256:181d3c6996452cb1189c4046c61599b84a5a86e099562ffde77d26984ff26d0f", size = 368251, upload-time = "2025-09-07T18:58:00.022Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ba/5a/18ad964b0086c6e62e2e7500f7edc89e3faa45033c71c1893d34eed2b2de/dnspython-2.8.0-py3-none-any.whl", hash = "sha256:01d9bbc4a2d76bf0db7c1f729812ded6d912bd318d3b1cf81d30c0f845dbf3af", size = 331094, upload-time = "2025-09-07T18:57:58.071Z" }, +] + [[package]] name = "docker" version = "7.1.0" @@ -1084,6 +1181,19 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/8f/d7/9322c609343d929e75e7e5e6255e614fcc67572cfd083959cdef3b7aad79/docutils-0.21.2-py3-none-any.whl", hash = "sha256:dafca5b9e384f0e419294eb4d2ff9fa826435bf15f15b7bd45723e8ad76811b2", size = 587408, upload-time = "2024-04-23T18:57:14.835Z" }, ] +[[package]] +name = "email-validator" +version = "2.3.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "dnspython", marker = "python_full_version >= '3.10'" }, + { name = "idna", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f5/22/900cb125c76b7aaa450ce02fd727f452243f2e91a61af068b40adba60ea9/email_validator-2.3.0.tar.gz", hash = "sha256:9fc05c37f2f6cf439ff414f8fc46d917929974a82244c20eb10231ba60c54426", size = 51238, upload-time = "2025-08-26T13:09:06.831Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/de/15/545e2b6cf2e3be84bc1ed85613edd75b8aea69807a71c26f4ca6a9258e82/email_validator-2.3.0-py3-none-any.whl", hash = "sha256:80f13f623413e6b197ae73bb10bf4eb0908faf509ad8362c5edeb0be7fd450b4", size = 35604, upload-time = "2025-08-26T13:09:05.858Z" }, +] + [[package]] name = "eval-type-backport" version = "0.2.2" @@ -1098,7 +1208,7 @@ name = "exceptiongroup" version = "1.3.0" source = { registry = "https://pypi.org/simple" 
} dependencies = [ - { name = "typing-extensions", marker = "python_full_version < '3.11'" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/0b/9f/a65090624ecf468cdca03533906e7c69ed7588582240cfe7cc9e770b50eb/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88", size = 29749, upload-time = "2025-05-10T17:42:51.123Z" } wheels = [ @@ -1185,6 +1295,38 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/cb/a8/20d0723294217e47de6d9e2e40fd4a9d2f7c4b6ef974babd482a59743694/fastjsonschema-2.21.2-py3-none-any.whl", hash = "sha256:1c797122d0a86c5cace2e54bf4e819c36223b552017172f32c5c024a6b77e463", size = 24024, upload-time = "2025-08-14T18:49:34.776Z" }, ] +[[package]] +name = "fastmcp" +version = "3.1.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "authlib", marker = "python_full_version >= '3.10'" }, + { name = "cyclopts", marker = "python_full_version >= '3.10'" }, + { name = "exceptiongroup", marker = "python_full_version >= '3.10'" }, + { name = "httpx", marker = "python_full_version >= '3.10'" }, + { name = "jsonref", marker = "python_full_version >= '3.10'" }, + { name = "jsonschema-path", marker = "python_full_version >= '3.10'" }, + { name = "mcp", marker = "python_full_version >= '3.10'" }, + { name = "openapi-pydantic", marker = "python_full_version >= '3.10'" }, + { name = "opentelemetry-api", marker = "python_full_version >= '3.10'" }, + { name = "packaging", marker = "python_full_version >= '3.10'" }, + { name = "platformdirs", version = "4.5.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" }, + { name = "py-key-value-aio", extra = ["filetree", "keyring", "memory"], marker = "python_full_version >= '3.10'" }, + { name = "pydantic", extra = ["email"], marker = "python_full_version >= '3.10'" }, + { name = "pyperclip", marker = 
"python_full_version >= '3.10'" }, + { name = "python-dotenv", marker = "python_full_version >= '3.10'" }, + { name = "pyyaml", marker = "python_full_version >= '3.10'" }, + { name = "rich", marker = "python_full_version >= '3.10'" }, + { name = "uncalled-for", marker = "python_full_version >= '3.10'" }, + { name = "uvicorn", marker = "python_full_version >= '3.10'" }, + { name = "watchfiles", marker = "python_full_version >= '3.10'" }, + { name = "websockets", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/25/83/c95d3bf717698a693eccb43e137a32939d2549876e884e246028bff6ecce/fastmcp-3.1.1.tar.gz", hash = "sha256:db184b5391a31199323766a3abf3a8bfbb8010479f77eca84c0e554f18655c48", size = 17347644, upload-time = "2026-03-14T19:12:20.235Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/70/ea/570122de7e24f72138d006f799768e14cc1ccf7fcb22b7750b2bd276c711/fastmcp-3.1.1-py3-none-any.whl", hash = "sha256:8132ba069d89f14566b3266919d6d72e2ec23dd45d8944622dca407e9beda7eb", size = 633754, upload-time = "2026-03-14T19:12:22.736Z" }, +] + [[package]] name = "ffmpeg-python" version = "0.2.0" @@ -2119,6 +2261,42 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/d1/b3/8def84f539e7d2289a02f0524b944b15d7c75dab7628bedf1c4f0992029c/isort-5.13.2-py3-none-any.whl", hash = "sha256:8ca5e72a8d85860d5a3fa69b8745237f2939afe12dbf656afbcb47fe72d947a6", size = 92310, upload-time = "2023-12-13T20:37:23.244Z" }, ] +[[package]] +name = "jaraco-classes" +version = "3.4.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "more-itertools", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/06/c0/ed4a27bc5571b99e3cff68f8a9fa5b56ff7df1c2251cc715a652ddd26402/jaraco.classes-3.4.0.tar.gz", hash = "sha256:47a024b51d0239c0dd8c8540c6c7f484be3b8fcf0b2d85c13825780d3b3f3acd", size = 11780, upload-time = "2024-03-31T07:27:36.643Z" } +wheels = [ + 
{ url = "https://files.pythonhosted.org/packages/7f/66/b15ce62552d84bbfcec9a4873ab79d993a1dd4edb922cbfccae192bd5b5f/jaraco.classes-3.4.0-py3-none-any.whl", hash = "sha256:f662826b6bed8cace05e7ff873ce0f9283b5c924470fe664fff1c2f00f581790", size = 6777, upload-time = "2024-03-31T07:27:34.792Z" }, +] + +[[package]] +name = "jaraco-context" +version = "6.1.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "backports-tarfile", marker = "python_full_version >= '3.10' and python_full_version < '3.12'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/af/50/4763cd07e722bb6285316d390a164bc7e479db9d90daa769f22578f698b4/jaraco_context-6.1.2.tar.gz", hash = "sha256:f1a6c9d391e661cc5b8d39861ff077a7dc24dc23833ccee564b234b81c82dfe3", size = 16801, upload-time = "2026-03-20T22:13:33.922Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f2/58/bc8954bda5fcda97bd7c19be11b85f91973d67a706ed4a3aec33e7de22db/jaraco_context-6.1.2-py3-none-any.whl", hash = "sha256:bf8150b79a2d5d91ae48629d8b427a8f7ba0e1097dd6202a9059f29a36379535", size = 7871, upload-time = "2026-03-20T22:13:32.808Z" }, +] + +[[package]] +name = "jaraco-functools" +version = "4.4.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "more-itertools", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/0f/27/056e0638a86749374d6f57d0b0db39f29509cce9313cf91bdc0ac4d91084/jaraco_functools-4.4.0.tar.gz", hash = "sha256:da21933b0417b89515562656547a77b4931f98176eb173644c0d35032a33d6bb", size = 19943, upload-time = "2025-12-21T09:29:43.6Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fd/c4/813bb09f0985cb21e959f21f2464169eca882656849adf727ac7bb7e1767/jaraco_functools-4.4.0-py3-none-any.whl", hash = "sha256:9eec1e36f45c818d9bf307c8948eb03b2b56cd44087b3cdc989abca1f20b9176", size = 10481, upload-time = "2025-12-21T09:29:42.27Z" }, +] + [[package]] name = "jedi" version = 
"0.19.2" @@ -2131,6 +2309,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/c0/5a/9cac0c82afec3d09ccd97c8b6502d48f165f9124db81b4bcb90b4af974ee/jedi-0.19.2-py2.py3-none-any.whl", hash = "sha256:a8ef22bde8490f57fe5c7681a3c83cb58874daf72b4784de3cce5b6ef6edb5b9", size = 1572278, upload-time = "2024-11-11T01:41:40.175Z" }, ] +[[package]] +name = "jeepney" +version = "0.9.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/7b/6f/357efd7602486741aa73ffc0617fb310a29b588ed0fd69c2399acbb85b0c/jeepney-0.9.0.tar.gz", hash = "sha256:cf0e9e845622b81e4a28df94c40345400256ec608d0e55bb8a3feaa9163f5732", size = 106758, upload-time = "2025-02-27T18:51:01.684Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b2/a3/e137168c9c44d18eff0376253da9f1e9234d0239e0ee230d2fee6cea8e55/jeepney-0.9.0-py3-none-any.whl", hash = "sha256:97e5714520c16fc0a45695e5365a2e11b81ea79bba796e26f9f1d178cb182683", size = 49010, upload-time = "2025-02-27T18:51:00.104Z" }, +] + [[package]] name = "jinja2" version = "3.1.6" @@ -2303,6 +2490,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/71/92/5e77f98553e9e75130c78900d000368476aed74276eb8ae8796f65f00918/jsonpointer-3.0.0-py2.py3-none-any.whl", hash = "sha256:13e088adc14fca8b6aa8177c044e12701e6ad4b28ff10e65f2267a90109c9942", size = 7595, upload-time = "2024-06-10T19:24:40.698Z" }, ] +[[package]] +name = "jsonref" +version = "1.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/aa/0d/c1f3277e90ccdb50d33ed5ba1ec5b3f0a242ed8c1b1a85d3afeb68464dca/jsonref-1.1.0.tar.gz", hash = "sha256:32fe8e1d85af0fdefbebce950af85590b22b60f9e95443176adbde4e1ecea552", size = 8814, upload-time = "2023-01-16T16:10:04.455Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0c/ec/e1db9922bceb168197a558a2b8c03a7963f1afe93517ddd3cf99f202f996/jsonref-1.1.0-py3-none-any.whl", hash = 
"sha256:590dc7773df6c21cbf948b5dac07a72a251db28b0238ceecce0a2abfa8ec30a9", size = 9425, upload-time = "2023-01-16T16:10:02.255Z" }, +] + [[package]] name = "jsonschema" version = "4.25.1" @@ -2319,6 +2515,20 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/bf/9c/8c95d856233c1f82500c2450b8c68576b4cf1c871db3afac5c34ff84e6fd/jsonschema-4.25.1-py3-none-any.whl", hash = "sha256:3fba0169e345c7175110351d456342c364814cfcf3b964ba4587f22915230a63", size = 90040, upload-time = "2025-08-18T17:03:48.373Z" }, ] +[[package]] +name = "jsonschema-path" +version = "0.4.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pathable", marker = "python_full_version >= '3.10'" }, + { name = "pyyaml", marker = "python_full_version >= '3.10'" }, + { name = "referencing", version = "0.37.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5b/8a/7e6102f2b8bdc6705a9eb5294f8f6f9ccd3a8420e8e8e19671d1dd773251/jsonschema_path-0.4.5.tar.gz", hash = "sha256:c6cd7d577ae290c7defd4f4029e86fdb248ca1bd41a07557795b3c95e5144918", size = 15113, upload-time = "2026-03-03T09:56:46.87Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/d5/4e96c44f6c1ea3d812cf5391d81a4f5abaa540abf8d04ecd7f66e0ed11df/jsonschema_path-0.4.5-py3-none-any.whl", hash = "sha256:7d77a2c3f3ec569a40efe5c5f942c44c1af2a6f96fe0866794c9ef5b8f87fd65", size = 19368, upload-time = "2026-03-03T09:56:45.39Z" }, +] + [[package]] name = "jsonschema-specifications" version = "2025.9.1" @@ -2416,6 +2626,24 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/b1/dd/ead9d8ea85bf202d90cc513b533f9c363121c7792674f78e0d8a854b63b4/jupyterlab_pygments-0.3.0-py3-none-any.whl", hash = "sha256:841a89020971da1d8693f1a99997aefc5dc424bb1b251fd6322462a1b8842780", size = 15884, upload-time = "2023-11-23T09:26:34.325Z" }, ] +[[package]] +name = "keyring" +version = "25.7.0" +source = { 
registry = "https://pypi.org/simple" } +dependencies = [ + { name = "importlib-metadata", marker = "python_full_version >= '3.10' and python_full_version < '3.12'" }, + { name = "jaraco-classes", marker = "python_full_version >= '3.10'" }, + { name = "jaraco-context", marker = "python_full_version >= '3.10'" }, + { name = "jaraco-functools", marker = "python_full_version >= '3.10'" }, + { name = "jeepney", marker = "python_full_version >= '3.10' and sys_platform == 'linux'" }, + { name = "pywin32-ctypes", marker = "python_full_version >= '3.10' and sys_platform == 'win32'" }, + { name = "secretstorage", marker = "python_full_version >= '3.10' and sys_platform == 'linux'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/43/4b/674af6ef2f97d56f0ab5153bf0bfa28ccb6c3ed4d1babf4305449668807b/keyring-25.7.0.tar.gz", hash = "sha256:fe01bd85eb3f8fb3dd0405defdeac9a5b4f6f0439edbb3149577f244a2e8245b", size = 63516, upload-time = "2025-11-16T16:26:09.482Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/81/db/e655086b7f3a705df045bf0933bdd9c2f79bb3c97bfef1384598bb79a217/keyring-25.7.0-py3-none-any.whl", hash = "sha256:be4a0b195f149690c166e850609a477c532ddbfbaed96a404d4e43f8d5e2689f", size = 39160, upload-time = "2025-11-16T16:26:08.402Z" }, +] + [[package]] name = "langcache" version = "0.11.0" @@ -2657,6 +2885,31 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/27/1a/1f68f9ba0c207934b35b86a8ca3aad8395a3d6dd7921c0686e23853ff5a9/mccabe-0.7.0-py2.py3-none-any.whl", hash = "sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e", size = 7350, upload-time = "2022-01-24T01:14:49.62Z" }, ] +[[package]] +name = "mcp" +version = "1.26.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio", marker = "python_full_version >= '3.10'" }, + { name = "httpx", marker = "python_full_version >= '3.10'" }, + { name = "httpx-sse", marker = "python_full_version >= '3.10'" }, + { name = "jsonschema", 
marker = "python_full_version >= '3.10'" }, + { name = "pydantic", marker = "python_full_version >= '3.10'" }, + { name = "pydantic-settings", version = "2.13.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" }, + { name = "pyjwt", extra = ["crypto"], marker = "python_full_version >= '3.10'" }, + { name = "python-multipart", marker = "python_full_version >= '3.10'" }, + { name = "pywin32", marker = "python_full_version >= '3.10' and sys_platform == 'win32'" }, + { name = "sse-starlette", marker = "python_full_version >= '3.10'" }, + { name = "starlette", marker = "python_full_version >= '3.10'" }, + { name = "typing-extensions", marker = "python_full_version >= '3.10'" }, + { name = "typing-inspection", marker = "python_full_version >= '3.10'" }, + { name = "uvicorn", marker = "python_full_version >= '3.10' and sys_platform != 'emscripten'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/fc/6d/62e76bbb8144d6ed86e202b5edd8a4cb631e7c8130f3f4893c3f90262b10/mcp-1.26.0.tar.gz", hash = "sha256:db6e2ef491eecc1a0d93711a76f28dec2e05999f93afd48795da1c1137142c66", size = 608005, upload-time = "2026-01-24T19:40:32.468Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fd/d9/eaa1f80170d2b7c5ba23f3b59f766f3a0bb41155fbc32a69adfa1adaaef9/mcp-1.26.0-py3-none-any.whl", hash = "sha256:904a21c33c25aa98ddbeb47273033c435e595bbacfdb177f4bd87f6dceebe1ca", size = 233615, upload-time = "2026-01-24T19:40:30.652Z" }, +] + [[package]] name = "mdit-py-plugins" version = "0.4.2" @@ -2777,6 +3030,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/70/3b/f801c69027866ea6e387224551185fedef62ad8e2e71181ec0d9dda905f7/ml_dtypes-0.5.3-cp39-cp39-win_amd64.whl", hash = "sha256:a4f39b9bf6555fab9bfb536cf5fdd1c1c727e8d22312078702e9ff005354b37f", size = 206567, upload-time = "2025-07-29T18:39:18.047Z" }, ] +[[package]] +name = "more-itertools" +version = "10.8.0" +source = { registry = "https://pypi.org/simple" } 
+sdist = { url = "https://files.pythonhosted.org/packages/ea/5d/38b681d3fce7a266dd9ab73c66959406d565b3e85f21d5e66e1181d93721/more_itertools-10.8.0.tar.gz", hash = "sha256:f638ddf8a1a0d134181275fb5d58b086ead7c6a72429ad725c67503f13ba30bd", size = 137431, upload-time = "2025-09-02T15:23:11.018Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a4/8e/469e5a4a2f5855992e425f3cb33804cc07bf18d48f2db061aec61ce50270/more_itertools-10.8.0-py3-none-any.whl", hash = "sha256:52d4362373dcf7c52546bc4af9a86ee7c4579df9a8dc268be0a2f949d376cc9b", size = 69667, upload-time = "2025-09-02T15:23:09.635Z" }, +] + [[package]] name = "mpmath" version = "1.3.0" @@ -3616,6 +3878,31 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/c0/0a/58e9dcd34abe273eaeac3807a8483073767b5609d01bb78ea2f048e515a0/openai-2.6.0-py3-none-any.whl", hash = "sha256:f33fa12070fe347b5787a7861c8dd397786a4a17e1c3186e239338dac7e2e743", size = 1005403, upload-time = "2025-10-20T17:17:22.091Z" }, ] +[[package]] +name = "openapi-pydantic" +version = "0.5.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pydantic", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/02/2e/58d83848dd1a79cb92ed8e63f6ba901ca282c5f09d04af9423ec26c56fd7/openapi_pydantic-0.5.1.tar.gz", hash = "sha256:ff6835af6bde7a459fb93eb93bb92b8749b754fc6e51b2f1590a19dc3005ee0d", size = 60892, upload-time = "2025-01-08T19:29:27.083Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/12/cf/03675d8bd8ecbf4445504d8071adab19f5f993676795708e36402ab38263/openapi_pydantic-0.5.1-py3-none-any.whl", hash = "sha256:a3a09ef4586f5bd760a8df7f43028b60cafb6d9f61de2acba9574766255ab146", size = 96381, upload-time = "2025-01-08T19:29:25.275Z" }, +] + +[[package]] +name = "opentelemetry-api" +version = "1.40.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "importlib-metadata", marker = "python_full_version >= '3.10'" 
}, + { name = "typing-extensions", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/2c/1d/4049a9e8698361cc1a1aa03a6c59e4fa4c71e0c0f94a30f988a6876a2ae6/opentelemetry_api-1.40.0.tar.gz", hash = "sha256:159be641c0b04d11e9ecd576906462773eb97ae1b657730f0ecf64d32071569f", size = 70851, upload-time = "2026-03-04T14:17:21.555Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5f/bf/93795954016c522008da367da292adceed71cca6ee1717e1d64c83089099/opentelemetry_api-1.40.0-py3-none-any.whl", hash = "sha256:82dd69331ae74b06f6a874704be0cfaa49a1650e1537d4a813b86ecef7d0ecf9", size = 68676, upload-time = "2026-03-04T14:17:01.24Z" }, +] + [[package]] name = "orjson" version = "3.11.3" @@ -3733,6 +4020,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/16/32/f8e3c85d1d5250232a5d3477a2a28cc291968ff175caeadaf3cc19ce0e4a/parso-0.8.5-py2.py3-none-any.whl", hash = "sha256:646204b5ee239c396d040b90f9e272e9a8017c630092bf59980beb62fd033887", size = 106668, upload-time = "2025-08-23T15:15:25.663Z" }, ] +[[package]] +name = "pathable" +version = "0.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/72/55/b748445cb4ea6b125626f15379be7c96d1035d4fa3e8fee362fa92298abf/pathable-0.5.0.tar.gz", hash = "sha256:d81938348a1cacb525e7c75166270644782c0fb9c8cecc16be033e71427e0ef1", size = 16655, upload-time = "2026-02-20T08:47:00.748Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/52/96/5a770e5c461462575474468e5af931cff9de036e7c2b4fea23c1c58d2cbe/pathable-0.5.0-py3-none-any.whl", hash = "sha256:646e3d09491a6351a0c82632a09c02cdf70a252e73196b36d8a15ba0a114f0a6", size = 16867, upload-time = "2026-02-20T08:46:59.536Z" }, +] + [[package]] name = "pathspec" version = "0.12.1" @@ -4240,6 +4536,31 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/8e/37/efad0257dc6e593a18957422533ff0f87ede7c9c6ea010a2177d738fb82f/pure_eval-0.2.3-py3-none-any.whl", hash = "sha256:1db8e35b67b3d218d818ae653e27f06c3aa420901fa7b081ca98cbedc874e0d0", size = 11842, upload-time = "2024-07-21T12:58:20.04Z" }, ] +[[package]] +name = "py-key-value-aio" +version = "0.4.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "beartype", marker = "python_full_version >= '3.10'" }, + { name = "typing-extensions", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/04/3c/0397c072a38d4bc580994b42e0c90c5f44f679303489e4376289534735e5/py_key_value_aio-0.4.4.tar.gz", hash = "sha256:e3012e6243ed7cc09bb05457bd4d03b1ba5c2b1ca8700096b3927db79ffbbe55", size = 92300, upload-time = "2026-02-16T21:21:43.245Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/32/69/f1b537ee70b7def42d63124a539ed3026a11a3ffc3086947a1ca6e861868/py_key_value_aio-0.4.4-py3-none-any.whl", hash = "sha256:18e17564ecae61b987f909fc2cd41ee2012c84b4b1dcb8c055cf8b4bc1bf3f5d", size = 152291, upload-time = "2026-02-16T21:21:44.241Z" }, +] + +[package.optional-dependencies] +filetree = [ + { name = "aiofile", marker = "python_full_version >= '3.10'" }, + { name = "anyio", marker = "python_full_version >= '3.10'" }, +] +keyring = [ + { name = "keyring", marker = "python_full_version >= '3.10'" }, +] +memory = [ + { name = "cachetools", marker = "python_full_version >= '3.10'" }, +] + [[package]] name = "pyasn1" version = "0.6.1" @@ -4285,6 +4606,11 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/a1/6b/83661fa77dcefa195ad5f8cd9af3d1a7450fd57cc883ad04d65446ac2029/pydantic-2.12.3-py3-none-any.whl", hash = "sha256:6986454a854bc3bc6e5443e1369e06a3a456af9d339eda45510f517d9ea5c6bf", size = 462431, upload-time = "2025-10-17T15:04:19.346Z" }, ] +[package.optional-dependencies] +email = [ + { name = "email-validator", marker = "python_full_version 
>= '3.10'" }, +] + [[package]] name = "pydantic-core" version = "2.41.4" @@ -4412,6 +4738,44 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/48/f7/925f65d930802e3ea2eb4d5afa4cb8730c8dc0d2cb89a59dc4ed2fcb2d74/pydantic_core-2.41.4-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:c173ddcd86afd2535e2b695217e82191580663a1d1928239f877f5a1649ef39f", size = 2147775, upload-time = "2025-10-14T10:23:45.406Z" }, ] +[[package]] +name = "pydantic-settings" +version = "2.11.0" +source = { registry = "https://pypi.org/simple" } +resolution-markers = [ + "python_full_version < '3.10'", +] +dependencies = [ + { name = "pydantic", marker = "python_full_version < '3.10'" }, + { name = "python-dotenv", marker = "python_full_version < '3.10'" }, + { name = "typing-inspection", marker = "python_full_version < '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/20/c5/dbbc27b814c71676593d1c3f718e6cd7d4f00652cefa24b75f7aa3efb25e/pydantic_settings-2.11.0.tar.gz", hash = "sha256:d0e87a1c7d33593beb7194adb8470fc426e95ba02af83a0f23474a04c9a08180", size = 188394, upload-time = "2025-09-24T14:19:11.764Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/83/d6/887a1ff844e64aa823fb4905978d882a633cfe295c32eacad582b78a7d8b/pydantic_settings-2.11.0-py3-none-any.whl", hash = "sha256:fe2cea3413b9530d10f3a5875adffb17ada5c1e1bab0b2885546d7310415207c", size = 48608, upload-time = "2025-09-24T14:19:10.015Z" }, +] + +[[package]] +name = "pydantic-settings" +version = "2.13.1" +source = { registry = "https://pypi.org/simple" } +resolution-markers = [ + "python_full_version >= '3.14'", + "python_full_version == '3.13.*'", + "python_full_version == '3.12.*'", + "python_full_version == '3.11.*'", + "python_full_version == '3.10.*'", +] +dependencies = [ + { name = "pydantic", marker = "python_full_version >= '3.10'" }, + { name = "python-dotenv", marker = "python_full_version >= '3.10'" }, + { name = "typing-inspection", marker = "python_full_version >= '3.10'" 
}, +] +sdist = { url = "https://files.pythonhosted.org/packages/52/6d/fffca34caecc4a3f97bda81b2098da5e8ab7efc9a66e819074a11955d87e/pydantic_settings-2.13.1.tar.gz", hash = "sha256:b4c11847b15237fb0171e1462bf540e294affb9b86db4d9aa5c01730bdbe4025", size = 223826, upload-time = "2026-02-19T13:45:08.055Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/00/4b/ccc026168948fec4f7555b9164c724cf4125eac006e176541483d2c959be/pydantic_settings-2.13.1-py3-none-any.whl", hash = "sha256:d56fd801823dbeae7f0975e1f8c8e25c258eb75d278ea7abb5d9cebb01b56237", size = 58929, upload-time = "2026-02-19T13:45:06.034Z" }, +] + [[package]] name = "pydata-sphinx-theme" version = "0.15.4" @@ -4440,6 +4804,20 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" }, ] +[[package]] +name = "pyjwt" +version = "2.11.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/5c/5a/b46fa56bf322901eee5b0454a34343cdbdae202cd421775a8ee4e42fd519/pyjwt-2.11.0.tar.gz", hash = "sha256:35f95c1f0fbe5d5ba6e43f00271c275f7a1a4db1dab27bf708073b75318ea623", size = 98019, upload-time = "2026-01-30T19:59:55.694Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6f/01/c26ce75ba460d5cd503da9e13b21a33804d38c2165dec7b716d06b13010c/pyjwt-2.11.0-py3-none-any.whl", hash = "sha256:94a6bde30eb5c8e04fee991062b534071fd1439ef58d2adc9ccb823e7bcd0469", size = 28224, upload-time = "2026-01-30T19:59:54.539Z" }, +] + +[package.optional-dependencies] +crypto = [ + { name = "cryptography", marker = "python_full_version >= '3.10'" }, +] + [[package]] name = "pylint" version = "3.3.9" @@ -4461,6 +4839,15 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/1a/a7/69460c4a6af7575449e615144aa2205b89408dc2969b87bc3df2f262ad0b/pylint-3.3.9-py3-none-any.whl", hash = "sha256:01f9b0462c7730f94786c283f3e52a1fbdf0494bbe0971a78d7277ef46a751e7", size = 523465, upload-time = "2025-10-05T18:41:41.766Z" }, ] +[[package]] +name = "pyperclip" +version = "1.11.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e8/52/d87eba7cb129b81563019d1679026e7a112ef76855d6159d24754dbd2a51/pyperclip-1.11.0.tar.gz", hash = "sha256:244035963e4428530d9e3a6101a1ef97209c6825edab1567beac148ccc1db1b6", size = 12185, upload-time = "2025-09-26T14:40:37.245Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/df/80/fc9d01d5ed37ba4c42ca2b55b4339ae6e200b456be3a1aaddf4a9fa99b8c/pyperclip-1.11.0-py3-none-any.whl", hash = "sha256:299403e9ff44581cb9ba2ffeed69c7aa96a008622ad0c46cb575ca75b5b84273", size = 11063, upload-time = "2025-09-26T14:40:36.069Z" }, +] + [[package]] name = "pytest" version = "8.4.2" @@ -4531,6 +4918,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/5f/ed/539768cf28c661b5b068d66d96a2f155c4971a5d55684a514c1a0e0dec2f/python_dotenv-1.1.1-py3-none-any.whl", hash = "sha256:31f23644fe2602f88ff55e1f5c79ba497e01224ee7737937930c448e4d0e24dc", size = 20556, upload-time = "2025-06-24T04:21:06.073Z" }, ] +[[package]] +name = "python-multipart" +version = "0.0.22" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/94/01/979e98d542a70714b0cb2b6728ed0b7c46792b695e3eaec3e20711271ca3/python_multipart-0.0.22.tar.gz", hash = "sha256:7340bef99a7e0032613f56dc36027b959fd3b30a787ed62d310e951f7c3a3a58", size = 37612, upload-time = "2026-01-25T10:15:56.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1b/d0/397f9626e711ff749a95d96b7af99b9c566a9bb5129b8e4c10fc4d100304/python_multipart-0.0.22-py3-none-any.whl", hash = 
"sha256:2b2cd894c83d21bf49d702499531c7bafd057d730c201782048f7945d82de155", size = 24579, upload-time = "2026-01-25T10:15:54.811Z" }, +] + [[package]] name = "python-ulid" version = "3.1.0" @@ -4574,6 +4970,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/60/22/e0e8d802f124772cec9c75430b01a212f86f9de7546bda715e54140d5aeb/pywin32-311-cp39-cp39-win_arm64.whl", hash = "sha256:62ea666235135fee79bb154e695f3ff67370afefd71bd7fea7512fc70ef31e3d", size = 8778162, upload-time = "2025-07-14T20:13:03.544Z" }, ] +[[package]] +name = "pywin32-ctypes" +version = "0.2.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/85/9f/01a1a99704853cb63f253eea009390c88e7131c67e66a0a02099a8c917cb/pywin32-ctypes-0.2.3.tar.gz", hash = "sha256:d162dc04946d704503b2edc4d55f3dba5c1d539ead017afa00142c38b9885755", size = 29471, upload-time = "2024-08-14T10:15:34.626Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/de/3d/8161f7711c017e01ac9f008dfddd9410dff3674334c233bde66e7ba65bbf/pywin32_ctypes-0.2.3-py3-none-any.whl", hash = "sha256:8a1513379d709975552d202d942d9837758905c8d01eb82b8bcc30918929e7b8", size = 30756, upload-time = "2024-08-14T10:15:33.187Z" }, +] + [[package]] name = "pyyaml" version = "6.0.3" @@ -4816,6 +5221,11 @@ cohere = [ langcache = [ { name = "langcache" }, ] +mcp = [ + { name = "fastmcp", marker = "python_full_version >= '3.10'" }, + { name = "pydantic-settings", version = "2.11.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.10'" }, + { name = "pydantic-settings", version = "2.13.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" }, +] mistralai = [ { name = "mistralai" }, ] @@ -4878,6 +5288,7 @@ requires-dist = [ { name = "boto3", marker = "extra == 'bedrock'", specifier = ">=1.36.0,<2" }, { name = "cohere", marker = "extra == 'all'", specifier = ">=4.44" }, { name = "cohere", marker = "extra == 
'cohere'", specifier = ">=4.44" }, + { name = "fastmcp", marker = "python_full_version >= '3.10' and extra == 'mcp'", specifier = ">=2.0.0" }, { name = "google-cloud-aiplatform", marker = "extra == 'all'", specifier = ">=1.26,<2.0.0" }, { name = "google-cloud-aiplatform", marker = "extra == 'vertexai'", specifier = ">=1.26,<2.0.0" }, { name = "jsonpath-ng", specifier = ">=1.5.0" }, @@ -4896,6 +5307,7 @@ requires-dist = [ { name = "protobuf", marker = "extra == 'all'", specifier = ">=5.28.0,<6.0.0" }, { name = "protobuf", marker = "extra == 'vertexai'", specifier = ">=5.28.0,<6.0.0" }, { name = "pydantic", specifier = ">=2,<3" }, + { name = "pydantic-settings", marker = "extra == 'mcp'", specifier = ">=2.0" }, { name = "python-ulid", specifier = ">=3.0.0" }, { name = "pyyaml", specifier = ">=5.4,<7.0" }, { name = "redis", specifier = ">=5.0,<8.0" }, @@ -4909,7 +5321,7 @@ requires-dist = [ { name = "voyageai", marker = "extra == 'all'", specifier = ">=0.2.2" }, { name = "voyageai", marker = "extra == 'voyageai'", specifier = ">=0.2.2" }, ] -provides-extras = ["mistralai", "openai", "nltk", "cohere", "voyageai", "sentence-transformers", "langcache", "vertexai", "bedrock", "pillow", "sql-redis", "all"] +provides-extras = ["mcp", "mistralai", "openai", "nltk", "cohere", "voyageai", "sentence-transformers", "langcache", "vertexai", "bedrock", "pillow", "sql-redis", "all"] [package.metadata.requires-dev] dev = [ @@ -5128,6 +5540,32 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/3f/51/d4db610ef29373b879047326cbf6fa98b6c1969d6f6dc423279de2b1be2c/requests_toolbelt-1.0.0-py2.py3-none-any.whl", hash = "sha256:cccfdd665f0a24fcf4726e690f65639d272bb0637b9b92dfd91a5568ccf6bd06", size = 54481, upload-time = "2023-05-01T04:11:28.427Z" }, ] +[[package]] +name = "rich" +version = "14.3.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markdown-it-py", marker = "python_full_version >= '3.10'" }, + { name = "pygments", marker = 
"python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b3/c6/f3b320c27991c46f43ee9d856302c70dc2d0fb2dba4842ff739d5f46b393/rich-14.3.3.tar.gz", hash = "sha256:b8daa0b9e4eef54dd8cf7c86c03713f53241884e814f4e2f5fb342fe520f639b", size = 230582, upload-time = "2026-02-19T17:23:12.474Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/14/25/b208c5683343959b670dc001595f2f3737e051da617f66c31f7c4fa93abc/rich-14.3.3-py3-none-any.whl", hash = "sha256:793431c1f8619afa7d3b52b2cdec859562b950ea0d4b6b505397612db8d5362d", size = 310458, upload-time = "2026-02-19T17:23:13.732Z" }, +] + +[[package]] +name = "rich-rst" +version = "1.3.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "docutils", marker = "python_full_version >= '3.10'" }, + { name = "rich", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/bc/6d/a506aaa4a9eaa945ed8ab2b7347859f53593864289853c5d6d62b77246e0/rich_rst-1.3.2.tar.gz", hash = "sha256:a1196fdddf1e364b02ec68a05e8ff8f6914fee10fbca2e6b6735f166bb0da8d4", size = 14936, upload-time = "2025-10-14T16:49:45.332Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/13/2f/b4530fbf948867702d0a3f27de4a6aab1d156f406d72852ab902c4d04de9/rich_rst-1.3.2-py3-none-any.whl", hash = "sha256:a99b4907cbe118cf9d18b0b44de272efa61f15117c61e39ebdc431baf5df722a", size = 12567, upload-time = "2025-10-14T16:49:42.953Z" }, +] + [[package]] name = "rpds-py" version = "0.27.1" @@ -5609,6 +6047,19 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/97/30/2f9a5243008f76dfc5dee9a53dfb939d9b31e16ce4bd4f2e628bfc5d89d2/scipy-1.16.2-cp314-cp314t-win_arm64.whl", hash = "sha256:d2a4472c231328d4de38d5f1f68fdd6d28a615138f842580a8a321b5845cf779", size = 26448374, upload-time = "2025-09-11T17:45:03.45Z" }, ] +[[package]] +name = "secretstorage" +version = "3.5.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name 
= "cryptography", marker = "python_full_version >= '3.10'" }, + { name = "jeepney", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/1c/03/e834bcd866f2f8a49a85eaff47340affa3bfa391ee9912a952a1faa68c7b/secretstorage-3.5.0.tar.gz", hash = "sha256:f04b8e4689cbce351744d5537bf6b1329c6fc68f91fa666f60a380edddcd11be", size = 19884, upload-time = "2025-11-23T19:02:53.191Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/46/f5af3402b579fd5e11573ce652019a67074317e18c1935cc0b4ba9b35552/secretstorage-3.5.0-py3-none-any.whl", hash = "sha256:0ce65888c0725fcb2c5bc0fdb8e5438eece02c523557ea40ce0703c266248137", size = 15554, upload-time = "2025-11-23T19:02:51.545Z" }, +] + [[package]] name = "sentence-transformers" version = "3.4.1" @@ -6004,6 +6455,19 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/8f/a6/21b1e19994296ba4a34bc7abaf4fcb40d7e7787477bdfde58cd843594459/sqlglot-28.6.0-py3-none-any.whl", hash = "sha256:8af76e825dc8456a49f8ce049d69bbfcd116655dda3e53051754789e2edf8eba", size = 575186, upload-time = "2026-01-13T17:39:22.327Z" }, ] +[[package]] +name = "sse-starlette" +version = "3.3.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio", marker = "python_full_version >= '3.10'" }, + { name = "starlette", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5a/9f/c3695c2d2d4ef70072c3a06992850498b01c6bc9be531950813716b426fa/sse_starlette-3.3.2.tar.gz", hash = "sha256:678fca55a1945c734d8472a6cad186a55ab02840b4f6786f5ee8770970579dcd", size = 32326, upload-time = "2026-02-28T11:24:34.36Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/61/28/8cb142d3fe80c4a2d8af54ca0b003f47ce0ba920974e7990fa6e016402d1/sse_starlette-3.3.2-py3-none-any.whl", hash = "sha256:5c3ea3dad425c601236726af2f27689b74494643f57017cafcb6f8c9acfbb862", size = 14270, upload-time = "2026-02-28T11:24:32.984Z" }, +] + 
[[package]] name = "stack-data" version = "0.6.3" @@ -6018,6 +6482,19 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/f1/7b/ce1eafaf1a76852e2ec9b22edecf1daa58175c090266e9f6c64afcd81d91/stack_data-0.6.3-py3-none-any.whl", hash = "sha256:d5558e0c25a4cb0853cddad3d77da9891a08cb85dd9f9f91b9f8cd66e511e695", size = 24521, upload-time = "2023-09-30T13:58:03.53Z" }, ] +[[package]] +name = "starlette" +version = "0.52.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio", marker = "python_full_version >= '3.10'" }, + { name = "typing-extensions", marker = "python_full_version >= '3.10' and python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c4/68/79977123bb7be889ad680d79a40f339082c1978b5cfcf62c2d8d196873ac/starlette-0.52.1.tar.gz", hash = "sha256:834edd1b0a23167694292e94f597773bc3f89f362be6effee198165a35d62933", size = 2653702, upload-time = "2026-01-18T13:34:11.062Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/81/0d/13d1d239a25cbfb19e740db83143e95c772a1fe10202dda4b76792b114dd/starlette-0.52.1-py3-none-any.whl", hash = "sha256:0029d43eb3d273bc4f83a08720b4912ea4b071087a3b48db01b7c839f7954d74", size = 74272, upload-time = "2026-01-18T13:34:09.188Z" }, +] + [[package]] name = "sympy" version = "1.14.0" @@ -6506,6 +6983,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/dc/9b/47798a6c91d8bdb567fe2698fe81e0c6b7cb7ef4d13da4114b41d239f65d/typing_inspection-0.4.2-py3-none-any.whl", hash = "sha256:4ed1cacbdc298c220f1bd249ed5287caa16f34d44ef4e9c3d0cbad5b521545e7", size = 14611, upload-time = "2025-10-01T02:14:40.154Z" }, ] +[[package]] +name = "uncalled-for" +version = "0.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/02/7c/b5b7d8136f872e3f13b0584e576886de0489d7213a12de6bebf29ff6ebfc/uncalled_for-0.2.0.tar.gz", hash = "sha256:b4f8fdbcec328c5a113807d653e041c5094473dd4afa7c34599ace69ccb7e69f", 
size = 49488, upload-time = "2026-02-27T17:40:58.137Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ff/7f/4320d9ce3be404e6310b915c3629fe27bf1e2f438a1a7a3cb0396e32e9a9/uncalled_for-0.2.0-py3-none-any.whl", hash = "sha256:2c0bd338faff5f930918f79e7eb9ff48290df2cb05fcc0b40a7f334e55d4d85f", size = 11351, upload-time = "2026-02-27T17:40:56.804Z" }, +] + [[package]] name = "urllib3" version = "1.26.20" @@ -6534,6 +7020,20 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/96/94/c31f58c7a7f470d5665935262ebd7455c7e4c7782eb525658d3dbf4b9403/urllib3-2.1.0-py3-none-any.whl", hash = "sha256:55901e917a5896a349ff771be919f8bd99aff50b79fe58fec595eb37bbc56bb3", size = 104579, upload-time = "2023-11-13T12:29:42.719Z" }, ] +[[package]] +name = "uvicorn" +version = "0.41.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click", version = "8.3.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" }, + { name = "h11", marker = "python_full_version >= '3.10'" }, + { name = "typing-extensions", marker = "python_full_version == '3.10.*'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/32/ce/eeb58ae4ac36fe09e3842eb02e0eb676bf2c53ae062b98f1b2531673efdd/uvicorn-0.41.0.tar.gz", hash = "sha256:09d11cf7008da33113824ee5a1c6422d89fbc2ff476540d69a34c87fab8b571a", size = 82633, upload-time = "2026-02-16T23:07:24.1Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/83/e4/d04a086285c20886c0daad0e026f250869201013d18f81d9ff5eada73a88/uvicorn-0.41.0-py3-none-any.whl", hash = "sha256:29e35b1d2c36a04b9e180d4007ede3bcb32a85fbdfd6c6aeb3f26839de088187", size = 68783, upload-time = "2026-02-16T23:07:22.357Z" }, +] + [[package]] name = "virtualenv" version = "20.35.3" @@ -6576,6 +7076,125 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/bf/e9/e13785fb2a3c605ea924ce2e54d235e423f6d6e45ddb09574963655ec111/voyageai-0.3.6-py3-none-any.whl", hash = 
"sha256:e282f9cef87eb949e2dd30ffe911689f1068c50b8c3c6e90e97793f2a52c83dd", size = 34465, upload-time = "2025-12-09T01:32:51.32Z" }, ] +[[package]] +name = "watchfiles" +version = "1.1.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio", marker = "python_full_version >= '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c2/c9/8869df9b2a2d6c59d79220a4db37679e74f807c559ffe5265e08b227a210/watchfiles-1.1.1.tar.gz", hash = "sha256:a173cb5c16c4f40ab19cecf48a534c409f7ea983ab8fed0741304a1c0a31b3f2", size = 94440, upload-time = "2025-10-14T15:06:21.08Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a7/1a/206e8cf2dd86fddf939165a57b4df61607a1e0add2785f170a3f616b7d9f/watchfiles-1.1.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:eef58232d32daf2ac67f42dea51a2c80f0d03379075d44a587051e63cc2e368c", size = 407318, upload-time = "2025-10-14T15:04:18.753Z" }, + { url = "https://files.pythonhosted.org/packages/b3/0f/abaf5262b9c496b5dad4ed3c0e799cbecb1f8ea512ecb6ddd46646a9fca3/watchfiles-1.1.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:03fa0f5237118a0c5e496185cafa92878568b652a2e9a9382a5151b1a0380a43", size = 394478, upload-time = "2025-10-14T15:04:20.297Z" }, + { url = "https://files.pythonhosted.org/packages/b1/04/9cc0ba88697b34b755371f5ace8d3a4d9a15719c07bdc7bd13d7d8c6a341/watchfiles-1.1.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8ca65483439f9c791897f7db49202301deb6e15fe9f8fe2fed555bf986d10c31", size = 449894, upload-time = "2025-10-14T15:04:21.527Z" }, + { url = "https://files.pythonhosted.org/packages/d2/9c/eda4615863cd8621e89aed4df680d8c3ec3da6a4cf1da113c17decd87c7f/watchfiles-1.1.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f0ab1c1af0cb38e3f598244c17919fb1a84d1629cc08355b0074b6d7f53138ac", size = 459065, upload-time = "2025-10-14T15:04:22.795Z" }, + { url = 
"https://files.pythonhosted.org/packages/84/13/f28b3f340157d03cbc8197629bc109d1098764abe1e60874622a0be5c112/watchfiles-1.1.1-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3bc570d6c01c206c46deb6e935a260be44f186a2f05179f52f7fcd2be086a94d", size = 488377, upload-time = "2025-10-14T15:04:24.138Z" }, + { url = "https://files.pythonhosted.org/packages/86/93/cfa597fa9389e122488f7ffdbd6db505b3b915ca7435ecd7542e855898c2/watchfiles-1.1.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e84087b432b6ac94778de547e08611266f1f8ffad28c0ee4c82e028b0fc5966d", size = 595837, upload-time = "2025-10-14T15:04:25.057Z" }, + { url = "https://files.pythonhosted.org/packages/57/1e/68c1ed5652b48d89fc24d6af905d88ee4f82fa8bc491e2666004e307ded1/watchfiles-1.1.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:620bae625f4cb18427b1bb1a2d9426dc0dd5a5ba74c7c2cdb9de405f7b129863", size = 473456, upload-time = "2025-10-14T15:04:26.497Z" }, + { url = "https://files.pythonhosted.org/packages/d5/dc/1a680b7458ffa3b14bb64878112aefc8f2e4f73c5af763cbf0bd43100658/watchfiles-1.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:544364b2b51a9b0c7000a4b4b02f90e9423d97fbbf7e06689236443ebcad81ab", size = 455614, upload-time = "2025-10-14T15:04:27.539Z" }, + { url = "https://files.pythonhosted.org/packages/61/a5/3d782a666512e01eaa6541a72ebac1d3aae191ff4a31274a66b8dd85760c/watchfiles-1.1.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:bbe1ef33d45bc71cf21364df962af171f96ecaeca06bd9e3d0b583efb12aec82", size = 630690, upload-time = "2025-10-14T15:04:28.495Z" }, + { url = "https://files.pythonhosted.org/packages/9b/73/bb5f38590e34687b2a9c47a244aa4dd50c56a825969c92c9c5fc7387cea1/watchfiles-1.1.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:1a0bb430adb19ef49389e1ad368450193a90038b5b752f4ac089ec6942c4dff4", size = 622459, upload-time = "2025-10-14T15:04:29.491Z" }, + { url = 
"https://files.pythonhosted.org/packages/f1/ac/c9bb0ec696e07a20bd58af5399aeadaef195fb2c73d26baf55180fe4a942/watchfiles-1.1.1-cp310-cp310-win32.whl", hash = "sha256:3f6d37644155fb5beca5378feb8c1708d5783145f2a0f1c4d5a061a210254844", size = 272663, upload-time = "2025-10-14T15:04:30.435Z" }, + { url = "https://files.pythonhosted.org/packages/11/a0/a60c5a7c2ec59fa062d9a9c61d02e3b6abd94d32aac2d8344c4bdd033326/watchfiles-1.1.1-cp310-cp310-win_amd64.whl", hash = "sha256:a36d8efe0f290835fd0f33da35042a1bb5dc0e83cbc092dcf69bce442579e88e", size = 287453, upload-time = "2025-10-14T15:04:31.53Z" }, + { url = "https://files.pythonhosted.org/packages/1f/f8/2c5f479fb531ce2f0564eda479faecf253d886b1ab3630a39b7bf7362d46/watchfiles-1.1.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:f57b396167a2565a4e8b5e56a5a1c537571733992b226f4f1197d79e94cf0ae5", size = 406529, upload-time = "2025-10-14T15:04:32.899Z" }, + { url = "https://files.pythonhosted.org/packages/fe/cd/f515660b1f32f65df671ddf6f85bfaca621aee177712874dc30a97397977/watchfiles-1.1.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:421e29339983e1bebc281fab40d812742268ad057db4aee8c4d2bce0af43b741", size = 394384, upload-time = "2025-10-14T15:04:33.761Z" }, + { url = "https://files.pythonhosted.org/packages/7b/c3/28b7dc99733eab43fca2d10f55c86e03bd6ab11ca31b802abac26b23d161/watchfiles-1.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6e43d39a741e972bab5d8100b5cdacf69db64e34eb19b6e9af162bccf63c5cc6", size = 448789, upload-time = "2025-10-14T15:04:34.679Z" }, + { url = "https://files.pythonhosted.org/packages/4a/24/33e71113b320030011c8e4316ccca04194bf0cbbaeee207f00cbc7d6b9f5/watchfiles-1.1.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f537afb3276d12814082a2e9b242bdcf416c2e8fd9f799a737990a1dbe906e5b", size = 460521, upload-time = "2025-10-14T15:04:35.963Z" }, + { url = 
"https://files.pythonhosted.org/packages/f4/c3/3c9a55f255aa57b91579ae9e98c88704955fa9dac3e5614fb378291155df/watchfiles-1.1.1-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b2cd9e04277e756a2e2d2543d65d1e2166d6fd4c9b183f8808634fda23f17b14", size = 488722, upload-time = "2025-10-14T15:04:37.091Z" }, + { url = "https://files.pythonhosted.org/packages/49/36/506447b73eb46c120169dc1717fe2eff07c234bb3232a7200b5f5bd816e9/watchfiles-1.1.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5f3f58818dc0b07f7d9aa7fe9eb1037aecb9700e63e1f6acfed13e9fef648f5d", size = 596088, upload-time = "2025-10-14T15:04:38.39Z" }, + { url = "https://files.pythonhosted.org/packages/82/ab/5f39e752a9838ec4d52e9b87c1e80f1ee3ccdbe92e183c15b6577ab9de16/watchfiles-1.1.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9bb9f66367023ae783551042d31b1d7fd422e8289eedd91f26754a66f44d5cff", size = 472923, upload-time = "2025-10-14T15:04:39.666Z" }, + { url = "https://files.pythonhosted.org/packages/af/b9/a419292f05e302dea372fa7e6fda5178a92998411f8581b9830d28fb9edb/watchfiles-1.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aebfd0861a83e6c3d1110b78ad54704486555246e542be3e2bb94195eabb2606", size = 456080, upload-time = "2025-10-14T15:04:40.643Z" }, + { url = "https://files.pythonhosted.org/packages/b0/c3/d5932fd62bde1a30c36e10c409dc5d54506726f08cb3e1d8d0ba5e2bc8db/watchfiles-1.1.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:5fac835b4ab3c6487b5dbad78c4b3724e26bcc468e886f8ba8cc4306f68f6701", size = 629432, upload-time = "2025-10-14T15:04:41.789Z" }, + { url = "https://files.pythonhosted.org/packages/f7/77/16bddd9779fafb795f1a94319dc965209c5641db5bf1edbbccace6d1b3c0/watchfiles-1.1.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:399600947b170270e80134ac854e21b3ccdefa11a9529a3decc1327088180f10", size = 623046, upload-time = "2025-10-14T15:04:42.718Z" }, + { url = 
"https://files.pythonhosted.org/packages/46/ef/f2ecb9a0f342b4bfad13a2787155c6ee7ce792140eac63a34676a2feeef2/watchfiles-1.1.1-cp311-cp311-win32.whl", hash = "sha256:de6da501c883f58ad50db3a32ad397b09ad29865b5f26f64c24d3e3281685849", size = 271473, upload-time = "2025-10-14T15:04:43.624Z" }, + { url = "https://files.pythonhosted.org/packages/94/bc/f42d71125f19731ea435c3948cad148d31a64fccde3867e5ba4edee901f9/watchfiles-1.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:35c53bd62a0b885bf653ebf6b700d1bf05debb78ad9292cf2a942b23513dc4c4", size = 287598, upload-time = "2025-10-14T15:04:44.516Z" }, + { url = "https://files.pythonhosted.org/packages/57/c9/a30f897351f95bbbfb6abcadafbaca711ce1162f4db95fc908c98a9165f3/watchfiles-1.1.1-cp311-cp311-win_arm64.whl", hash = "sha256:57ca5281a8b5e27593cb7d82c2ac927ad88a96ed406aa446f6344e4328208e9e", size = 277210, upload-time = "2025-10-14T15:04:45.883Z" }, + { url = "https://files.pythonhosted.org/packages/74/d5/f039e7e3c639d9b1d09b07ea412a6806d38123f0508e5f9b48a87b0a76cc/watchfiles-1.1.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:8c89f9f2f740a6b7dcc753140dd5e1ab9215966f7a3530d0c0705c83b401bd7d", size = 404745, upload-time = "2025-10-14T15:04:46.731Z" }, + { url = "https://files.pythonhosted.org/packages/a5/96/a881a13aa1349827490dab2d363c8039527060cfcc2c92cc6d13d1b1049e/watchfiles-1.1.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:bd404be08018c37350f0d6e34676bd1e2889990117a2b90070b3007f172d0610", size = 391769, upload-time = "2025-10-14T15:04:48.003Z" }, + { url = "https://files.pythonhosted.org/packages/4b/5b/d3b460364aeb8da471c1989238ea0e56bec24b6042a68046adf3d9ddb01c/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8526e8f916bb5b9a0a777c8317c23ce65de259422bba5b31325a6fa6029d33af", size = 449374, upload-time = "2025-10-14T15:04:49.179Z" }, + { url = 
"https://files.pythonhosted.org/packages/b9/44/5769cb62d4ed055cb17417c0a109a92f007114a4e07f30812a73a4efdb11/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2edc3553362b1c38d9f06242416a5d8e9fe235c204a4072e988ce2e5bb1f69f6", size = 459485, upload-time = "2025-10-14T15:04:50.155Z" }, + { url = "https://files.pythonhosted.org/packages/19/0c/286b6301ded2eccd4ffd0041a1b726afda999926cf720aab63adb68a1e36/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:30f7da3fb3f2844259cba4720c3fc7138eb0f7b659c38f3bfa65084c7fc7abce", size = 488813, upload-time = "2025-10-14T15:04:51.059Z" }, + { url = "https://files.pythonhosted.org/packages/c7/2b/8530ed41112dd4a22f4dcfdb5ccf6a1baad1ff6eed8dc5a5f09e7e8c41c7/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f8979280bdafff686ba5e4d8f97840f929a87ed9cdf133cbbd42f7766774d2aa", size = 594816, upload-time = "2025-10-14T15:04:52.031Z" }, + { url = "https://files.pythonhosted.org/packages/ce/d2/f5f9fb49489f184f18470d4f99f4e862a4b3e9ac2865688eb2099e3d837a/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dcc5c24523771db3a294c77d94771abcfcb82a0e0ee8efd910c37c59ec1b31bb", size = 475186, upload-time = "2025-10-14T15:04:53.064Z" }, + { url = "https://files.pythonhosted.org/packages/cf/68/5707da262a119fb06fbe214d82dd1fe4a6f4af32d2d14de368d0349eb52a/watchfiles-1.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1db5d7ae38ff20153d542460752ff397fcf5c96090c1230803713cf3147a6803", size = 456812, upload-time = "2025-10-14T15:04:55.174Z" }, + { url = "https://files.pythonhosted.org/packages/66/ab/3cbb8756323e8f9b6f9acb9ef4ec26d42b2109bce830cc1f3468df20511d/watchfiles-1.1.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:28475ddbde92df1874b6c5c8aaeb24ad5be47a11f87cde5a28ef3835932e3e94", size = 630196, upload-time = "2025-10-14T15:04:56.22Z" }, + { url = 
"https://files.pythonhosted.org/packages/78/46/7152ec29b8335f80167928944a94955015a345440f524d2dfe63fc2f437b/watchfiles-1.1.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:36193ed342f5b9842edd3532729a2ad55c4160ffcfa3700e0d54be496b70dd43", size = 622657, upload-time = "2025-10-14T15:04:57.521Z" }, + { url = "https://files.pythonhosted.org/packages/0a/bf/95895e78dd75efe9a7f31733607f384b42eb5feb54bd2eb6ed57cc2e94f4/watchfiles-1.1.1-cp312-cp312-win32.whl", hash = "sha256:859e43a1951717cc8de7f4c77674a6d389b106361585951d9e69572823f311d9", size = 272042, upload-time = "2025-10-14T15:04:59.046Z" }, + { url = "https://files.pythonhosted.org/packages/87/0a/90eb755f568de2688cb220171c4191df932232c20946966c27a59c400850/watchfiles-1.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:91d4c9a823a8c987cce8fa2690923b069966dabb196dd8d137ea2cede885fde9", size = 288410, upload-time = "2025-10-14T15:05:00.081Z" }, + { url = "https://files.pythonhosted.org/packages/36/76/f322701530586922fbd6723c4f91ace21364924822a8772c549483abed13/watchfiles-1.1.1-cp312-cp312-win_arm64.whl", hash = "sha256:a625815d4a2bdca61953dbba5a39d60164451ef34c88d751f6c368c3ea73d404", size = 278209, upload-time = "2025-10-14T15:05:01.168Z" }, + { url = "https://files.pythonhosted.org/packages/bb/f4/f750b29225fe77139f7ae5de89d4949f5a99f934c65a1f1c0b248f26f747/watchfiles-1.1.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:130e4876309e8686a5e37dba7d5e9bc77e6ed908266996ca26572437a5271e18", size = 404321, upload-time = "2025-10-14T15:05:02.063Z" }, + { url = "https://files.pythonhosted.org/packages/2b/f9/f07a295cde762644aa4c4bb0f88921d2d141af45e735b965fb2e87858328/watchfiles-1.1.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5f3bde70f157f84ece3765b42b4a52c6ac1a50334903c6eaf765362f6ccca88a", size = 391783, upload-time = "2025-10-14T15:05:03.052Z" }, + { url = 
"https://files.pythonhosted.org/packages/bc/11/fc2502457e0bea39a5c958d86d2cb69e407a4d00b85735ca724bfa6e0d1a/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:14e0b1fe858430fc0251737ef3824c54027bedb8c37c38114488b8e131cf8219", size = 449279, upload-time = "2025-10-14T15:05:04.004Z" }, + { url = "https://files.pythonhosted.org/packages/e3/1f/d66bc15ea0b728df3ed96a539c777acfcad0eb78555ad9efcaa1274688f0/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f27db948078f3823a6bb3b465180db8ebecf26dd5dae6f6180bd87383b6b4428", size = 459405, upload-time = "2025-10-14T15:05:04.942Z" }, + { url = "https://files.pythonhosted.org/packages/be/90/9f4a65c0aec3ccf032703e6db02d89a157462fbb2cf20dd415128251cac0/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:059098c3a429f62fc98e8ec62b982230ef2c8df68c79e826e37b895bc359a9c0", size = 488976, upload-time = "2025-10-14T15:05:05.905Z" }, + { url = "https://files.pythonhosted.org/packages/37/57/ee347af605d867f712be7029bb94c8c071732a4b44792e3176fa3c612d39/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bfb5862016acc9b869bb57284e6cb35fdf8e22fe59f7548858e2f971d045f150", size = 595506, upload-time = "2025-10-14T15:05:06.906Z" }, + { url = "https://files.pythonhosted.org/packages/a8/78/cc5ab0b86c122047f75e8fc471c67a04dee395daf847d3e59381996c8707/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:319b27255aacd9923b8a276bb14d21a5f7ff82564c744235fc5eae58d95422ae", size = 474936, upload-time = "2025-10-14T15:05:07.906Z" }, + { url = "https://files.pythonhosted.org/packages/62/da/def65b170a3815af7bd40a3e7010bf6ab53089ef1b75d05dd5385b87cf08/watchfiles-1.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c755367e51db90e75b19454b680903631d41f9e3607fbd941d296a020c2d752d", size = 456147, upload-time = 
"2025-10-14T15:05:09.138Z" }, + { url = "https://files.pythonhosted.org/packages/57/99/da6573ba71166e82d288d4df0839128004c67d2778d3b566c138695f5c0b/watchfiles-1.1.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c22c776292a23bfc7237a98f791b9ad3144b02116ff10d820829ce62dff46d0b", size = 630007, upload-time = "2025-10-14T15:05:10.117Z" }, + { url = "https://files.pythonhosted.org/packages/a8/51/7439c4dd39511368849eb1e53279cd3454b4a4dbace80bab88feeb83c6b5/watchfiles-1.1.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:3a476189be23c3686bc2f4321dd501cb329c0a0469e77b7b534ee10129ae6374", size = 622280, upload-time = "2025-10-14T15:05:11.146Z" }, + { url = "https://files.pythonhosted.org/packages/95/9c/8ed97d4bba5db6fdcdb2b298d3898f2dd5c20f6b73aee04eabe56c59677e/watchfiles-1.1.1-cp313-cp313-win32.whl", hash = "sha256:bf0a91bfb5574a2f7fc223cf95eeea79abfefa404bf1ea5e339c0c1560ae99a0", size = 272056, upload-time = "2025-10-14T15:05:12.156Z" }, + { url = "https://files.pythonhosted.org/packages/1f/f3/c14e28429f744a260d8ceae18bf58c1d5fa56b50d006a7a9f80e1882cb0d/watchfiles-1.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:52e06553899e11e8074503c8e716d574adeeb7e68913115c4b3653c53f9bae42", size = 288162, upload-time = "2025-10-14T15:05:13.208Z" }, + { url = "https://files.pythonhosted.org/packages/dc/61/fe0e56c40d5cd29523e398d31153218718c5786b5e636d9ae8ae79453d27/watchfiles-1.1.1-cp313-cp313-win_arm64.whl", hash = "sha256:ac3cc5759570cd02662b15fbcd9d917f7ecd47efe0d6b40474eafd246f91ea18", size = 277909, upload-time = "2025-10-14T15:05:14.49Z" }, + { url = "https://files.pythonhosted.org/packages/79/42/e0a7d749626f1e28c7108a99fb9bf524b501bbbeb9b261ceecde644d5a07/watchfiles-1.1.1-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:563b116874a9a7ce6f96f87cd0b94f7faf92d08d0021e837796f0a14318ef8da", size = 403389, upload-time = "2025-10-14T15:05:15.777Z" }, + { url = 
"https://files.pythonhosted.org/packages/15/49/08732f90ce0fbbc13913f9f215c689cfc9ced345fb1bcd8829a50007cc8d/watchfiles-1.1.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3ad9fe1dae4ab4212d8c91e80b832425e24f421703b5a42ef2e4a1e215aff051", size = 389964, upload-time = "2025-10-14T15:05:16.85Z" }, + { url = "https://files.pythonhosted.org/packages/27/0d/7c315d4bd5f2538910491a0393c56bf70d333d51bc5b34bee8e68e8cea19/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ce70f96a46b894b36eba678f153f052967a0d06d5b5a19b336ab0dbbd029f73e", size = 448114, upload-time = "2025-10-14T15:05:17.876Z" }, + { url = "https://files.pythonhosted.org/packages/c3/24/9e096de47a4d11bc4df41e9d1e61776393eac4cb6eb11b3e23315b78b2cc/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:cb467c999c2eff23a6417e58d75e5828716f42ed8289fe6b77a7e5a91036ca70", size = 460264, upload-time = "2025-10-14T15:05:18.962Z" }, + { url = "https://files.pythonhosted.org/packages/cc/0f/e8dea6375f1d3ba5fcb0b3583e2b493e77379834c74fd5a22d66d85d6540/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:836398932192dae4146c8f6f737d74baeac8b70ce14831a239bdb1ca882fc261", size = 487877, upload-time = "2025-10-14T15:05:20.094Z" }, + { url = "https://files.pythonhosted.org/packages/ac/5b/df24cfc6424a12deb41503b64d42fbea6b8cb357ec62ca84a5a3476f654a/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:743185e7372b7bc7c389e1badcc606931a827112fbbd37f14c537320fca08620", size = 595176, upload-time = "2025-10-14T15:05:21.134Z" }, + { url = "https://files.pythonhosted.org/packages/8f/b5/853b6757f7347de4e9b37e8cc3289283fb983cba1ab4d2d7144694871d9c/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:afaeff7696e0ad9f02cbb8f56365ff4686ab205fcf9c4c5b6fdfaaa16549dd04", size = 473577, upload-time = "2025-10-14T15:05:22.306Z" }, + { url = 
"https://files.pythonhosted.org/packages/e1/f7/0a4467be0a56e80447c8529c9fce5b38eab4f513cb3d9bf82e7392a5696b/watchfiles-1.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3f7eb7da0eb23aa2ba036d4f616d46906013a68caf61b7fdbe42fc8b25132e77", size = 455425, upload-time = "2025-10-14T15:05:23.348Z" }, + { url = "https://files.pythonhosted.org/packages/8e/e0/82583485ea00137ddf69bc84a2db88bd92ab4a6e3c405e5fb878ead8d0e7/watchfiles-1.1.1-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:831a62658609f0e5c64178211c942ace999517f5770fe9436be4c2faeba0c0ef", size = 628826, upload-time = "2025-10-14T15:05:24.398Z" }, + { url = "https://files.pythonhosted.org/packages/28/9a/a785356fccf9fae84c0cc90570f11702ae9571036fb25932f1242c82191c/watchfiles-1.1.1-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:f9a2ae5c91cecc9edd47e041a930490c31c3afb1f5e6d71de3dc671bfaca02bf", size = 622208, upload-time = "2025-10-14T15:05:25.45Z" }, + { url = "https://files.pythonhosted.org/packages/c3/f4/0872229324ef69b2c3edec35e84bd57a1289e7d3fe74588048ed8947a323/watchfiles-1.1.1-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:d1715143123baeeaeadec0528bb7441103979a1d5f6fd0e1f915383fea7ea6d5", size = 404315, upload-time = "2025-10-14T15:05:26.501Z" }, + { url = "https://files.pythonhosted.org/packages/7b/22/16d5331eaed1cb107b873f6ae1b69e9ced582fcf0c59a50cd84f403b1c32/watchfiles-1.1.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:39574d6370c4579d7f5d0ad940ce5b20db0e4117444e39b6d8f99db5676c52fd", size = 390869, upload-time = "2025-10-14T15:05:27.649Z" }, + { url = "https://files.pythonhosted.org/packages/b2/7e/5643bfff5acb6539b18483128fdc0ef2cccc94a5b8fbda130c823e8ed636/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7365b92c2e69ee952902e8f70f3ba6360d0d596d9299d55d7d386df84b6941fb", size = 449919, upload-time = "2025-10-14T15:05:28.701Z" }, + { url = 
"https://files.pythonhosted.org/packages/51/2e/c410993ba5025a9f9357c376f48976ef0e1b1aefb73b97a5ae01a5972755/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bfff9740c69c0e4ed32416f013f3c45e2ae42ccedd1167ef2d805c000b6c71a5", size = 460845, upload-time = "2025-10-14T15:05:30.064Z" }, + { url = "https://files.pythonhosted.org/packages/8e/a4/2df3b404469122e8680f0fcd06079317e48db58a2da2950fb45020947734/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b27cf2eb1dda37b2089e3907d8ea92922b673c0c427886d4edc6b94d8dfe5db3", size = 489027, upload-time = "2025-10-14T15:05:31.064Z" }, + { url = "https://files.pythonhosted.org/packages/ea/84/4587ba5b1f267167ee715b7f66e6382cca6938e0a4b870adad93e44747e6/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:526e86aced14a65a5b0ec50827c745597c782ff46b571dbfe46192ab9e0b3c33", size = 595615, upload-time = "2025-10-14T15:05:32.074Z" }, + { url = "https://files.pythonhosted.org/packages/6a/0f/c6988c91d06e93cd0bb3d4a808bcf32375ca1904609835c3031799e3ecae/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:04e78dd0b6352db95507fd8cb46f39d185cf8c74e4cf1e4fbad1d3df96faf510", size = 474836, upload-time = "2025-10-14T15:05:33.209Z" }, + { url = "https://files.pythonhosted.org/packages/b4/36/ded8aebea91919485b7bbabbd14f5f359326cb5ec218cd67074d1e426d74/watchfiles-1.1.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5c85794a4cfa094714fb9c08d4a218375b2b95b8ed1666e8677c349906246c05", size = 455099, upload-time = "2025-10-14T15:05:34.189Z" }, + { url = "https://files.pythonhosted.org/packages/98/e0/8c9bdba88af756a2fce230dd365fab2baf927ba42cd47521ee7498fd5211/watchfiles-1.1.1-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:74d5012b7630714b66be7b7b7a78855ef7ad58e8650c73afc4c076a1f480a8d6", size = 630626, upload-time = "2025-10-14T15:05:35.216Z" }, + { url = 
"https://files.pythonhosted.org/packages/2a/84/a95db05354bf2d19e438520d92a8ca475e578c647f78f53197f5a2f17aaf/watchfiles-1.1.1-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:8fbe85cb3201c7d380d3d0b90e63d520f15d6afe217165d7f98c9c649654db81", size = 622519, upload-time = "2025-10-14T15:05:36.259Z" }, + { url = "https://files.pythonhosted.org/packages/1d/ce/d8acdc8de545de995c339be67711e474c77d643555a9bb74a9334252bd55/watchfiles-1.1.1-cp314-cp314-win32.whl", hash = "sha256:3fa0b59c92278b5a7800d3ee7733da9d096d4aabcfabb9a928918bd276ef9b9b", size = 272078, upload-time = "2025-10-14T15:05:37.63Z" }, + { url = "https://files.pythonhosted.org/packages/c4/c9/a74487f72d0451524be827e8edec251da0cc1fcf111646a511ae752e1a3d/watchfiles-1.1.1-cp314-cp314-win_amd64.whl", hash = "sha256:c2047d0b6cea13b3316bdbafbfa0c4228ae593d995030fda39089d36e64fc03a", size = 287664, upload-time = "2025-10-14T15:05:38.95Z" }, + { url = "https://files.pythonhosted.org/packages/df/b8/8ac000702cdd496cdce998c6f4ee0ca1f15977bba51bdf07d872ebdfc34c/watchfiles-1.1.1-cp314-cp314-win_arm64.whl", hash = "sha256:842178b126593addc05acf6fce960d28bc5fae7afbaa2c6c1b3a7b9460e5be02", size = 277154, upload-time = "2025-10-14T15:05:39.954Z" }, + { url = "https://files.pythonhosted.org/packages/47/a8/e3af2184707c29f0f14b1963c0aace6529f9d1b8582d5b99f31bbf42f59e/watchfiles-1.1.1-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:88863fbbc1a7312972f1c511f202eb30866370ebb8493aef2812b9ff28156a21", size = 403820, upload-time = "2025-10-14T15:05:40.932Z" }, + { url = "https://files.pythonhosted.org/packages/c0/ec/e47e307c2f4bd75f9f9e8afbe3876679b18e1bcec449beca132a1c5ffb2d/watchfiles-1.1.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:55c7475190662e202c08c6c0f4d9e345a29367438cf8e8037f3155e10a88d5a5", size = 390510, upload-time = "2025-10-14T15:05:41.945Z" }, + { url = 
"https://files.pythonhosted.org/packages/d5/a0/ad235642118090f66e7b2f18fd5c42082418404a79205cdfca50b6309c13/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3f53fa183d53a1d7a8852277c92b967ae99c2d4dcee2bfacff8868e6e30b15f7", size = 448408, upload-time = "2025-10-14T15:05:43.385Z" }, + { url = "https://files.pythonhosted.org/packages/df/85/97fa10fd5ff3332ae17e7e40e20784e419e28521549780869f1413742e9d/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6aae418a8b323732fa89721d86f39ec8f092fc2af67f4217a2b07fd3e93c6101", size = 458968, upload-time = "2025-10-14T15:05:44.404Z" }, + { url = "https://files.pythonhosted.org/packages/47/c2/9059c2e8966ea5ce678166617a7f75ecba6164375f3b288e50a40dc6d489/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f096076119da54a6080e8920cbdaac3dbee667eb91dcc5e5b78840b87415bd44", size = 488096, upload-time = "2025-10-14T15:05:45.398Z" }, + { url = "https://files.pythonhosted.org/packages/94/44/d90a9ec8ac309bc26db808a13e7bfc0e4e78b6fc051078a554e132e80160/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:00485f441d183717038ed2e887a7c868154f216877653121068107b227a2f64c", size = 596040, upload-time = "2025-10-14T15:05:46.502Z" }, + { url = "https://files.pythonhosted.org/packages/95/68/4e3479b20ca305cfc561db3ed207a8a1c745ee32bf24f2026a129d0ddb6e/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a55f3e9e493158d7bfdb60a1165035f1cf7d320914e7b7ea83fe22c6023b58fc", size = 473847, upload-time = "2025-10-14T15:05:47.484Z" }, + { url = "https://files.pythonhosted.org/packages/4f/55/2af26693fd15165c4ff7857e38330e1b61ab8c37d15dc79118cdba115b7a/watchfiles-1.1.1-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c91ed27800188c2ae96d16e3149f199d62f86c7af5f5f4d2c61a3ed8cd3666c", size = 455072, upload-time = 
"2025-10-14T15:05:48.928Z" }, + { url = "https://files.pythonhosted.org/packages/66/1d/d0d200b10c9311ec25d2273f8aad8c3ef7cc7ea11808022501811208a750/watchfiles-1.1.1-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:311ff15a0bae3714ffb603e6ba6dbfba4065ab60865d15a6ec544133bdb21099", size = 629104, upload-time = "2025-10-14T15:05:49.908Z" }, + { url = "https://files.pythonhosted.org/packages/e3/bd/fa9bb053192491b3867ba07d2343d9f2252e00811567d30ae8d0f78136fe/watchfiles-1.1.1-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:a916a2932da8f8ab582f242c065f5c81bed3462849ca79ee357dd9551b0e9b01", size = 622112, upload-time = "2025-10-14T15:05:50.941Z" }, + { url = "https://files.pythonhosted.org/packages/a4/68/a7303a15cc797ab04d58f1fea7f67c50bd7f80090dfd7e750e7576e07582/watchfiles-1.1.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:c882d69f6903ef6092bedfb7be973d9319940d56b8427ab9187d1ecd73438a70", size = 409220, upload-time = "2025-10-14T15:05:51.917Z" }, + { url = "https://files.pythonhosted.org/packages/99/b8/d1857ce9ac76034c053fa7ef0e0ef92d8bd031e842ea6f5171725d31e88f/watchfiles-1.1.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:d6ff426a7cb54f310d51bfe83fe9f2bbe40d540c741dc974ebc30e6aa238f52e", size = 396712, upload-time = "2025-10-14T15:05:53.437Z" }, + { url = "https://files.pythonhosted.org/packages/41/7a/da7ada566f48beaa6a30b13335b49d1f6febaf3a5ddbd1d92163a1002cf4/watchfiles-1.1.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:79ff6c6eadf2e3fc0d7786331362e6ef1e51125892c75f1004bd6b52155fb956", size = 451462, upload-time = "2025-10-14T15:05:54.742Z" }, + { url = "https://files.pythonhosted.org/packages/e2/b2/7cb9e0d5445a8d45c4cccd68a590d9e3a453289366b96ff37d1075aaebef/watchfiles-1.1.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:c1f5210f1b8fc91ead1283c6fd89f70e76fb07283ec738056cf34d51e9c1d62c", size = 460811, upload-time = "2025-10-14T15:05:55.743Z" }, + { url = 
"https://files.pythonhosted.org/packages/04/9d/b07d4491dde6db6ea6c680fdec452f4be363d65c82004faf2d853f59b76f/watchfiles-1.1.1-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b9c4702f29ca48e023ffd9b7ff6b822acdf47cb1ff44cb490a3f1d5ec8987e9c", size = 490576, upload-time = "2025-10-14T15:05:56.983Z" }, + { url = "https://files.pythonhosted.org/packages/56/03/e64dcab0a1806157db272a61b7891b062f441a30580a581ae72114259472/watchfiles-1.1.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:acb08650863767cbc58bca4813b92df4d6c648459dcaa3d4155681962b2aa2d3", size = 597726, upload-time = "2025-10-14T15:05:57.986Z" }, + { url = "https://files.pythonhosted.org/packages/5c/8e/a827cf4a8d5f2903a19a934dcf512082eb07675253e154d4cd9367978a58/watchfiles-1.1.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:08af70fd77eee58549cd69c25055dc344f918d992ff626068242259f98d598a2", size = 474900, upload-time = "2025-10-14T15:05:59.378Z" }, + { url = "https://files.pythonhosted.org/packages/dc/a6/94fed0b346b85b22303a12eee5f431006fae6af70d841cac2f4403245533/watchfiles-1.1.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c3631058c37e4a0ec440bf583bc53cdbd13e5661bb6f465bc1d88ee9a0a4d02", size = 457521, upload-time = "2025-10-14T15:06:00.419Z" }, + { url = "https://files.pythonhosted.org/packages/c4/64/bc3331150e8f3c778d48a4615d4b72b3d2d87868635e6c54bbd924946189/watchfiles-1.1.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:cf57a27fb986c6243d2ee78392c503826056ffe0287e8794503b10fb51b881be", size = 632191, upload-time = "2025-10-14T15:06:01.621Z" }, + { url = "https://files.pythonhosted.org/packages/e4/84/f39e19549c2f3ec97225dcb2ceb9a7bb3c5004ed227aad1f321bf0ff2051/watchfiles-1.1.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:d7e7067c98040d646982daa1f37a33d3544138ea155536c2e0e63e07ff8a7e0f", size = 623923, upload-time = "2025-10-14T15:06:02.671Z" }, + { url = 
"https://files.pythonhosted.org/packages/0e/24/0759ae15d9a0c9c5fe946bd4cf45ab9e7bad7cfede2c06dc10f59171b29f/watchfiles-1.1.1-cp39-cp39-win32.whl", hash = "sha256:6c9c9262f454d1c4d8aaa7050121eb4f3aea197360553699520767daebf2180b", size = 274010, upload-time = "2025-10-14T15:06:03.779Z" }, + { url = "https://files.pythonhosted.org/packages/7e/3b/eb26cddd4dfa081e2bf6918be3b2fc05ee3b55c1d21331d5562ee0c6aaad/watchfiles-1.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:74472234c8370669850e1c312490f6026d132ca2d396abfad8830b4f1c096957", size = 289090, upload-time = "2025-10-14T15:06:04.821Z" }, + { url = "https://files.pythonhosted.org/packages/ba/4c/a888c91e2e326872fa4705095d64acd8aa2fb9c1f7b9bd0588f33850516c/watchfiles-1.1.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:17ef139237dfced9da49fb7f2232c86ca9421f666d78c264c7ffca6601d154c3", size = 409611, upload-time = "2025-10-14T15:06:05.809Z" }, + { url = "https://files.pythonhosted.org/packages/1e/c7/5420d1943c8e3ce1a21c0a9330bcf7edafb6aa65d26b21dbb3267c9e8112/watchfiles-1.1.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:672b8adf25b1a0d35c96b5888b7b18699d27d4194bac8beeae75be4b7a3fc9b2", size = 396889, upload-time = "2025-10-14T15:06:07.035Z" }, + { url = "https://files.pythonhosted.org/packages/0c/e5/0072cef3804ce8d3aaddbfe7788aadff6b3d3f98a286fdbee9fd74ca59a7/watchfiles-1.1.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77a13aea58bc2b90173bc69f2a90de8e282648939a00a602e1dc4ee23e26b66d", size = 451616, upload-time = "2025-10-14T15:06:08.072Z" }, + { url = "https://files.pythonhosted.org/packages/83/4e/b87b71cbdfad81ad7e83358b3e447fedd281b880a03d64a760fe0a11fc2e/watchfiles-1.1.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0b495de0bb386df6a12b18335a0285dda90260f51bdb505503c02bcd1ce27a8b", size = 458413, upload-time = "2025-10-14T15:06:09.209Z" }, + { url = 
"https://files.pythonhosted.org/packages/d3/8e/e500f8b0b77be4ff753ac94dc06b33d8f0d839377fee1b78e8c8d8f031bf/watchfiles-1.1.1-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:db476ab59b6765134de1d4fe96a1a9c96ddf091683599be0f26147ea1b2e4b88", size = 408250, upload-time = "2025-10-14T15:06:10.264Z" }, + { url = "https://files.pythonhosted.org/packages/bd/95/615e72cd27b85b61eec764a5ca51bd94d40b5adea5ff47567d9ebc4d275a/watchfiles-1.1.1-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:89eef07eee5e9d1fda06e38822ad167a044153457e6fd997f8a858ab7564a336", size = 396117, upload-time = "2025-10-14T15:06:11.28Z" }, + { url = "https://files.pythonhosted.org/packages/c9/81/e7fe958ce8a7fb5c73cc9fb07f5aeaf755e6aa72498c57d760af760c91f8/watchfiles-1.1.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ce19e06cbda693e9e7686358af9cd6f5d61312ab8b00488bc36f5aabbaf77e24", size = 450493, upload-time = "2025-10-14T15:06:12.321Z" }, + { url = "https://files.pythonhosted.org/packages/6e/d4/ed38dd3b1767193de971e694aa544356e63353c33a85d948166b5ff58b9e/watchfiles-1.1.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3e6f39af2eab0118338902798b5aa6664f46ff66bc0280de76fca67a7f262a49", size = 457546, upload-time = "2025-10-14T15:06:13.372Z" }, + { url = "https://files.pythonhosted.org/packages/00/db/38a2c52fdbbfe2fc7ffaaaaaebc927d52b9f4d5139bba3186c19a7463001/watchfiles-1.1.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:cdab464fee731e0884c35ae3588514a9bcf718d0e2c82169c1c4a85cc19c3c7f", size = 409210, upload-time = "2025-10-14T15:06:14.492Z" }, + { url = "https://files.pythonhosted.org/packages/d1/43/d7e8b71f6c21ff813ee8da1006f89b6c7fff047fb4c8b16ceb5e840599c5/watchfiles-1.1.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:3dbd8cbadd46984f802f6d479b7e3afa86c42d13e8f0f322d669d79722c8ec34", size = 397286, upload-time = "2025-10-14T15:06:16.177Z" }, + { url = 
"https://files.pythonhosted.org/packages/1f/5d/884074a5269317e75bd0b915644b702b89de73e61a8a7446e2b225f45b1f/watchfiles-1.1.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5524298e3827105b61951a29c3512deb9578586abf3a7c5da4a8069df247cccc", size = 451768, upload-time = "2025-10-14T15:06:18.266Z" }, + { url = "https://files.pythonhosted.org/packages/17/71/7ffcaa9b5e8961a25026058058c62ec8f604d2a6e8e1e94bee8a09e1593f/watchfiles-1.1.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4b943d3668d61cfa528eb949577479d3b077fd25fb83c641235437bc0b5bc60e", size = 458561, upload-time = "2025-10-14T15:06:19.323Z" }, +] + [[package]] name = "wcwidth" version = "0.2.14"