diff --git a/.github/workflows/code_checks.yml b/.github/workflows/code_checks.yml index 9eb62d17..4f00a541 100644 --- a/.github/workflows/code_checks.yml +++ b/.github/workflows/code_checks.yml @@ -57,3 +57,4 @@ jobs: virtual-environment: .venv/ ignore-vulns: | GHSA-xm59-rqc7-hhvf + GHSA-7gcm-g887-7qv7 diff --git a/README.md b/README.md index 6b1fcb1d..5f6113a8 100644 --- a/README.md +++ b/README.md @@ -11,20 +11,20 @@ This repository includes several modules, each showcasing a different aspect of **2. Frameworks: OpenAI Agents SDK** Showcases the use of the OpenAI agents SDK to reduce boilerplate and improve readability. -- **[2.1 ReAct Agent for RAG - OpenAI SDK](src/2_frameworks/1_react_rag/README.md)** +- **[2.1 ReAct Agent for RAG - OpenAI SDK](implementations/2_frameworks/1_react_rag/README.md)** Implements the same Reason-and-Act agent using the high-level abstractions provided by the OpenAI Agents SDK. This approach reduces boilerplate and improves readability. The use of langfuse for making the agent less of a black-box is also introduced in this module. -- **[2.2 Multi-agent Setup for Deep Research](src/2_frameworks/2_multi_agent/README.md)** +- **[2.2 Multi-agent Setup for Deep Research](implementations/2_frameworks/2_multi_agent/README.md)** Demo of a multi-agent architecture to improve efficiency on long-context inputs, reduce latency, and reduce LLM costs. Two versions are available- "efficient" and "verbose". For the build days, you should start from the "efficient" version as that provides greater flexibility and is easier to follow. **3. Evals: Automated Evaluation Pipelines** Contains scripts and utilities for evaluating agent performance using LLM-as-a-judge and synthetic data generation. Includes tools for uploading datasets, running evaluations, and integrating with [Langfuse](https://langfuse.com/) for traceability. 
-- **[3.1 LLM-as-a-Judge](src/3_evals/1_llm_judge/README.md)** +- **[3.1 LLM-as-a-Judge](implementations/3_evals/1_llm_judge/README.md)** Automated evaluation pipelines using LLM-as-a-judge with Langfuse integration. -- **[3.2 Evaluation on Synthetic Dataset](src/3_evals/2_synthetic_data/README.md)** +- **[3.2 Evaluation on Synthetic Dataset](implementations/3_evals/2_synthetic_data/README.md)** Showcases the generation of synthetic evaluation data for testing agents. We also provide "basic" no-framework implementations. These are meant to showcase how agents work behind the scene and are excessively verbose in the implementation. You should not use these as the basis for real projects. @@ -32,10 +32,10 @@ We also provide "basic" no-framework implementations. These are meant to showcas **1. Basics: Reason-and-Act RAG** A minimal Reason-and-Act (ReAct) agent for knowledge retrieval, implemented without any agent framework. -- **[1.0 Search Demo](src/1_basics/0_search_demo/README.md)** +- **[1.0 Search Demo](implementations/1_basics/0_search_demo/README.md)** A simple demo showing the capabilities (and limitations) of a knowledgebase search. -- **[1.1 ReAct Agent for RAG](src/1_basics/1_react_rag/README.md)** +- **[1.1 ReAct Agent for RAG](implementations/1_basics/1_react_rag/README.md)** Basic ReAct agent for step-by-step retrieval and answer generation. ## Getting Started @@ -48,7 +48,7 @@ In that case you can verify that the API keys work by running integration tests uv run --env-file .env pytest -sv tests/tool_tests/test_integration.py ``` -## Reference Implementations +## Running the Reference Implementations For "Gradio App" reference implementations, running the script would print out a "public URL" ending in `gradio.live` (might take a few seconds to appear.) To access the gradio app with the full streaming capabilities, copy and paste this `gradio.live` URL into a new browser tab. 
@@ -74,7 +74,7 @@ These warnings can be safely ignored, as they are the result of a bug in the ups Interactive knowledge base demo. Access the gradio interface in your browser to see if your knowledge base meets your expectations. ```bash -uv run --env-file .env gradio src/1_basics/0_search_demo/app.py +uv run --env-file .env gradio implementations/1_basics/0_search_demo/app.py ``` Basic Reason-and-Act Agent- for demo purposes only. @@ -82,8 +82,8 @@ Basic Reason-and-Act Agent- for demo purposes only. As noted above, these are unnecessarily verbose for real applications. ```bash -# uv run --env-file .env src/1_basics/1_react_rag/cli.py -# uv run --env-file .env gradio src/1_basics/1_react_rag/app.py +# uv run --env-file .env implementations/1_basics/1_react_rag/cli.py +# uv run --env-file .env gradio implementations/1_basics/1_react_rag/app.py ``` ### 2. Frameworks @@ -91,21 +91,21 @@ As noted above, these are unnecessarily verbose for real applications. Reason-and-Act Agent without the boilerplate- using the OpenAI Agent SDK. ```bash -uv run --env-file .env src/2_frameworks/1_react_rag/cli.py -uv run --env-file .env gradio src/2_frameworks/1_react_rag/langfuse_gradio.py +uv run --env-file .env implementations/2_frameworks/1_react_rag/cli.py +uv run --env-file .env gradio implementations/2_frameworks/1_react_rag/langfuse_gradio.py ``` Multi-agent examples, also via the OpenAI Agent SDK. ```bash -uv run --env-file .env gradio src/2_frameworks/2_multi_agent/efficient.py +uv run --env-file .env gradio implementations/2_frameworks/2_multi_agent/efficient.py # Verbose option - greater control over the agent flow, but less flexible. -# uv run --env-file .env gradio src/2_frameworks/2_multi_agent/verbose.py +# uv run --env-file .env gradio implementations/2_frameworks/2_multi_agent/verbose.py ``` -Python Code Interpreter demo- using the OpenAI Agent SDK, E2B for secure code sandbox, and LangFuse for observability. 
Refer to [src/2_frameworks/3_code_interpreter/README.md](src/2_frameworks/3_code_interpreter/README.md) for details. +Python Code Interpreter demo- using the OpenAI Agent SDK, E2B for secure code sandbox, and LangFuse for observability. Refer to [implementations/2_frameworks/3_code_interpreter/README.md](implementations/2_frameworks/3_code_interpreter/README.md) for details. -MCP server integration example also via OpenAI Agents SDK with Gradio and Langfuse tracing. Refer to [src/2_frameworks/4_mcp/README.md](src/2_frameworks/4_mcp/README.md) for more details. +MCP server integration example also via OpenAI Agents SDK with Gradio and Langfuse tracing. Refer to [implementations/2_frameworks/4_mcp/README.md](implementations/2_frameworks/4_mcp/README.md) for more details. ### 3. Evals @@ -113,9 +113,9 @@ Synthetic data. ```bash uv run --env-file .env \ --m src.3_evals.2_synthetic_data.synthesize_data \ +-m implementations.3_evals.2_synthetic_data.synthesize_data \ --source_dataset hf://vector-institute/hotpotqa@d997ecf:train \ ---langfuse_dataset_name search-dataset-synthetic-20250609 \ +--langfuse_dataset_name search-dataset-synthetic \ --limit 18 ``` @@ -125,15 +125,15 @@ Quantify embedding diversity of synthetic data # Baseline: "Real" dataset uv run \ --env-file .env \ --m src.3_evals.2_synthetic_data.annotate_diversity \ +-m implementations.3_evals.2_synthetic_data.annotate_diversity \ --langfuse_dataset_name search-dataset \ --run_name cosine_similarity_bge_m3 # Synthetic dataset uv run \ --env-file .env \ --m src.3_evals.2_synthetic_data.annotate_diversity \ ---langfuse_dataset_name search-dataset-synthetic-20250609 \ +-m implementations.3_evals.2_synthetic_data.annotate_diversity \ +--langfuse_dataset_name search-dataset-synthetic \ --run_name cosine_similarity_bge_m3 ``` @@ -142,7 +142,7 @@ Visualize embedding diversity of synthetic data ```bash uv run \ --env-file .env \ -gradio src/3_evals/2_synthetic_data/gradio_visualize_diversity.py +gradio 
implementations/3_evals/2_synthetic_data/gradio_visualize_diversity.py ``` Run LLM-as-a-judge Evaluation on synthetic data @@ -150,8 +150,8 @@ Run LLM-as-a-judge Evaluation on synthetic data ```bash uv run \ --env-file .env \ --m src.3_evals.1_llm_judge.run_eval \ ---langfuse_dataset_name search-dataset-synthetic-20250609 \ +-m implementations.3_evals.1_llm_judge.run_eval \ +--langfuse_dataset_name search-dataset-synthetic \ --run_name enwiki_weaviate \ --limit 18 ``` diff --git a/aieng-agents/.python-version b/aieng-agents/.python-version new file mode 100644 index 00000000..e4fba218 --- /dev/null +++ b/aieng-agents/.python-version @@ -0,0 +1 @@ +3.12 diff --git a/aieng-agents/README.md b/aieng-agents/README.md new file mode 100644 index 00000000..8c305430 --- /dev/null +++ b/aieng-agents/README.md @@ -0,0 +1,353 @@ +# aieng-agents + +A utility library for building AI agent applications with support for knowledge bases, code interpreter, web search, and observability. Built for the Vector Institute Agents Bootcamp +by the AI Engineering team. 
+ +## Features + +### 🤖 Agent Tools + +- **Code Interpreter** - Execute Python code in isolated E2B sandboxes with file upload support +- **Gemini Grounding with Google Search** - Web search capabilities with citation tracking +- **Weaviate Knowledge Base** - Vector database integration for RAG applications +- **News Events** - Fetch structured current events from Wikipedia + +### 📊 Data Processing + +- **PDF to Dataset** - Convert PDF documents to HuggingFace datasets using multimodal OCR +- **Dataset Chunking** - Token-aware text chunking for embedding models +- **Dataset Loading** - Unified interface for loading datasets from multiple sources + +### 🔧 Utilities + +- **Async Client Manager** - Lifecycle management for async clients (OpenAI, Weaviate) +- **Progress Tracking** - Rich progress bars for async operations with rate limiting +- **Gradio Integration** - Message format conversion between Gradio and OpenAI SDK +- **Langfuse Integration** - OpenTelemetry-based observability and tracing +- **Environment Configuration** - Type-safe environment variable management with Pydantic +- **Session Management** - Persistent conversation sessions with SQLite backend + +## Installation + +### Using uv (recommended) + +```bash +uv pip install aieng-agents +``` + +### Using pip + +```bash +pip install aieng-agents +``` + +## Quick Start + +### Environment Setup + +Create a `.env` file with your API keys: + +```env +# Required for most features +OPENAI_API_KEY=your_openai_key +# or +GEMINI_API_KEY=your_gemini_key + +# For Weaviate knowledge base +WEAVIATE_API_KEY=your_weaviate_key +WEAVIATE_HTTP_HOST=your_instance.weaviate.cloud +WEAVIATE_GRPC_HOST=grpc-your_instance.weaviate.cloud + +# For code interpreter (optional) +E2B_API_KEY=your_e2b_key + +# For Langfuse observability (optional) +LANGFUSE_PUBLIC_KEY=pk-lf-xxx +LANGFUSE_SECRET_KEY=sk-lf-xxx + +# For embedding models (optional) +EMBEDDING_API_KEY=your_embedding_key +EMBEDDING_BASE_URL=https://your-embedding-service 
+``` + +### Basic Usage Examples + +#### Using Tools with OpenAI Agents SDK + +```python +from aieng.agents.tools import ( + CodeInterpreter, + AsyncWeaviateKnowledgeBase, + get_weaviate_async_client, +) +from aieng.agents import AsyncClientManager +import agents + +# Initialize client manager +manager = AsyncClientManager() + +# Create an agent with tools +agent = agents.Agent( + name="Research Assistant", + instructions="Help users with code and research questions.", + tools=[ + agents.function_tool(manager.knowledgebase.search_knowledgebase), + agents.function_tool(CodeInterpreter().run_code), + ], + model=agents.OpenAIChatCompletionsModel( + model="gpt-4o", + openai_client=manager.openai_client, + ), +) + +# Run the agent +response = await agents.Runner.run( + agent, + input="Search for information about transformers and create a visualization." +) + +# Clean up +await manager.close() +``` + +#### Using the Code Interpreter + +```python +from aieng.agents.tools import CodeInterpreter + +interpreter = CodeInterpreter( + template=" agents.SQLiteSession: """Get existing session or create a new one for conversation persistence.""" diff --git a/src/utils/async_utils.py b/aieng-agents/aieng/agents/async_utils.py similarity index 51% rename from src/utils/async_utils.py rename to aieng-agents/aieng/agents/async_utils.py index 85150833..7c653719 100644 --- a/src/utils/async_utils.py +++ b/aieng-agents/aieng/agents/async_utils.py @@ -1,8 +1,9 @@ """Utils for async workflows.""" import asyncio +import atexit import types -from typing import Any, Awaitable, Callable, Coroutine, Sequence, TypeVar +from typing import Any, Awaitable, Callable, Coroutine, Protocol, Sequence, TypeVar from rich.progress import ( BarColumn, @@ -16,6 +17,60 @@ T = TypeVar("T") +class AsyncCloseable(Protocol): + """Protocol for objects with an async close method.""" + + async def close(self) -> None: + """Close the resource asynchronously.""" + ... 
+ + +def register_async_cleanup(*resources: AsyncCloseable) -> None: + """Register async resources for cleanup at exit. + + Safely handles cleanup whether or not an event loop is running, + making it suitable for Gradio apps and other async frameworks. + + Parameters + ---------- + *resources : AsyncCloseable + One or more objects with an async `close()` method to clean up at exit. + + Examples + -------- + >>> client_manager = AsyncClientManager() + >>> register_async_cleanup(client_manager) + >>> # Resources will be closed when the program exits + """ + + def cleanup() -> None: + """Cleanup function that safely closes async resources.""" + try: + # Try to get the current running event loop + loop = asyncio.get_running_loop() + except RuntimeError: + # No running loop, safe to create a new one with asyncio.run() + async def close_all() -> None: + await asyncio.gather( + *(resource.close() for resource in resources), + return_exceptions=True, + ) + + asyncio.run(close_all()) + else: + # There's a running loop, schedule the cleanup as a task + # This will execute after the current event loop iteration completes + async def close_all() -> None: + await asyncio.gather( + *(resource.close() for resource in resources), + return_exceptions=True, + ) + + loop.create_task(close_all()) + + atexit.register(cleanup) + + async def indexed(index: int, coro: Coroutine[None, None, T]) -> tuple[int, T]: """Return (index, await coro).""" return index, (await coro) @@ -69,3 +124,6 @@ async def gather_with_progress( # At this point, every slot in `results` is guaranteed to be non‐None # so we can safely cast it back to List[T] return results # type: ignore + + +__all__ = ["gather_with_progress", "rate_limited", "register_async_cleanup"] diff --git a/src/utils/client_manager.py b/aieng-agents/aieng/agents/client_manager.py similarity index 95% rename from src/utils/client_manager.py rename to aieng-agents/aieng/agents/client_manager.py index b2a5b27f..916e1a83 100644 --- 
a/src/utils/client_manager.py +++ b/aieng-agents/aieng/agents/client_manager.py @@ -5,12 +5,14 @@ hot-reload process. """ +from aieng.agents.env_vars import Configs +from aieng.agents.tools.weaviate_kb import ( + AsyncWeaviateKnowledgeBase, + get_weaviate_async_client, +) from openai import AsyncOpenAI from weaviate.client import WeaviateAsyncClient -from .env_vars import Configs -from .tools.kb_weaviate import AsyncWeaviateKnowledgeBase, get_weaviate_async_client - class AsyncClientManager: """Manages async client lifecycle with lazy initialization and cleanup. diff --git a/aieng-agents/aieng/agents/data/__init__.py b/aieng-agents/aieng/agents/data/__init__.py new file mode 100644 index 00000000..1bf807ab --- /dev/null +++ b/aieng-agents/aieng/agents/data/__init__.py @@ -0,0 +1,7 @@ +"""Utilities for handling data.""" + +from aieng.agents.data.batching import create_batches +from aieng.agents.data.load_dataset import get_dataset, get_dataset_url_hash + + +__all__ = ["create_batches", "get_dataset", "get_dataset_url_hash"] diff --git a/src/utils/data/batching.py b/aieng-agents/aieng/agents/data/batching.py similarity index 100% rename from src/utils/data/batching.py rename to aieng-agents/aieng/agents/data/batching.py diff --git a/src/utils/data/chunk_hf_dataset.py b/aieng-agents/aieng/agents/data/chunk_hf_dataset.py similarity index 98% rename from src/utils/data/chunk_hf_dataset.py rename to aieng-agents/aieng/agents/data/chunk_hf_dataset.py index 3e0279a7..19f0737b 100644 --- a/src/utils/data/chunk_hf_dataset.py +++ b/aieng-agents/aieng/agents/data/chunk_hf_dataset.py @@ -1,4 +1,4 @@ -"""Script to chunk text data from a HuggingFace dataset.""" +"""Script for chunking text data from a HuggingFace dataset.""" import os from collections import defaultdict diff --git a/src/utils/data/load_dataset.py b/aieng-agents/aieng/agents/data/load_dataset.py similarity index 100% rename from src/utils/data/load_dataset.py rename to 
aieng-agents/aieng/agents/data/load_dataset.py diff --git a/src/utils/data/pdf_to_hf_dataset.py b/aieng-agents/aieng/agents/data/pdf_to_hf_dataset.py similarity index 99% rename from src/utils/data/pdf_to_hf_dataset.py rename to aieng-agents/aieng/agents/data/pdf_to_hf_dataset.py index 931bcaa9..68c683c5 100644 --- a/src/utils/data/pdf_to_hf_dataset.py +++ b/aieng-agents/aieng/agents/data/pdf_to_hf_dataset.py @@ -13,11 +13,10 @@ Examples -------- Transcribe a single PDF and save to ``hf_dataset``: - uv run --env-file .env src/utils/data/pdf_to_hf_dataset.py \ - --input-path ./docs/example.pdf + uv run --env-file .env pdf_to_hf_dataset.py --input-path ./docs/example.pdf Transcribe a folder recursively with a smaller DPI and a custom output: - uv run --env-file .env src/utils/data/pdf_to_hf_dataset.py \ + uv run --env-file .env pdf_to_hf_dataset.py \ --input-path ./docs --recursive --dpi 150 --output-dir ./out_dataset Notes @@ -789,6 +788,8 @@ def main( hub_repo_id: str | None, ) -> None: """Convert PDFs to a chunked HuggingFace dataset.""" + load_dotenv() + if chunk_overlap >= chunk_size: raise ValueError("chunk_overlap must be smaller than chunk_size.") @@ -847,6 +848,4 @@ def main( if __name__ == "__main__": - load_dotenv() - main() diff --git a/src/utils/env_vars.py b/aieng-agents/aieng/agents/env_vars.py similarity index 98% rename from src/utils/env_vars.py rename to aieng-agents/aieng/agents/env_vars.py index edc78582..61686990 100644 --- a/src/utils/env_vars.py +++ b/aieng-agents/aieng/agents/env_vars.py @@ -67,7 +67,7 @@ class Configs(BaseSettings): Examples -------- - >>> from src.utils.env_vars import Configs + >>> from aieng.agents.env_vars import Configs >>> config = Configs() >>> print(config.default_planner_model) 'gemini-2.5-pro' diff --git a/aieng-agents/aieng/agents/gradio/__init__.py b/aieng-agents/aieng/agents/gradio/__init__.py new file mode 100644 index 00000000..d79a83f3 --- /dev/null +++ b/aieng-agents/aieng/agents/gradio/__init__.py
@@ -0,0 +1,23 @@ +"""Utilities for managing Gradio interface.""" + +import gradio as gr + + +def get_common_gradio_config() -> dict[str, gr.Component]: + """Get common Gradio components for agent demos. + + This includes a chatbot for displaying messages, a textbox for user input, + and a hidden state component for maintaining session state across turns. + + Returns + ------- + dict[str, gr.Component] + A dictionary containing the common Gradio components. + """ + return { + "chatbot": gr.Chatbot(height=600), + "textbox": gr.Textbox(lines=1, placeholder="Enter your prompt"), + # Additional input to maintain session state across multiple turns + # NOTE: Examples must be a list of lists when additional inputs are provided + "additional_inputs": gr.State(value={}, render=False), + } diff --git a/src/utils/gradio/messages.py b/aieng-agents/aieng/agents/gradio/messages.py similarity index 98% rename from src/utils/gradio/messages.py rename to aieng-agents/aieng/agents/gradio/messages.py index 096217fe..5a049dab 100644 --- a/src/utils/gradio/messages.py +++ b/aieng-agents/aieng/agents/gradio/messages.py @@ -6,19 +6,27 @@ from typing import TYPE_CHECKING import gradio as gr -from agents import StreamEvent, stream_events -from agents.items import MessageOutputItem, RunItem, ToolCallItem, ToolCallOutputItem from gradio.components.chatbot import ChatMessage, MetadataDict from openai.types.responses import ResponseFunctionToolCall, ResponseOutputText from openai.types.responses.response_completed_event import ResponseCompletedEvent from openai.types.responses.response_output_message import ResponseOutputMessage from PIL import Image +from agents import StreamEvent, stream_events +from agents.items import MessageOutputItem, RunItem, ToolCallItem, ToolCallOutputItem + if TYPE_CHECKING: from openai.types.chat import ChatCompletionMessageParam +__all__ = [ + "gradio_messages_to_oai_chat", + "oai_agent_items_to_gradio_messages", + "oai_agent_stream_to_gradio_messages", +] + + def 
gradio_messages_to_oai_chat( messages: list[ChatMessage | dict], ) -> list["ChatCompletionMessageParam"]: @@ -169,7 +177,6 @@ def oai_agent_stream_to_gradio_messages( if isinstance(stream_event, stream_events.RawResponsesStreamEvent): data = stream_event.data if isinstance(data, ResponseCompletedEvent): - print(stream_event) # The completed event may contain multiple output messages, # including tool calls and final outputs. # If there is at least one tool call, we mark the response as a thought. @@ -210,7 +217,6 @@ def oai_agent_stream_to_gradio_messages( item = stream_event.item if name == "tool_output" and isinstance(item, ToolCallOutputItem): - print(stream_event) text_content, images = _process_tool_output_for_images(item.output) output.append( diff --git a/aieng-agents/aieng/agents/langfuse/__init__.py b/aieng-agents/aieng/agents/langfuse/__init__.py new file mode 100644 index 00000000..daea8375 --- /dev/null +++ b/aieng-agents/aieng/agents/langfuse/__init__.py @@ -0,0 +1,13 @@ +"""Utilities for Langfuse integration.""" + +from aieng.agents.langfuse.oai_sdk_setup import setup_langfuse_tracer +from aieng.agents.langfuse.otlp_env_setup import set_up_langfuse_otlp_env_vars +from aieng.agents.langfuse.shared_client import flush_langfuse, langfuse_client + + +__all__ = [ + "flush_langfuse", + "langfuse_client", + "set_up_langfuse_otlp_env_vars", + "setup_langfuse_tracer", +] diff --git a/src/utils/langfuse/oai_sdk_setup.py b/aieng-agents/aieng/agents/langfuse/oai_sdk_setup.py similarity index 94% rename from src/utils/langfuse/oai_sdk_setup.py rename to aieng-agents/aieng/agents/langfuse/oai_sdk_setup.py index 8432cc35..53c5c572 100644 --- a/src/utils/langfuse/oai_sdk_setup.py +++ b/aieng-agents/aieng/agents/langfuse/oai_sdk_setup.py @@ -6,13 +6,12 @@ import logfire import nest_asyncio +from aieng.agents.langfuse.otlp_env_setup import set_up_langfuse_otlp_env_vars from opentelemetry import trace from opentelemetry.exporter.otlp.proto.http.trace_exporter import 
OTLPSpanExporter from opentelemetry.sdk.trace import TracerProvider from opentelemetry.sdk.trace.export import SimpleSpanProcessor -from .otlp_env_setup import set_up_langfuse_otlp_env_vars - def configure_oai_agents_sdk(service_name: str) -> None: """Register Langfuse as tracing provider for OAI Agents SDK.""" diff --git a/src/utils/langfuse/otlp_env_setup.py b/aieng-agents/aieng/agents/langfuse/otlp_env_setup.py similarity index 89% rename from src/utils/langfuse/otlp_env_setup.py rename to aieng-agents/aieng/agents/langfuse/otlp_env_setup.py index 68d828ff..e1346ea5 100644 --- a/src/utils/langfuse/otlp_env_setup.py +++ b/aieng-agents/aieng/agents/langfuse/otlp_env_setup.py @@ -4,10 +4,10 @@ import logging import os -from ..env_vars import Configs +from aieng.agents.env_vars import Configs -def set_up_langfuse_otlp_env_vars(): +def set_up_langfuse_otlp_env_vars() -> None: """Set up environment variables for Langfuse OpenTelemetry integration. OTLP = OpenTelemetry Protocol. diff --git a/src/utils/langfuse/shared_client.py b/aieng-agents/aieng/agents/langfuse/shared_client.py similarity index 77% rename from src/utils/langfuse/shared_client.py rename to aieng-agents/aieng/agents/langfuse/shared_client.py index bc2c16ac..55e15062 100644 --- a/src/utils/langfuse/shared_client.py +++ b/aieng-agents/aieng/agents/langfuse/shared_client.py @@ -1,24 +1,20 @@ """Shared instance of langfuse client.""" -from os import getenv - +from aieng.agents.env_vars import Configs from langfuse import Langfuse from rich.progress import Progress, SpinnerColumn, TextColumn -from ..env_vars import Configs - -__all__ = ["langfuse_client"] +__all__ = ["flush_langfuse", "langfuse_client"] config = Configs() -assert getenv("LANGFUSE_PUBLIC_KEY") is not None langfuse_client = Langfuse( public_key=config.langfuse_public_key, secret_key=config.langfuse_secret_key ) -def flush_langfuse(client: "Langfuse | None" = None): +def flush_langfuse(client: "Langfuse | None" = None) -> None: """Flush shared 
LangFuse Client. Rich Progress included.""" if client is None: client = langfuse_client diff --git a/src/utils/logging.py b/aieng-agents/aieng/agents/logging.py similarity index 96% rename from src/utils/logging.py rename to aieng-agents/aieng/agents/logging.py index a0264e23..3b7d37dd 100644 --- a/src/utils/logging.py +++ b/aieng-agents/aieng/agents/logging.py @@ -4,6 +4,9 @@ import warnings +__all__ = ["set_up_logging"] + + class IgnoreOpenAI401Filter(logging.Filter): """ A logging filter that excludes specific OpenAI client error messages. diff --git a/src/utils/pretty_printing.py b/aieng-agents/aieng/agents/pretty_printing.py similarity index 93% rename from src/utils/pretty_printing.py rename to aieng-agents/aieng/agents/pretty_printing.py index fdb55571..e8320a13 100644 --- a/src/utils/pretty_printing.py +++ b/aieng-agents/aieng/agents/pretty_printing.py @@ -6,6 +6,9 @@ import pydantic +__all__ = ["pretty_print"] + + def _serializer(item: Any) -> dict[str, Any] | str: """Serialize using heuristics.""" if isinstance(item, pydantic.BaseModel): diff --git a/aieng-agents/aieng/agents/prompts.py b/aieng-agents/aieng/agents/prompts.py new file mode 100644 index 00000000..7d0cb56d --- /dev/null +++ b/aieng-agents/aieng/agents/prompts.py @@ -0,0 +1,123 @@ +"""Centralized location for all system prompts.""" + +REACT_INSTRUCTIONS = """\ +Answer the question using the search tool. \ +EACH TIME before invoking the function, you must explain your reasons for doing so. \ +Be sure to mention the sources in your response. \ +If the search tool did not return intended results, try again. \ +For best performance, divide complex queries into simpler sub-queries. \ +Do not make up information. \ +For facts that might change over time, you must use the search tool to retrieve the \ +most up-to-date information. +""" + +CODE_INTERPRETER_INSTRUCTIONS = """\ +The `code_interpreter` tool executes Python commands. \ +Please note that data is not persisted. 
Each time you invoke this tool, \ +you will need to re-run imports and redefine all variables from scratch. + +You can access the local filesystem using this tool. \ +Instead of asking the user for file inputs, you should try to find the file \ +using this tool. + +Recommended packages: Pandas, Numpy, SymPy, Scikit-learn, Matplotlib, Seaborn. + +Use Matplotlib to create visualizations. Make sure to call `plt.show()` so that +the plot is captured and returned to the user. + +You can also run Jupyter-style shell commands (e.g., `!pip freeze`) +but you won't be able to install packages. +""" + +SEARCH_AGENT_INSTRUCTIONS = """\ +You are a search agent. You receive a single search query as input. \ +Use the search tool to perform a search, then produce a concise \ +'search summary' of the key findings. \ +For every fact you include in the summary, ALWAYS include a citation \ +both in-line and at the end of the summary as a numbered list. The \ +citation at the end should include relevant metadata from the search \ +results. Do NOT return raw search results. +""" + +WIKI_SEARCH_PLANNER_INSTRUCTIONS = """\ +You are a research planner. \ +Given a user's query, produce a list of search terms that can be used to retrieve +relevant information from a knowledge base to answer the question. \ +As you are not able to clarify from the user what they are looking for, \ +your search terms should be broad and cover various aspects of the query. \ +Output up to 10 search terms to query the knowledge base. \ +Note that the knowledge base is a Wikipedia dump and cuts off at May 2025. +""" + +KB_RESEARCHER_INSTRUCTIONS = """\ +You are a research assistant with access to a knowledge base. \ +Given a potentially broad search term, your task is to use the search tool to \ +retrieve relevant information from the knowledge base and produce a short \ +summary of at most 300 words.
You must pass the initial search term directly to \ +the search tool without any modifications and, only if necessary, refine your \ +search based on the results you get back. Your summary must be based solely on \ +a synthesis of all the search results and should not include any information that \ +is not present in the search results. For every fact you include in the summary, \ +ALWAYS include a citation both in-line and at the end of the summary as a numbered \ +list. The citation at the end should include relevant metadata from the search \ +results. Do NOT return raw search results. +""" + +WRITER_INSTRUCTIONS = """\ +You are an expert at synthesizing information and writing coherent reports. \ +Given a user's query and a set of search summaries, synthesize these into a \ +coherent report that answers the user's question. The length of the report should be \ +proportional to the complexity of the question. For queries that are more complex, \ +ensure that the report is well-structured, with clear sections and headings where \ +appropriate. Make sure to use the citations from the search summaries to back up \ +any factual claims you make. \ +Do not make up any information outside of the search summaries. +""" + +WIKI_AND_WEB_ORCHESTRATOR_INSTRUCTIONS = """\ +You are a deep research agent and your goal is to conduct in-depth, multi-turn +research by breaking down complex queries, using the provided tools, and +synthesizing the information into a comprehensive report. + +You have access to the following tools: +1. 'search_knowledgebase' - use this tool to search for information in a + knowledge base. The knowledge base reflects a subset of Wikipedia as + of May 2025. +2. 'get_web_search_grounded_response' - use this tool for current events, + news, fact-checking, or when the information in the knowledge base is + not sufficient to answer the question. + +Neither tool will return raw search results or the sources themselves.
+Instead, they will return a concise summary of the key findings, along +with the sources used to generate the summary. + +For best performance, divide complex queries into simpler sub-queries. +Before calling either tool, always explain your reasoning for doing so. + +Note that the 'get_web_search_grounded_response' tool will expand the query +into multiple search queries and execute them. It will also return the +queries it executed. Do not repeat them. + +**Routing Guidelines:** +- When answering a question, you should first try to use the 'search_knowledgebase' +tool, unless the question requires recent information after May 2025 or +has explicit recency cues. +- If either tool returns insufficient information for a given query, try +reformulating or using the other tool. You can call either tool multiple +times to get the information you need to answer the user's question. + +**Guidelines for synthesis:** +- After collecting results, write the final answer from your own synthesis. +- Add a "Sources" section listing unique sources, formatted as: + [1] Publisher - URL + [2] Wikipedia: <article title> (Section: <section name>
) +Order by first mention in your text. Every factual sentence in your final +response must map to at least one source. +- If web and knowledge base disagree, surface the disagreement and prefer sources +with newer publication dates. +- Do not invent URLs or sources. +- If both tools fail, say so and suggest 2–3 refined queries. + +Be sure to mention the sources in your response, including the URL if available, +and do not make up information. +""" diff --git a/src/utils/web_search/__init__.py b/aieng-agents/aieng/agents/py.typed similarity index 100% rename from src/utils/web_search/__init__.py rename to aieng-agents/aieng/agents/py.typed diff --git a/src/utils/tools/README.md b/aieng-agents/aieng/agents/tools/README.md similarity index 67% rename from src/utils/tools/README.md rename to aieng-agents/aieng/agents/tools/README.md index bc4816e6..6b7805d7 100644 --- a/src/utils/tools/README.md +++ b/aieng-agents/aieng/agents/tools/README.md @@ -4,5 +4,5 @@ This module contains various tools for LLM agents. 
```bash
 # Tool for getting a list of recent news headlines from enwiki
-uv run --env-file .env python3 src/utils/tools/news_events.py
+uv run --env-file .env python3 aieng-agents/aieng/agents/tools/news_events.py
 ```
diff --git a/aieng-agents/aieng/agents/tools/__init__.py b/aieng-agents/aieng/agents/tools/__init__.py
new file mode 100644
index 00000000..79a00cc5
--- /dev/null
+++ b/aieng-agents/aieng/agents/tools/__init__.py
@@ -0,0 +1,21 @@
+"""Reusable tools for AI agents."""
+
+from aieng.agents.tools.code_interpreter import CodeInterpreter, CodeInterpreterOutput
+from aieng.agents.tools.gemini_grounding import GeminiGroundingWithGoogleSearch
+from aieng.agents.tools.news_events import CurrentEvents, NewsEvent, get_news_events
+from aieng.agents.tools.weaviate_kb import (
+    AsyncWeaviateKnowledgeBase,
+    get_weaviate_async_client,
+)
+
+
+__all__ = [
+    "CodeInterpreter",
+    "CodeInterpreterOutput",
+    "GeminiGroundingWithGoogleSearch",
+    "AsyncWeaviateKnowledgeBase",
+    "get_weaviate_async_client",
+    "CurrentEvents",
+    "NewsEvent",
+    "get_news_events",
+]
diff --git a/src/utils/tools/code_interpreter.py b/aieng-agents/aieng/agents/tools/code_interpreter.py
similarity index 97%
rename from src/utils/tools/code_interpreter.py
rename to aieng-agents/aieng/agents/tools/code_interpreter.py
index 6057e545..3565b510 100644
--- a/src/utils/tools/code_interpreter.py
+++ b/aieng-agents/aieng/agents/tools/code_interpreter.py
@@ -4,11 +4,13 @@
 from pathlib import Path
 from typing import Sequence
 
+from aieng.agents.async_utils import gather_with_progress
 from e2b_code_interpreter import AsyncSandbox
 from e2b_code_interpreter.models import serialize_results
 from pydantic import BaseModel
 
-from ..async_utils import gather_with_progress
+
+__all__ = ["CodeInterpreter", "CodeInterpreterOutput"]
 
 
 class _CodeInterpreterOutputError(BaseModel):
diff --git a/src/utils/tools/gemini_grounding.py b/aieng-agents/aieng/agents/tools/gemini_grounding.py
similarity index 99%
rename from 
src/utils/tools/gemini_grounding.py rename to aieng-agents/aieng/agents/tools/gemini_grounding.py index 5bc95be4..af6a8365 100644 --- a/src/utils/tools/gemini_grounding.py +++ b/aieng-agents/aieng/agents/tools/gemini_grounding.py @@ -13,6 +13,8 @@ RETRYABLE_STATUS = {429, 500, 502, 503, 504} +__all__ = ["GeminiGroundingWithGoogleSearch", "GroundedResponse"] + class ModelSettings(BaseModel): """Configuration for the Gemini model used for web search.""" diff --git a/src/utils/tools/news_events.py b/aieng-agents/aieng/agents/tools/news_events.py similarity index 98% rename from src/utils/tools/news_events.py rename to aieng-agents/aieng/agents/tools/news_events.py index 06a5a947..fae5d9e8 100644 --- a/src/utils/tools/news_events.py +++ b/aieng-agents/aieng/agents/tools/news_events.py @@ -1,8 +1,6 @@ #!/usr/bin/env python3 """Fetch and parse Wikipedia Current Events into structured data using Pydantic.""" -from __future__ import annotations - import argparse import asyncio import random @@ -16,6 +14,9 @@ from rich.progress import Progress, SpinnerColumn, TextColumn, TimeElapsedColumn +__all__ = ["get_news_events", "NewsEvent", "CurrentEvents"] + + class NewsEvent(BaseModel): """Represents a single current event item.""" diff --git a/src/utils/tools/kb_weaviate.py b/aieng-agents/aieng/agents/tools/weaviate_kb.py similarity index 97% rename from src/utils/tools/kb_weaviate.py rename to aieng-agents/aieng/agents/tools/weaviate_kb.py index d8c45e3d..1c3c4667 100644 --- a/src/utils/tools/kb_weaviate.py +++ b/aieng-agents/aieng/agents/tools/weaviate_kb.py @@ -8,10 +8,12 @@ import openai import pydantic import weaviate +from aieng.agents.async_utils import rate_limited +from aieng.agents.env_vars import Configs from weaviate import WeaviateAsyncClient -from ..async_utils import rate_limited -from ..env_vars import Configs + +__all__ = ["AsyncWeaviateKnowledgeBase", "get_weaviate_async_client"] class _Source(pydantic.BaseModel): diff --git a/src/utils/web_search/.dockerignore 
b/aieng-agents/aieng/agents/web_search/.dockerignore similarity index 100% rename from src/utils/web_search/.dockerignore rename to aieng-agents/aieng/agents/web_search/.dockerignore diff --git a/src/utils/web_search/.env.example b/aieng-agents/aieng/agents/web_search/.env.example similarity index 100% rename from src/utils/web_search/.env.example rename to aieng-agents/aieng/agents/web_search/.env.example diff --git a/src/utils/web_search/Dockerfile b/aieng-agents/aieng/agents/web_search/Dockerfile similarity index 100% rename from src/utils/web_search/Dockerfile rename to aieng-agents/aieng/agents/web_search/Dockerfile diff --git a/src/utils/web_search/README.md b/aieng-agents/aieng/agents/web_search/README.md similarity index 93% rename from src/utils/web_search/README.md rename to aieng-agents/aieng/agents/web_search/README.md index 90bb5e8d..e0f58725 100644 --- a/src/utils/web_search/README.md +++ b/aieng-agents/aieng/agents/web_search/README.md @@ -1,6 +1,6 @@ # Gemini Grounding Proxy -This service packages the code in `src/utils/web_search` into a FastAPI +This service packages the code in `aieng-agents/aieng/agents/web_search` into a FastAPI application. It plays a dual role in the Agent Bootcamp project: - **Agent tooling showcase.** The proxy demonstrates how you can wrap a third-party @@ -24,6 +24,7 @@ production deployment on Google Cloud Run. - Access to a Google Cloud project with billing enabled Recommended: + - `uv` or `pip` for dependency management - Ability to set environment variables from `.env` files @@ -96,10 +97,10 @@ Keep `.env.example` up to date so teammates can copy it into their own `.env`. ```bash python -m venv .venv source .venv/bin/activate - pip install -r src/utils/web_search/requirements-app.txt + pip install -r aieng-agents/aieng/agents/web_search/requirements-app.txt ``` - (Or use `uv pip install -r src/utils/web_search/requirements-app.txt`.) 
+ (Or use `uv pip install -r aieng-agents/aieng/agents/web_search/requirements-app.txt`.) 5. **Run unit tests** @@ -112,7 +113,7 @@ Keep `.env.example` up to date so teammates can copy it into their own `.env`. ```bash uvicorn utils.web_search.app:app \ --reload \ - --reload-dir src/utils/web_search \ + --reload-dir aieng-agents/aieng/agents/web_search \ --port 8080 ``` @@ -128,8 +129,8 @@ Keep `.env.example` up to date so teammates can copy it into their own `.env`. from google.auth.credentials import AnonymousCredentials from google.cloud import firestore - from src.utils.web_search.auth import APIKeyAuthenticator - from src.utils.web_search.db import APIKeyRepository + from aieng.agents.web_search.auth import APIKeyAuthenticator + from aieng.agents.web_search.db import APIKeyRepository async def main(): client = firestore.AsyncClient( @@ -224,7 +225,7 @@ export REGION=us-central1 export IMAGE_NAME=grounding-proxy export TAG=$(date +%Y%m%d%H%M) -gcloud builds submit src/utils/web_search \ +gcloud builds submit aieng-agents/aieng/agents/web_search \ --tag "$REGION-docker.pkg.dev/$PROJECT/web-search/$IMAGE_NAME:$TAG" ``` @@ -254,8 +255,8 @@ production project: ```python import asyncio from google.cloud import firestore -from utils.web_search.auth import APIKeyAuthenticator -from utils.web_search.db import APIKeyRepository +from aieng.agents.web_search.auth import APIKeyAuthenticator +from aieng.agents.web_search.db import APIKeyRepository PROJECT = "your-project-id" DATABASE = "grounding" diff --git a/aieng-agents/aieng/agents/web_search/__init__.py b/aieng-agents/aieng/agents/web_search/__init__.py new file mode 100644 index 00000000..cdf56218 --- /dev/null +++ b/aieng-agents/aieng/agents/web_search/__init__.py @@ -0,0 +1 @@ +"""Implementation of proxy service for Gemini web grounding with Google Search.""" diff --git a/src/utils/web_search/app.py b/aieng-agents/aieng/agents/web_search/app.py similarity index 100% rename from src/utils/web_search/app.py rename to 
aieng-agents/aieng/agents/web_search/app.py diff --git a/src/utils/web_search/auth.py b/aieng-agents/aieng/agents/web_search/auth.py similarity index 100% rename from src/utils/web_search/auth.py rename to aieng-agents/aieng/agents/web_search/auth.py diff --git a/src/utils/web_search/daily_usage.py b/aieng-agents/aieng/agents/web_search/daily_usage.py similarity index 100% rename from src/utils/web_search/daily_usage.py rename to aieng-agents/aieng/agents/web_search/daily_usage.py diff --git a/src/utils/web_search/db.py b/aieng-agents/aieng/agents/web_search/db.py similarity index 100% rename from src/utils/web_search/db.py rename to aieng-agents/aieng/agents/web_search/db.py diff --git a/src/utils/web_search/requirements-app.txt b/aieng-agents/aieng/agents/web_search/requirements-app.txt similarity index 89% rename from src/utils/web_search/requirements-app.txt rename to aieng-agents/aieng/agents/web_search/requirements-app.txt index 2063260a..384440a5 100644 --- a/src/utils/web_search/requirements-app.txt +++ b/aieng-agents/aieng/agents/web_search/requirements-app.txt @@ -1,5 +1,5 @@ # This file was autogenerated by uv via the following command: -# uv pip compile src/utils/web_search/requirements_app.in -o src/utils/web_search/requirements-app.txt +# uv pip compile aieng-agents/aieng/agents/web_search/requirements_app.in -o aieng-agents/aieng/agents/web_search/requirements-app.txt annotated-doc==0.0.3 # via fastapi annotated-types==0.7.0 @@ -32,7 +32,7 @@ email-validator==2.3.0 # fastapi # pydantic fastapi==0.120.2 - # via -r src/utils/web_search/requirements_app.in + # via -r aieng-agents/aieng/agents/web_search/requirements_app.in fastapi-cli==0.0.14 # via fastapi fastapi-cloud-cli==0.3.1 @@ -50,9 +50,9 @@ google-auth==2.42.0 google-cloud-core==2.4.3 # via google-cloud-firestore google-cloud-firestore==2.21.0 - # via -r src/utils/web_search/requirements_app.in + # via -r aieng-agents/aieng/agents/web_search/requirements_app.in google-genai==1.46.0 - # via -r 
src/utils/web_search/requirements_app.in + # via -r aieng-agents/aieng/agents/web_search/requirements_app.in googleapis-common-protos==1.71.0 # via # google-api-core @@ -109,7 +109,7 @@ pyasn1-modules==0.4.2 # via google-auth pydantic==2.12.3 # via - # -r src/utils/web_search/requirements_app.in + # -r aieng-agents/aieng/agents/web_search/requirements_app.in # fastapi # fastapi-cloud-cli # google-genai diff --git a/src/utils/web_search/requirements_app.in b/aieng-agents/aieng/agents/web_search/requirements_app.in similarity index 100% rename from src/utils/web_search/requirements_app.in rename to aieng-agents/aieng/agents/web_search/requirements_app.in diff --git a/aieng-agents/pyproject.toml b/aieng-agents/pyproject.toml new file mode 100644 index 00000000..24490338 --- /dev/null +++ b/aieng-agents/pyproject.toml @@ -0,0 +1,57 @@ +[project] +name = "aieng-agents" +version = "0.1.0" +description = "Helper modules for Vector Institute AI Engineering Agents Bootcamp implementations" +authors = [{name = "Vector AI Engineering", email = "ai_engineering@vectorinstitute.ai"}] +requires-python = ">=3.12" +readme = "README.md" +license = "MIT" +dependencies = [ + "backoff>=2.2.1", + "beautifulsoup4>=4.13.4", + "click>=8.3.0", + "datasets>=4.4.0", + "e2b-code-interpreter>=2.3.0", + "fastapi[standard]>=0.116.1", + "google-cloud-firestore>=2.21.0", + "google-genai>=1.46.0", + "gradio>=6.7.0", + "httpx>=0.28.1", + "langfuse>=3.9.0", + "lxml>=6.0.0", + "nest-asyncio>=1.6.0", + "openai>=2.6.0", + "openai-agents>=0.4.0", + "pandas>=2.3.3", + "pillow>=12.1.1", + "pydantic>=2.11.7", + "pydantic-ai-slim[logfire]>=0.3.7", + "pymupdf>=1.26.7", + "simplejson>=3.20.2", + "transformers>=4.54.1", + "weaviate-client>=4.15.4", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.sdist] +include = ["aieng/"] + +[tool.hatch.build.targets.wheel] +include = ["aieng/"] + +[dependency-groups] +dev = [ + "pytest>=8.3.4", + 
"pytest-asyncio>=1.2.0", +] + +# Default dependency groups to be installed +[tool.uv] +default-groups = ["dev"] + +[project.scripts] +pdf_to_hf_dataset = "aieng.agents.data.pdf_to_hf_dataset:main" +chunk_hf_dataset = "aieng.agents.data.chunk_hf_dataset:main" diff --git a/aieng-agents/tests/README.md b/aieng-agents/tests/README.md new file mode 100644 index 00000000..60f5f442 --- /dev/null +++ b/aieng-agents/tests/README.md @@ -0,0 +1,10 @@ +# Unit tests + +```bash +uv run --env-file .env pytest -sv aieng-agents/tests/data/test_load_hf.py +uv run --env-file .env pytest -sv aieng-agents/tests/tools/test_weaviate.py +uv run --env-file .env pytest -sv aieng-agents/tests/tools/test_code_interpreter.py +uv run --env-file .env pytest -sv aieng-agents/tests/tools/test_gemini_grounding.py +uv run --env-file .env pytest -sv aieng-agents/tests/tools/test_get_news_events.py +uv run --env-file .env pytest -sv aieng-agents/tests/web_search/test_web_search_auth.py +``` diff --git a/tests/data_tests/test_load_hf.py b/aieng-agents/tests/data/test_load_hf.py similarity index 82% rename from tests/data_tests/test_load_hf.py rename to aieng-agents/tests/data/test_load_hf.py index f2e6e64c..0c8ac6b6 100644 --- a/tests/data_tests/test_load_hf.py +++ b/aieng-agents/tests/data/test_load_hf.py @@ -1,11 +1,10 @@ """Test Loading HuggingFace datasets.""" import pandas as pd +from aieng.agents.data import get_dataset, get_dataset_url_hash -from src.utils.data import get_dataset, get_dataset_url_hash - -def test_load_from_hub_unspecified_subset(): +def test_load_from_hub_unspecified_subset() -> None: """Test loading dataset from hub, no subset specified.""" url = "hf://vector-institute/hotpotqa@d997ecf:train" rows_limit = 18 @@ -15,7 +14,7 @@ def test_load_from_hub_unspecified_subset(): assert len(dataset) == rows_limit -def test_load_from_hub_named_subset(): +def test_load_from_hub_named_subset() -> None: """Test loading dataset from hub, no subset specified.""" url = 
"hf://vector-institute/hotpotqa@d997ecf:train" rows_limit = 18 diff --git a/tests/tool_tests/example_files/example_a.csv b/aieng-agents/tests/example_files/example_a.csv similarity index 100% rename from tests/tool_tests/example_files/example_a.csv rename to aieng-agents/tests/example_files/example_a.csv diff --git a/tests/tool_tests/test_code_interpreter.py b/aieng-agents/tests/tools/test_code_interpreter.py similarity index 81% rename from tests/tool_tests/test_code_interpreter.py rename to aieng-agents/tests/tools/test_code_interpreter.py index 6318c7bf..a3342958 100644 --- a/tests/tool_tests/test_code_interpreter.py +++ b/aieng-agents/tests/tools/test_code_interpreter.py @@ -3,12 +3,8 @@ from pathlib import Path import pytest - -from src.utils import pretty_print -from src.utils.tools.code_interpreter import ( - CodeInterpreter, - CodeInterpreterOutput, -) +from aieng.agents import pretty_print +from aieng.agents.tools.code_interpreter import CodeInterpreter, CodeInterpreterOutput PANDAS_VERSION_SCRIPT = """\ @@ -28,7 +24,7 @@ @pytest.mark.asyncio -async def test_code_interpreter(): +async def test_code_interpreter() -> None: """Test running a Python command in the interpreter.""" session = CodeInterpreter(timeout_seconds=15) @@ -42,7 +38,7 @@ async def test_code_interpreter(): @pytest.mark.asyncio -async def test_jupyter_command(): +async def test_jupyter_command() -> None: """Test running a Python command in the interpreter.""" session = CodeInterpreter(timeout_seconds=15) @@ -53,9 +49,9 @@ async def test_jupyter_command(): @pytest.mark.asyncio -async def test_code_interpreter_upload_file(): +async def test_code_interpreter_upload_file() -> None: """Test running a Python command in the interpreter.""" - example_paths = [Path("tests/tool_tests/example_files/example_a.csv")] + example_paths = [Path("aieng-agents/tests/example_files/example_a.csv")] for _path in example_paths: assert _path.exists() diff --git a/tests/tool_tests/test_gemini_grounding.py 
b/aieng-agents/tests/tools/test_gemini_grounding.py similarity index 77% rename from tests/tool_tests/test_gemini_grounding.py rename to aieng-agents/tests/tools/test_gemini_grounding.py index 2f9a7134..9b44d0b8 100644 --- a/tests/tool_tests/test_gemini_grounding.py +++ b/aieng-agents/tests/tools/test_gemini_grounding.py @@ -3,13 +3,12 @@ import os import pytest - -from src.utils import pretty_print -from src.utils.tools.gemini_grounding import GeminiGroundingWithGoogleSearch +from aieng.agents import pretty_print +from aieng.agents.tools.gemini_grounding import GeminiGroundingWithGoogleSearch @pytest.mark.asyncio -async def test_web_search_with_gemini_grounding(): +async def test_web_search_with_gemini_grounding() -> None: """Test Gemini grounding with Google Search integration.""" # Check if the environment variable is set assert os.getenv("WEB_SEARCH_BASE_URL") diff --git a/tests/tool_tests/test_get_news_events.py b/aieng-agents/tests/tools/test_get_news_events.py similarity index 85% rename from tests/tool_tests/test_get_news_events.py rename to aieng-agents/tests/tools/test_get_news_events.py index 4d71b31d..8986de7d 100644 --- a/tests/tool_tests/test_get_news_events.py +++ b/aieng-agents/tests/tools/test_get_news_events.py @@ -1,12 +1,11 @@ """Test the tool for getting news events.""" import pytest - -from src.utils.tools import get_news_events +from aieng.agents.tools import get_news_events @pytest.mark.asyncio -async def test_get_news_events(): +async def test_get_news_events() -> None: """Test tool for retrieving news events from enwiki.""" events_by_category = await get_news_events() all_events = [item for items in events_by_category.root.values() for item in items] diff --git a/tests/tool_tests/test_weaviate.py b/aieng-agents/tests/tools/test_weaviate.py similarity index 79% rename from tests/tool_tests/test_weaviate.py rename to aieng-agents/tests/tools/test_weaviate.py index 474a93b6..efdd3634 100644 --- a/tests/tool_tests/test_weaviate.py +++ 
b/aieng-agents/tests/tools/test_weaviate.py @@ -1,28 +1,28 @@ """Test cases for Weaviate integration.""" +from typing import Any, AsyncGenerator + import pytest import pytest_asyncio -from dotenv import load_dotenv - -from src.utils import ( +from aieng.agents import Configs, pretty_print +from aieng.agents.tools.weaviate_kb import ( AsyncWeaviateKnowledgeBase, - Configs, get_weaviate_async_client, - pretty_print, ) +from dotenv import load_dotenv load_dotenv(verbose=True) @pytest.fixture() -def configs(): +def configs() -> Any: """Load env var configs for testing.""" return Configs() @pytest_asyncio.fixture() -async def weaviate_kb(configs): +async def weaviate_kb(configs) -> AsyncGenerator[Any, Any]: """Weaviate knowledgebase for testing.""" async_client = get_weaviate_async_client(configs) @@ -34,7 +34,7 @@ async def weaviate_kb(configs): @pytest.mark.asyncio -async def test_weaviate_kb(weaviate_kb: AsyncWeaviateKnowledgeBase): +async def test_weaviate_kb(weaviate_kb: AsyncWeaviateKnowledgeBase) -> None: """Test weaviate knowledgebase integration.""" responses = await weaviate_kb.search_knowledgebase("What is Toronto known for?") assert len(responses) > 0 diff --git a/tests/test_web_search_auth.py b/aieng-agents/tests/web_search/test_web_search_auth.py similarity index 99% rename from tests/test_web_search_auth.py rename to aieng-agents/tests/web_search/test_web_search_auth.py index 7afb4b40..088c61e5 100644 --- a/tests/test_web_search_auth.py +++ b/aieng-agents/tests/web_search/test_web_search_auth.py @@ -5,9 +5,8 @@ from typing import Optional import pytest - -from src.utils.web_search import auth -from src.utils.web_search.db import ( +from aieng.agents.web_search import auth +from aieng.agents.web_search.db import ( APIKeyNotFoundError, APIKeyRecord, Status, diff --git a/src/1_basics/0_search_demo/README.md b/implementations/1_basics/0_search_demo/README.md similarity index 100% rename from src/1_basics/0_search_demo/README.md rename to 
implementations/1_basics/0_search_demo/README.md diff --git a/src/1_basics/0_search_demo/app.py b/implementations/1_basics/0_search_demo/app.py similarity index 97% rename from src/1_basics/0_search_demo/app.py rename to implementations/1_basics/0_search_demo/app.py index 4c29510c..fc4b5c09 100644 --- a/src/1_basics/0_search_demo/app.py +++ b/implementations/1_basics/0_search_demo/app.py @@ -3,10 +3,9 @@ import asyncio import gradio as gr +from aieng.agents import AsyncClientManager, pretty_print from dotenv import load_dotenv -from src.utils import AsyncClientManager, pretty_print - DESCRIPTION = """\ In the example below, your goal is to find out where \ diff --git a/src/1_basics/1_react_rag/README.md b/implementations/1_basics/1_react_rag/README.md similarity index 100% rename from src/1_basics/1_react_rag/README.md rename to implementations/1_basics/1_react_rag/README.md diff --git a/src/1_basics/1_react_rag/app.py b/implementations/1_basics/1_react_rag/app.py similarity index 98% rename from src/1_basics/1_react_rag/app.py rename to implementations/1_basics/1_react_rag/app.py index 354ca463..04d1fcd3 100644 --- a/src/1_basics/1_react_rag/app.py +++ b/implementations/1_basics/1_react_rag/app.py @@ -8,12 +8,11 @@ from typing import TYPE_CHECKING, Any, AsyncGenerator import gradio as gr +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.prompts import REACT_INSTRUCTIONS from dotenv import load_dotenv from gradio.components.chatbot import ChatMessage -from src.prompts import REACT_INSTRUCTIONS -from src.utils.client_manager import AsyncClientManager - if TYPE_CHECKING: from openai.types.chat import ( diff --git a/src/1_basics/1_react_rag/cli.py b/implementations/1_basics/1_react_rag/cli.py similarity index 97% rename from src/1_basics/1_react_rag/cli.py rename to implementations/1_basics/1_react_rag/cli.py index 007e466a..f6ff6efd 100644 --- a/src/1_basics/1_react_rag/cli.py +++ b/implementations/1_basics/1_react_rag/cli.py @@ -4,14 
+4,11 @@ import json from typing import TYPE_CHECKING +from aieng.agents import pretty_print +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.prompts import REACT_INSTRUCTIONS from dotenv import load_dotenv -from src.prompts import REACT_INSTRUCTIONS -from src.utils import ( - AsyncClientManager, - pretty_print, -) - if TYPE_CHECKING: from openai.types.chat import ChatCompletionToolParam diff --git a/src/1_basics/__init__.py b/implementations/1_basics/__init__.py similarity index 100% rename from src/1_basics/__init__.py rename to implementations/1_basics/__init__.py diff --git a/src/2_frameworks/1_react_rag/README.md b/implementations/2_frameworks/1_react_rag/README.md similarity index 100% rename from src/2_frameworks/1_react_rag/README.md rename to implementations/2_frameworks/1_react_rag/README.md diff --git a/src/2_frameworks/1_react_rag/app.py b/implementations/2_frameworks/1_react_rag/app.py similarity index 67% rename from src/2_frameworks/1_react_rag/app.py rename to implementations/2_frameworks/1_react_rag/app.py index 744a99df..bd261f56 100644 --- a/src/2_frameworks/1_react_rag/app.py +++ b/implementations/2_frameworks/1_react_rag/app.py @@ -1,19 +1,40 @@ """Reason-and-Act Knowledge Retrieval Agent via the OpenAI Agent SDK.""" -import asyncio -import logging from typing import Any, AsyncGenerator import agents import gradio as gr +from aieng.agents import ( + get_or_create_agent_session, + oai_agent_stream_to_gradio_messages, + register_async_cleanup, + set_up_logging, +) +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.gradio import get_common_gradio_config +from aieng.agents.prompts import REACT_INSTRUCTIONS from dotenv import load_dotenv from gradio.components.chatbot import ChatMessage -from src.prompts import REACT_INSTRUCTIONS -from src.utils import oai_agent_stream_to_gradio_messages -from src.utils.agent_session import get_or_create_session -from src.utils.client_manager import 
AsyncClientManager -from src.utils.gradio import COMMON_GRADIO_CONFIG + +load_dotenv(verbose=True) + +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() + +# Disable tracing to OpenAI platform since we are using Gemini models instead +# of OpenAI models +agents.set_tracing_disabled(disabled=True) + +if gr.NO_RELOAD: + # Initialize client manager + # This class initializes the OpenAI and Weaviate async clients, as well as the + # Weaviate knowledge base tool. The initialization is done once when the clients + # are first accessed, and the clients are reused for subsequent calls. + client_manager = AsyncClientManager() + + # Register async cleanup to ensure clients are properly closed on program exit + register_async_cleanup(client_manager) async def _main( @@ -26,7 +47,7 @@ async def _main( # conversation history across multiple turns of a chat # This makes it possible to ask follow-up questions that refer to # previous turns in the conversation - session = get_or_create_session(history, session_state) + session = get_or_create_agent_session(history, session_state) # Define an agent using the OpenAI Agent SDK main_agent = agents.Agent( @@ -54,33 +75,17 @@ async def _main( yield turn_messages -if __name__ == "__main__": - load_dotenv(verbose=True) - logging.basicConfig(level=logging.INFO) - - # Initialize client manager - # This class initializes the OpenAI and Weaviate async clients, as well as the - # Weaviate knowledge base tool. The initialization is done once when the clients - # are first accessed, and the clients are reused for subsequent calls. - client_manager = AsyncClientManager() - - # Disable tracing to OpenAI platform since we are using Gemini models instead - # of OpenAI models - agents.set_tracing_disabled(disabled=True) - - demo = gr.ChatInterface( - _main, - **COMMON_GRADIO_CONFIG, - examples=[ - [ - "At which university did the SVP Software Engineering" - " at Apple (as of June 2025) earn their engineering degree?" 
-            ],
+demo = gr.ChatInterface(
+    _main,
+    **get_common_gradio_config(),
+    examples=[
+        [
+            "At which university did the SVP Software Engineering"
+            " at Apple (as of June 2025) earn their engineering degree?"
         ],
-        ],
-        title="2.1: ReAct for Retrieval-Augmented Generation with OpenAI Agent SDK",
-    )
+    ],
+    title="2.1: ReAct for Retrieval-Augmented Generation with OpenAI Agent SDK",
+)
 
-    try:
-        demo.launch(share=True)
-    finally:
-        asyncio.run(client_manager.close())
+if __name__ == "__main__":
+    demo.launch(share=True)
diff --git a/implementations/2_frameworks/1_react_rag/cli.py b/implementations/2_frameworks/1_react_rag/cli.py
new file mode 100644
index 00000000..992c8410
--- /dev/null
+++ b/implementations/2_frameworks/1_react_rag/cli.py
@@ -0,0 +1,75 @@
+"""Non-Interactive Example of OpenAI Agent SDK for Knowledge Retrieval."""
+
+import asyncio
+
+from agents import (
+    Agent,
+    OpenAIChatCompletionsModel,
+    RunConfig,
+    Runner,
+    function_tool,
+)
+from aieng.agents import pretty_print, set_up_logging
+from aieng.agents.client_manager import AsyncClientManager
+from aieng.agents.prompts import REACT_INSTRUCTIONS
+from dotenv import load_dotenv
+
+
+load_dotenv(verbose=True)
+
+# Set logging level and suppress some noisy logs from dependencies
+set_up_logging()
+
+
+async def _main(query: str) -> None:
+    try:
+        wikipedia_agent = Agent(
+            name="Wikipedia Agent",
+            instructions=REACT_INSTRUCTIONS,
+            tools=[function_tool(client_manager.knowledgebase.search_knowledgebase)],
+            model=OpenAIChatCompletionsModel(
+                model=client_manager.configs.default_worker_model,
+                openai_client=client_manager.openai_client,
+            ),
+        )
+
+        response = await Runner.run(
+            wikipedia_agent, input=query, run_config=no_tracing_config
+        )
+
+        for item in response.new_items:
+            pretty_print(item.raw_item)
+            print()
+
+        pretty_print(response.final_output)
+
+        # Uncomment the following for a basic "streaming" example
+
+        # from aieng.agents import oai_agent_stream_to_gradio_messages
+        # 
result_stream = Runner.run_streamed( + # wikipedia_agent, input=query, run_config=no_tracing_config + # ) + # async for event in result_stream.stream_events(): + # event_parsed = oai_agent_stream_to_gradio_messages(event) + # if len(event_parsed) > 0: + # pretty_print(event_parsed) + finally: + # Ensure clients are closed on exit + await client_manager.close() + + +if __name__ == "__main__": + no_tracing_config = RunConfig(tracing_disabled=True) + + # Initialize client manager + # This class initializes the OpenAI and Weaviate async clients, as well as the + # Weaviate knowledge base tool. The initialization is done once when the clients + # are first accessed, and the clients are reused for subsequent calls. + client_manager = AsyncClientManager() + + query = ( + "At which university did the SVP Software Engineering" + " at Apple (as of June 2025) earn their engineering degree?" + ) + + asyncio.run(_main(query)) diff --git a/src/2_frameworks/1_react_rag/langfuse_gradio.py b/implementations/2_frameworks/1_react_rag/langfuse_gradio.py similarity index 74% rename from src/2_frameworks/1_react_rag/langfuse_gradio.py rename to implementations/2_frameworks/1_react_rag/langfuse_gradio.py index 1acae713..9ed8519e 100644 --- a/src/2_frameworks/1_react_rag/langfuse_gradio.py +++ b/implementations/2_frameworks/1_react_rag/langfuse_gradio.py @@ -3,26 +3,43 @@ Log traces to LangFuse for observability and evaluation. 
""" -import asyncio from typing import Any, AsyncGenerator import agents import gradio as gr -from dotenv import load_dotenv -from gradio.components.chatbot import ChatMessage -from langfuse import propagate_attributes - -from src.prompts import REACT_INSTRUCTIONS -from src.utils import ( +from aieng.agents import ( + get_or_create_agent_session, oai_agent_stream_to_gradio_messages, pretty_print, + register_async_cleanup, set_up_logging, - setup_langfuse_tracer, ) -from src.utils.agent_session import get_or_create_session -from src.utils.client_manager import AsyncClientManager -from src.utils.gradio import COMMON_GRADIO_CONFIG -from src.utils.langfuse.shared_client import langfuse_client +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.gradio import get_common_gradio_config +from aieng.agents.langfuse import langfuse_client, setup_langfuse_tracer +from aieng.agents.prompts import REACT_INSTRUCTIONS +from dotenv import load_dotenv +from gradio.components.chatbot import ChatMessage +from langfuse import propagate_attributes + + +load_dotenv(verbose=True) + +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() + +if gr.NO_RELOAD: + # Set up LangFuse for tracing + setup_langfuse_tracer() + + # Initialize client manager + # This class initializes the OpenAI and Weaviate async clients, as well as the + # Weaviate knowledge base tool. The initialization is done once when the clients + # are first accessed, and the clients are reused for subsequent calls. 
+ client_manager = AsyncClientManager() + + # Register async cleanup to ensure clients are properly closed on program exit + register_async_cleanup(client_manager) async def _main( @@ -35,7 +52,7 @@ async def _main( # conversation history across multiple turns of a chat # This makes it possible to ask follow-up questions that refer to # previous turns in the conversation - session = get_or_create_session(history, session_state) + session = get_or_create_agent_session(history, session_state) # Define an agent using the OpenAI Agent SDK main_agent = agents.Agent( @@ -76,34 +93,17 @@ async def _main( yield turn_messages -if __name__ == "__main__": - load_dotenv(verbose=True) - - # Set logging level and suppress some noisy logs from dependencies - set_up_logging() - - # Set up LangFuse for tracing - setup_langfuse_tracer() - - # Initialize client manager - # This class initializes the OpenAI and Weaviate async clients, as well as the - # Weaviate knowledge base tool. The initialization is done once when the clients - # are first accessed, and the clients are reused for subsequent calls. 
- client_manager = AsyncClientManager() - - demo = gr.ChatInterface( - _main, - **COMMON_GRADIO_CONFIG, - examples=[ - [ - "At which university did the SVP Software Engineering" - " at Apple (as of June 2025) earn their engineering degree?", - ], +demo = gr.ChatInterface( + _main, + **get_common_gradio_config(), + examples=[ + [ + "At which university did the SVP Software Engineering" + " at Apple (as of June 2025) earn their engineering degree?", ], - title="2.1: ReAct for Retrieval-Augmented Generation with OpenAI Agent SDK + LangFuse", - ) + ], + title="2.1: ReAct for Retrieval-Augmented Generation with OpenAI Agent SDK + LangFuse", +) - try: - demo.launch(share=True) - finally: - asyncio.run(client_manager.close()) +if __name__ == "__main__": + demo.launch(share=True) diff --git a/src/2_frameworks/2_multi_agent/README.md b/implementations/2_frameworks/2_multi_agent/README.md similarity index 100% rename from src/2_frameworks/2_multi_agent/README.md rename to implementations/2_frameworks/2_multi_agent/README.md diff --git a/src/2_frameworks/2_multi_agent/efficient.py b/implementations/2_frameworks/2_multi_agent/efficient.py similarity index 57% rename from src/2_frameworks/2_multi_agent/efficient.py rename to implementations/2_frameworks/2_multi_agent/efficient.py index e1ec98d0..43ad795d 100644 --- a/src/2_frameworks/2_multi_agent/efficient.py +++ b/implementations/2_frameworks/2_multi_agent/efficient.py @@ -6,70 +6,31 @@ /blob/3304e6e/misinfo_data_eval/tasks/web_search.py """ -import asyncio from typing import Any, AsyncGenerator import agents import gradio as gr -from dotenv import load_dotenv -from gradio.components.chatbot import ChatMessage -from langfuse import propagate_attributes - -from src.prompts import REACT_INSTRUCTIONS -from src.utils import ( +from aieng.agents import ( + get_or_create_agent_session, oai_agent_stream_to_gradio_messages, + register_async_cleanup, set_up_logging, - setup_langfuse_tracer, ) -from src.utils.agent_session import 
get_or_create_session -from src.utils.client_manager import AsyncClientManager -from src.utils.gradio import COMMON_GRADIO_CONFIG -from src.utils.langfuse.shared_client import langfuse_client - - -async def _main( - query: str, history: list[ChatMessage], session_state: dict[str, Any] -) -> AsyncGenerator[list[ChatMessage], Any]: - # Initialize list of chat messages for a single turn - turn_messages: list[ChatMessage] = [] - - # Construct an in-memory SQLite session for the agent to maintain - # conversation history across multiple turns of a chat - # This makes it possible to ask follow-up questions that refer to - # previous turns in the conversation - session = get_or_create_session(history, session_state) - - # Use the main agent as the entry point- not the worker agent. - with ( - langfuse_client.start_as_current_observation( - name="Orchestrator-Worker", as_type="agent", input=query - ) as obs, - propagate_attributes( - session_id=session.session_id # Propagate session_id to all child observations - ), - ): - # Run the agent in streaming mode to get and display intermediate outputs - result_stream = agents.Runner.run_streamed( - main_agent, - input=query, - session=session, - max_turns=30, # Increase max turns to support more complex queries - ) - - async for _item in result_stream.stream_events(): - turn_messages += oai_agent_stream_to_gradio_messages(_item) - if len(turn_messages) > 0: - yield turn_messages - - obs.update(output=result_stream.final_output) +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.gradio import get_common_gradio_config +from aieng.agents.langfuse import langfuse_client, setup_langfuse_tracer +from aieng.agents.prompts import REACT_INSTRUCTIONS, SEARCH_AGENT_INSTRUCTIONS +from dotenv import load_dotenv +from gradio.components.chatbot import ChatMessage +from langfuse import propagate_attributes -if __name__ == "__main__": - load_dotenv(verbose=True) +load_dotenv(verbose=True) - # Set logging level and 
suppress some noisy logs from dependencies - set_up_logging() +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() +if gr.NO_RELOAD: # Set up LangFuse for tracing setup_langfuse_tracer() @@ -79,34 +40,27 @@ async def _main( # are first accessed, and the clients are reused for subsequent calls. client_manager = AsyncClientManager() - # Use smaller, faster model for focused search tasks - worker_model = client_manager.configs.default_worker_model - # Use larger, more capable model for complex planning and reasoning - planner_model = client_manager.configs.default_planner_model + # Register async cleanup to ensure clients are properly closed on program exit + register_async_cleanup(client_manager) + +def _get_main_agent() -> agents.Agent: # Worker Agent: handles long context efficiently search_agent = agents.Agent( name="SearchAgent", - instructions=( - "You are a search agent. You receive a single search query as input. " - "Use the search tool to perform a search, then produce a concise " - "'search summary' of the key findings. " - "For every fact you include in the summary, ALWAYS include a citation " - "both in-line and at the end of the summary as a numbered list. The " - "citation at the end should include relevant metadata from the search " - "results. Do NOT return raw search results. 
" - ), + instructions=SEARCH_AGENT_INSTRUCTIONS, tools=[ agents.function_tool(client_manager.knowledgebase.search_knowledgebase), ], - # a faster, smaller model for quick searches + # Use smaller, faster model for focused search tasks model=agents.OpenAIChatCompletionsModel( - model=worker_model, openai_client=client_manager.openai_client + model=client_manager.configs.default_worker_model, + openai_client=client_manager.openai_client, ), ) # Main Agent: more expensive and slower, but better at complex planning - main_agent = agents.Agent( + return agents.Agent( name="MainAgent", instructions=REACT_INSTRUCTIONS, # Allow the planner agent to invoke the worker agent. @@ -117,33 +71,72 @@ async def _main( tool_description="Perform a search on a Wikipedia knowledge base for a query and return a concise summary.", ) ], - # a larger, more capable model for planning and reasoning over summaries + # Use larger, more capable model for complex planning and reasoning over + # summaries model=agents.OpenAIChatCompletionsModel( - model=planner_model, openai_client=client_manager.openai_client + model=client_manager.configs.default_planner_model, + openai_client=client_manager.openai_client, ), # NOTE: enabling parallel tool calls here can sometimes lead to issues with # with invalid arguments being passed to the search agent. 
model_settings=agents.ModelSettings(parallel_tool_calls=False), ) - demo = gr.ChatInterface( - _main, - **COMMON_GRADIO_CONFIG, - examples=[ - [ - "Write a structured report on the history of AI, covering: " - "1) the start in the 50s, 2) the first AI winter, 3) the second AI winter, " - "4) the modern AI boom, 5) the evolution of AI hardware, and " - "6) the societal impacts of modern AI" - ], - [ - "Compare the box office performance of 'Oppenheimer' with the third Avatar movie" - ], + +async def _main( + query: str, history: list[ChatMessage], session_state: dict[str, Any] +) -> AsyncGenerator[list[ChatMessage], Any]: + # Initialize list of chat messages for a single turn + turn_messages: list[ChatMessage] = [] + + # Construct an in-memory SQLite session for the agent to maintain + # conversation history across multiple turns of a chat + # This makes it possible to ask follow-up questions that refer to + # previous turns in the conversation + session = get_or_create_agent_session(history, session_state) + + # Use the main agent as the entry point - not the worker agent. 
+ main_agent = _get_main_agent() + with ( + langfuse_client.start_as_current_observation( + name="Orchestrator-Worker", as_type="agent", input=query + ) as obs, + propagate_attributes( + session_id=session.session_id # Propagate session_id to all child observations + ), + ): + # Run the agent in streaming mode to get and display intermediate outputs + result_stream = agents.Runner.run_streamed( + main_agent, + input=query, + session=session, + max_turns=30, # Increase max turns to support more complex queries + ) + + async for _item in result_stream.stream_events(): + turn_messages += oai_agent_stream_to_gradio_messages(_item) + if len(turn_messages) > 0: + yield turn_messages + + obs.update(output=result_stream.final_output) + + +demo = gr.ChatInterface( + _main, + **get_common_gradio_config(), + examples=[ + [ + "Write a structured report on the history of AI, covering: " + "1) the start in the 50s, 2) the first AI winter, 3) the second AI winter, " + "4) the modern AI boom, 5) the evolution of AI hardware, and " + "6) the societal impacts of modern AI" ], - title="2.2.2: Multi-Agent Orchestrator-worker for Retrieval-Augmented Generation", - ) + [ + "Compare the box office performance of 'Oppenheimer' with the third Avatar movie" + ], + ], + title="2.2.2: Multi-Agent Orchestrator-worker for Retrieval-Augmented Generation", +) - try: - demo.launch(share=True) - finally: - asyncio.run(client_manager.close()) +if __name__ == "__main__": + demo.launch(share=True) diff --git a/src/2_frameworks/2_multi_agent/efficient_multiple_kbs.py b/implementations/2_frameworks/2_multi_agent/efficient_multiple_kbs.py similarity index 54% rename from src/2_frameworks/2_multi_agent/efficient_multiple_kbs.py rename to implementations/2_frameworks/2_multi_agent/efficient_multiple_kbs.py index 1f3444f0..b992047e 100644 --- a/src/2_frameworks/2_multi_agent/efficient_multiple_kbs.py +++ b/implementations/2_frameworks/2_multi_agent/efficient_multiple_kbs.py @@ -1,72 +1,34 @@ """Example code 
for planner-worker agent collaboration with multiple tools.""" -import asyncio from typing import Any, AsyncGenerator import agents import gradio as gr -from dotenv import load_dotenv -from gradio.components.chatbot import ChatMessage -from langfuse import propagate_attributes - -from src.utils import ( +from aieng.agents import ( + get_or_create_agent_session, oai_agent_stream_to_gradio_messages, + register_async_cleanup, set_up_logging, - setup_langfuse_tracer, ) -from src.utils.agent_session import get_or_create_session -from src.utils.client_manager import AsyncClientManager -from src.utils.gradio import COMMON_GRADIO_CONFIG -from src.utils.langfuse.shared_client import langfuse_client -from src.utils.tools.gemini_grounding import ( +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.gradio import get_common_gradio_config +from aieng.agents.langfuse import langfuse_client, setup_langfuse_tracer +from aieng.agents.prompts import WIKI_AND_WEB_ORCHESTRATOR_INSTRUCTIONS +from aieng.agents.tools.gemini_grounding import ( GeminiGroundingWithGoogleSearch, ModelSettings, ) +from dotenv import load_dotenv +from gradio.components.chatbot import ChatMessage +from langfuse import propagate_attributes -async def _main( - query: str, history: list[ChatMessage], session_state: dict[str, Any] -) -> AsyncGenerator[list[ChatMessage], Any]: - # Initialize list of chat messages for a single turn - turn_messages: list[ChatMessage] = [] - - # Construct an in-memory SQLite session for the agent to maintain - # conversation history across multiple turns of a chat - # This makes it possible to ask follow-up questions that refer to - # previous turns in the conversation - session = get_or_create_session(history, session_state) - - # Use the main agent as the entry point- not the worker agent. 
- with ( - langfuse_client.start_as_current_observation( - name="Orchestrator-Worker", as_type="agent", input=query - ) as obs, - propagate_attributes( - session_id=session.session_id # Propagate session_id to all child observations - ), - ): - # Run the agent in streaming mode to get and display intermediate outputs - result_stream = agents.Runner.run_streamed( - main_agent, - input=query, - session=session, - max_turns=30, # Increase max turns to support more complex queries - ) - - async for _item in result_stream.stream_events(): - turn_messages += oai_agent_stream_to_gradio_messages(_item) - if len(turn_messages) > 0: - yield turn_messages - - obs.update(output=result_stream.final_output) - - -if __name__ == "__main__": - load_dotenv(verbose=True) +load_dotenv(verbose=True) - # Set logging level and suppress some noisy logs from dependencies - set_up_logging() +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() +if gr.NO_RELOAD: # Set up LangFuse for tracing setup_langfuse_tracer() @@ -76,6 +38,11 @@ async def _main( # are first accessed, and the clients are reused for subsequent calls. 
client_manager = AsyncClientManager() + # Register async cleanup to ensure clients are properly closed on program exit + register_async_cleanup(client_manager) + + +def _get_main_agent() -> agents.Agent: # Use smaller, faster model for focused search tasks worker_model = client_manager.configs.default_worker_model # Use larger, more capable model for complex planning and reasoning @@ -107,58 +74,13 @@ async def _main( model=agents.OpenAIChatCompletionsModel( model=worker_model, openai_client=client_manager.openai_client ), + model_settings=agents.ModelSettings(parallel_tool_calls=False), ) # Main Agent: more expensive and slower, but better at complex planning - main_agent = agents.Agent( + return agents.Agent( name="MainAgent", - instructions=""" - You are a deep research agent and your goal is to conduct in-depth, multi-turn - research by breaking down complex queries, using the provided tools, and - synthesizing the information into a comprehensive report. - - You have access to the following tools: - 1. 'search_knowledgebase' - use this tool to search for information in a - knowledge base. The knowledge base reflects a subset of Wikipedia as - of May 2025. - 2. 'get_web_search_grounded_response' - use this tool for current events, - news, fact-checking or when the information in the knowledge base is - not sufficient to answer the question. - - Both tools will not return raw search results or the sources themselves. - Instead, they will return a concise summary of the key findings, along - with the sources used to generate the summary. - - For best performance, divide complex queries into simpler sub-queries - Before calling either tool, always explain your reasoning for doing so. - - Note that the 'get_web_search_grounded_response' tool will expand the query - into multiple search queries and execute them. It will also return the - queries it executed. Do not repeat them. 
- - **Routing Guidelines:** - - When answering a question, you should first try to use the 'search_knowledgebase' - tool, unless the question requires recent information after May 2025 or - has explicit recency cues. - - If either tool returns insufficient information for a given query, try - reformulating or using the other tool. You can call either tool multiple - times to get the information you need to answer the user's question. - - **Guidelines for synthesis** - - After collecting results, write the final answer from your own synthesis. - - Add a "Sources" section listing unique sources, formatted as: - [1] Publisher - URL - [2] Wikipedia: (Section:
) - Order by first mention in your text. Every factual sentence in your final - response must map to at least one source. - - If web and knowledge base disagree, surface the disagreement and prefer sources - with newer publication dates. - - Do not invent URLs or sources. - - If both tools fail, say so and suggest 2–3 refined queries. - - Be sure to mention the sources in your response, including the URL if available, - and do not make up information. - """, + instructions=WIKI_AND_WEB_ORCHESTRATOR_INSTRUCTIONS, # Allow the planner agent to invoke the worker agent. # The long context provided to the worker agent is hidden from the main agent. tools=[ @@ -184,24 +106,61 @@ async def _main( model_settings=agents.ModelSettings(parallel_tool_calls=False), ) - demo = gr.ChatInterface( - _main, - **COMMON_GRADIO_CONFIG, - examples=[ - [ - "Write a structured report on the history of AI, covering: " - "1) the start in the 50s, 2) the first AI winter, 3) the second AI winter, " - "4) the modern AI boom, 5) the evolution of AI hardware, and " - "6) the societal impacts of modern AI" - ], - [ - "Compare the box office performance of 'Oppenheimer' with the third Avatar movie" - ], + +async def _main( + query: str, history: list[ChatMessage], session_state: dict[str, Any] +) -> AsyncGenerator[list[ChatMessage], Any]: + # Initialize list of chat messages for a single turn + turn_messages: list[ChatMessage] = [] + + # Construct an in-memory SQLite session for the agent to maintain + # conversation history across multiple turns of a chat + # This makes it possible to ask follow-up questions that refer to + # previous turns in the conversation + session = get_or_create_agent_session(history, session_state) + + # Use the main agent as the entry point - not the worker agent. 
+ main_agent = _get_main_agent() + with ( + langfuse_client.start_as_current_observation( + name="Orchestrator-Worker", as_type="agent", input=query + ) as obs, + propagate_attributes( + session_id=session.session_id # Propagate session_id to all child observations + ), + ): + # Run the agent in streaming mode to get and display intermediate outputs + result_stream = agents.Runner.run_streamed( + main_agent, + input=query, + session=session, + max_turns=30, # Increase max turns to support more complex queries + ) + + async for _item in result_stream.stream_events(): + turn_messages += oai_agent_stream_to_gradio_messages(_item) + if len(turn_messages) > 0: + yield turn_messages + + obs.update(output=result_stream.final_output) + + +demo = gr.ChatInterface( + _main, + **get_common_gradio_config(), + examples=[ + [ + "Write a structured report on the history of AI, covering: " + "1) the start in the 50s, 2) the first AI winter, 3) the second AI winter, " + "4) the modern AI boom, 5) the evolution of AI hardware, and " + "6) the societal impacts of modern AI" ], - title="2.2.3: Multi-Agent Orchestrator-worker for Retrieval-Augmented Generation with Multiple Tools", - ) + [ + "Compare the box office performance of 'Oppenheimer' with the third Avatar movie" + ], + ], + title="2.2.3: Multi-Agent Orchestrator-worker for Retrieval-Augmented Generation with Multiple Tools", +) - try: - demo.launch(share=True) - finally: - asyncio.run(client_manager.close()) +if __name__ == "__main__": + demo.launch(share=True) diff --git a/src/2_frameworks/2_multi_agent/fan_out.py b/implementations/2_frameworks/2_multi_agent/fan_out.py similarity index 93% rename from src/2_frameworks/2_multi_agent/fan_out.py rename to implementations/2_frameworks/2_multi_agent/fan_out.py index b7400135..f1720c50 100644 --- a/src/2_frameworks/2_multi_agent/fan_out.py +++ b/implementations/2_frameworks/2_multi_agent/fan_out.py @@ -9,7 +9,7 @@ - revise error handling- at the moment, if there is an exception, 
the pair would be skipped. You might want to set up e.g., retry. -uv run --env-file .env src/2_frameworks/2_multi_agent/fan_out.py \ +uv run --env-file .env implementations/2_frameworks/2_multi_agent/fan_out.py \ --source_dataset laliyepeng/test-cra-dataset \ --num_rows 10 \ --output_report report.md @@ -26,12 +26,14 @@ import datasets import openai import pydantic +from aieng.agents import gather_with_progress, rate_limited, set_up_logging +from aieng.agents.async_utils import register_async_cleanup +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.langfuse import langfuse_client, setup_langfuse_tracer -from src.utils import set_up_logging, setup_langfuse_tracer -from src.utils.async_utils import gather_with_progress, rate_limited -from src.utils.client_manager import AsyncClientManager -from src.utils.langfuse.shared_client import langfuse_client +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() MAX_CONCURRENCY = {"worker": 50, "reviewer": 50} MAX_GENERATED_TOKENS = {"worker": 16384, "reviewer": 32768} @@ -336,11 +338,17 @@ async def process_conflict_reviews( parser.add_argument("--output_report", default="report.md") args = parser.parse_args() - set_up_logging() setup_langfuse_tracer() + # Initialize client manager + # This class initializes the OpenAI and Weaviate async clients, as well as the + # Weaviate knowledge base tool. The initialization is done once when the clients + # are first accessed, and the clients are reused for subsequent calls. 
client_manager = AsyncClientManager() + # Register async cleanup to ensure clients are properly closed on program exit + register_async_cleanup(client_manager) + worker_agent = agents.Agent( "Conflict-detection Agent", instructions=( diff --git a/src/2_frameworks/2_multi_agent/verbose.py b/implementations/2_frameworks/2_multi_agent/verbose.py similarity index 70% rename from src/2_frameworks/2_multi_agent/verbose.py rename to implementations/2_frameworks/2_multi_agent/verbose.py index 29d426db..11e3282f 100644 --- a/src/2_frameworks/2_multi_agent/verbose.py +++ b/implementations/2_frameworks/2_multi_agent/verbose.py @@ -7,62 +7,48 @@ Log traces to LangFuse for observability and evaluation. """ -import asyncio from typing import Any, AsyncGenerator import agents import gradio as gr +from aieng.agents import ( + get_or_create_agent_session, + oai_agent_items_to_gradio_messages, + pretty_print, + register_async_cleanup, + set_up_logging, +) +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.gradio import get_common_gradio_config +from aieng.agents.langfuse import langfuse_client, setup_langfuse_tracer +from aieng.agents.prompts import ( + KB_RESEARCHER_INSTRUCTIONS, + WIKI_SEARCH_PLANNER_INSTRUCTIONS, + WRITER_INSTRUCTIONS, +) from dotenv import load_dotenv from gradio.components.chatbot import ChatMessage from langfuse import propagate_attributes from pydantic import BaseModel -from src.utils import ( - oai_agent_items_to_gradio_messages, - pretty_print, - setup_langfuse_tracer, -) -from src.utils.agent_session import get_or_create_session -from src.utils.client_manager import AsyncClientManager -from src.utils.gradio import COMMON_GRADIO_CONFIG -from src.utils.langfuse.shared_client import langfuse_client -from src.utils.logging import set_up_logging - - -PLANNER_INSTRUCTIONS = """\ -You are a research planner. 
\ -Given a user's query, produce a list of search terms that can be used to retrieve -relevant information from a knowledge base to answer the question. \ -As you are not able to clarify from the user what they are looking for, \ -your search terms should be broad and cover various aspects of the query. \ -Output up to 10 search terms to query the knowledge base. \ -Note that the knowledge base is a Wikipedia dump and cuts off at May 2025. -""" -RESEARCHER_INSTRUCTIONS = """\ -You are a research assistant with access to a knowledge base. \ -Given a potentially broad search term, your task is to use the search tool to \ -retrieve relevant information from the knowledge base and produce a short \ -summary of at most 300 words. You must pass the initial search term directly to \ -the search tool without any modifications and, only if necessary, refine your \ -search based on the results you get back. Your summary must be based solely on \ -a synthesis of all the search results and should not include any information that \ -is not present in the search results. For every fact you include in the summary, \ -ALWAYS include a citation both in-line and at the end of the summary as a numbered \ -list. The citation at the end should include relevant metadata from the search \ -results. Do NOT return raw search results. -""" +load_dotenv(verbose=True) -WRITER_INSTRUCTIONS = """\ -You are an expert at synthesizing information and writing coherent reports. \ -Given a user's query and a set of search summaries, synthesize these into a \ -coherent report that answers the user's question. The length of the report should be \ -proportional to the complexity of the question. For queries that are more complex, \ -ensure that the report is well-structured, with clear sections and headings where \ -appropriate. Make sure to use the citations from the search summaries to back up \ -any factual claims you make. \ -Do not make up any information outside of the search summaries. 
-""" +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() + +if gr.NO_RELOAD: + # Set up LangFuse for tracing + setup_langfuse_tracer() + + # Initialize client manager + # This class initializes the OpenAI and Weaviate async clients, as well as the + # Weaviate knowledge base tool. The initialization is done once when the clients + # are first accessed, and the clients are reused for subsequent calls. + client_manager = AsyncClientManager() + + # Register async cleanup to ensure clients are properly closed on program exit + register_async_cleanup(client_manager) class SearchItem(BaseModel): @@ -98,6 +84,47 @@ class ResearchReport(BaseModel): full_report: str +def _get_agents() -> tuple[agents.Agent, agents.Agent, agents.Agent]: + # Use smaller, faster model for focused search tasks + worker_model = client_manager.configs.default_worker_model + # Use larger, more capable model for complex planning and reasoning + planner_model = client_manager.configs.default_planner_model + + planner_agent = agents.Agent( + name="Planner Agent", + instructions=WIKI_SEARCH_PLANNER_INSTRUCTIONS, + model=agents.OpenAIChatCompletionsModel( + model=planner_model, + openai_client=client_manager.openai_client, + ), + output_type=SearchPlan, + ) + + research_agent = agents.Agent( + name="Research Agent", + instructions=KB_RESEARCHER_INSTRUCTIONS, + tools=[agents.function_tool(client_manager.knowledgebase.search_knowledgebase)], + model=agents.OpenAIChatCompletionsModel( + model=worker_model, + openai_client=client_manager.openai_client, + ), + # Force the agent to use the search tool for every query + model_settings=agents.ModelSettings(tool_choice="required"), + ) + + writer_agent = agents.Agent( + name="Writer Agent", + instructions=WRITER_INSTRUCTIONS, + model=agents.OpenAIChatCompletionsModel( + model=planner_model, # Stronger model for complex synthesis + openai_client=client_manager.openai_client, + ), + output_type=ResearchReport, + ) + + return 
planner_agent, research_agent, writer_agent + + async def _create_search_plan( planner_agent: agents.Agent, query: str, session: agents.Session | None = None ) -> SearchPlan: @@ -139,7 +166,10 @@ async def _main( # conversation history across multiple turns of a chat # This makes it possible to ask follow-up questions that refer to # previous turns in the conversation - session = get_or_create_session(history, session_state) + session = get_or_create_agent_session(history, session_state) + + # Get the agents + planner_agent, research_agent, writer_agent = _get_agents() with ( langfuse_client.start_as_current_observation( @@ -220,76 +250,22 @@ async def _main( yield turn_messages -if __name__ == "__main__": - load_dotenv(verbose=True) - - # Set logging level and suppress some noisy logs from dependencies - set_up_logging() - - # Set up LangFuse for tracing - setup_langfuse_tracer() - - # Initialize client manager - # This class initializes the OpenAI and Weaviate async clients, as well as the - # Weaviate knowledge base tool. The initialization is done once when the clients - # are first accessed, and the clients are reused for subsequent calls. 
- client_manager = AsyncClientManager() - - # Use smaller, faster model for focused search tasks - worker_model = client_manager.configs.default_worker_model - # Use larger, more capable model for complex planning and reasoning - planner_model = client_manager.configs.default_planner_model - - planner_agent = agents.Agent( - name="Planner Agent", - instructions=PLANNER_INSTRUCTIONS, - model=agents.OpenAIChatCompletionsModel( - model=planner_model, - openai_client=client_manager.openai_client, - ), - output_type=SearchPlan, - ) - - research_agent = agents.Agent( - name="Research Agent", - instructions=RESEARCHER_INSTRUCTIONS, - tools=[agents.function_tool(client_manager.knowledgebase.search_knowledgebase)], - model=agents.OpenAIChatCompletionsModel( - model=worker_model, - openai_client=client_manager.openai_client, - ), - # Force the agent to use the search tool for every query - model_settings=agents.ModelSettings(tool_choice="required"), - ) - - writer_agent = agents.Agent( - name="Writer Agent", - instructions=WRITER_INSTRUCTIONS, - model=agents.OpenAIChatCompletionsModel( - model=planner_model, # Stronger model for complex synthesis - openai_client=client_manager.openai_client, - ), - output_type=ResearchReport, - ) - - demo = gr.ChatInterface( - _main, - **COMMON_GRADIO_CONFIG, - examples=[ - [ - "Write a structured report on the history of AI, covering: " - "1) the start in the 50s, 2) the first AI winter, 3) the second AI winter, " - "4) the modern AI boom, 5) the evolution of AI hardware, and " - "6) the societal impacts of modern AI" - ], - [ - "Compare the box office performance of 'Oppenheimer' with the third Avatar movie" - ], +demo = gr.ChatInterface( + _main, + **get_common_gradio_config(), + examples=[ + [ + "Write a structured report on the history of AI, covering: " + "1) the start in the 50s, 2) the first AI winter, 3) the second AI winter, " + "4) the modern AI boom, 5) the evolution of AI hardware, and " + "6) the societal impacts of modern AI" 
], - title="2.2.1: Plan-and-Execute Multi-Agent System for Retrieval-Augmented Generation", - ) + [ + "Compare the box office performance of 'Oppenheimer' with the third Avatar movie" + ], + ], + title="2.2.1: Plan-and-Execute Multi-Agent System for Retrieval-Augmented Generation", +) - try: - demo.launch(share=True) - finally: - asyncio.run(client_manager.close()) +if __name__ == "__main__": + demo.launch(share=True) diff --git a/src/2_frameworks/3_code_interpreter/README.md b/implementations/2_frameworks/3_code_interpreter/README.md similarity index 64% rename from src/2_frameworks/3_code_interpreter/README.md rename to implementations/2_frameworks/3_code_interpreter/README.md index 172c0aa0..443e4dcf 100644 --- a/src/2_frameworks/3_code_interpreter/README.md +++ b/implementations/2_frameworks/3_code_interpreter/README.md @@ -8,5 +8,5 @@ Prerequisites: Run: ```bash -uv run --env-file .env gradio src/2_frameworks/3_code_interpreter/app.py +uv run --env-file .env gradio implementations/2_frameworks/3_code_interpreter/app.py ``` diff --git a/src/2_frameworks/3_code_interpreter/app.py b/implementations/2_frameworks/3_code_interpreter/app.py similarity index 60% rename from src/2_frameworks/3_code_interpreter/app.py rename to implementations/2_frameworks/3_code_interpreter/app.py index 893b0a7f..82f9c4ef 100644 --- a/src/2_frameworks/3_code_interpreter/app.py +++ b/implementations/2_frameworks/3_code_interpreter/app.py @@ -5,46 +5,69 @@ You will need your E2B API Key. 
""" -import asyncio from pathlib import Path from typing import Any, AsyncGenerator import agents import gradio as gr +from aieng.agents import ( + get_or_create_agent_session, + oai_agent_stream_to_gradio_messages, + pretty_print, + register_async_cleanup, + set_up_logging, +) +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.gradio import get_common_gradio_config +from aieng.agents.langfuse import langfuse_client, setup_langfuse_tracer +from aieng.agents.prompts import CODE_INTERPRETER_INSTRUCTIONS +from aieng.agents.tools import CodeInterpreter from dotenv import load_dotenv from gradio.components.chatbot import ChatMessage from langfuse import propagate_attributes -from src.utils import ( - CodeInterpreter, - oai_agent_stream_to_gradio_messages, - set_up_logging, -) -from src.utils.agent_session import get_or_create_session -from src.utils.client_manager import AsyncClientManager -from src.utils.gradio import COMMON_GRADIO_CONFIG -from src.utils.langfuse.oai_sdk_setup import setup_langfuse_tracer -from src.utils.langfuse.shared_client import langfuse_client -from src.utils.pretty_printing import pretty_print +load_dotenv(verbose=True) -CODE_INTERPRETER_INSTRUCTIONS = """\ -The `code_interpreter` tool executes Python commands. \ -Please note that data is not persisted. Each time you invoke this tool, \ -you will need to run import and define all variables from scratch. +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() -You can access the local filesystem using this tool. \ -Instead of asking the user for file inputs, you should try to find the file \ -using this tool. +if gr.NO_RELOAD: + # Set up LangFuse for tracing + setup_langfuse_tracer() -Recommended packages: Pandas, Numpy, SymPy, Scikit-learn, Matplotlib, Seaborn. + # Initialize client manager + # This class initializes the OpenAI and Weaviate async clients, as well as the + # Weaviate knowledge base tool. 
The initialization is done once when the clients + # are first accessed, and the clients are reused for subsequent calls. + client_manager = AsyncClientManager() -Use Matplotlib to create visualizations. Make sure to call `plt.show()` so that -the plot is captured and returned to the user. + # Register async cleanup to ensure clients are properly closed on program exit + register_async_cleanup(client_manager) -You can also run Jupyter-style shell commands (e.g., `!pip freeze`) -but you won't be able to install packages. -""" + +def _get_main_agent() -> agents.Agent: + # Initialize code interpreter with local files that will be available to the agent + code_interpreter = CodeInterpreter( + local_files=[ + Path("sandbox_content/"), + Path("aieng-agents/tests/example_files/example_a.csv"), + ] + ) + + return agents.Agent( + name="Data Analysis Agent", + instructions=CODE_INTERPRETER_INSTRUCTIONS, + tools=[ + agents.function_tool( + code_interpreter.run_code, name_override="code_interpreter" + ) + ], + model=agents.OpenAIChatCompletionsModel( + model=client_manager.configs.default_planner_model, + openai_client=client_manager.openai_client, + ), + ) async def _main( @@ -57,7 +80,9 @@ async def _main( # conversation history across multiple turns of a chat # This makes it possible to ask follow-up questions that refer to # previous turns in the conversation - session = get_or_create_session(history, session_state) + session = get_or_create_agent_session(history, session_state) + + main_agent = _get_main_agent() with ( langfuse_client.start_as_current_observation( @@ -86,52 +111,16 @@ async def _main( turn_messages.clear() -if __name__ == "__main__": - load_dotenv(verbose=True) - - set_up_logging() - setup_langfuse_tracer() - - # Initialize client manager - # This class initializes the OpenAI and Weaviate async clients, as well as the - # Weaviate knowledge base tool. 
The initialization is done once when the clients - # are first accessed, and the clients are reused for subsequent calls. - client_manager = AsyncClientManager() - - # Initialize code interpreter with local files that will be available to the agent - code_interpreter = CodeInterpreter( - local_files=[ - Path("sandbox_content/"), - Path("tests/tool_tests/example_files/example_a.csv"), - ] - ) - - main_agent = agents.Agent( - name="Data Analysis Agent", - instructions=CODE_INTERPRETER_INSTRUCTIONS, - tools=[ - agents.function_tool( - code_interpreter.run_code, name_override="code_interpreter" - ) - ], - model=agents.OpenAIChatCompletionsModel( - model=client_manager.configs.default_planner_model, - openai_client=client_manager.openai_client, - ), - ) - - demo = gr.ChatInterface( - _main, - **COMMON_GRADIO_CONFIG, - examples=[ - ["What is the sum of the column `x` in this example_a.csv?"], - ["What is the sum of the column `y` in this example_a.csv?"], - ["Create a linear best-fit line for the data in example_a.csv."], - ], - title="2.3. OAI Agent SDK ReAct + Code Interpreter Tool", - ) +demo = gr.ChatInterface( + _main, + **get_common_gradio_config(), + examples=[ + ["What is the sum of the column `x` in this example_a.csv?"], + ["What is the sum of the column `y` in this example_a.csv?"], + ["Create a linear best-fit line for the data in example_a.csv."], + ], + title="2.3. 
OAI Agent SDK ReAct + Code Interpreter Tool", +) - try: - demo.launch(share=True) - finally: - asyncio.run(client_manager.close()) +if __name__ == "__main__": + demo.launch(share=True) diff --git a/src/2_frameworks/4_mcp/README.md b/implementations/2_frameworks/4_mcp/README.md similarity index 78% rename from src/2_frameworks/4_mcp/README.md rename to implementations/2_frameworks/4_mcp/README.md index bb11606c..00334a20 100644 --- a/src/2_frameworks/4_mcp/README.md +++ b/implementations/2_frameworks/4_mcp/README.md @@ -2,9 +2,8 @@ This folder introduces use of Model Context Protocol (MCP) Servers to allow agents to access data and tools. The `mcp-server-git` MCP server is provided to the agent with limited tool use so it can use `git` commands in the repo. - -# Running +## Running ```bash -uv run --env-file .env gradio src/2_frameworks/4_mcp/app.py +uv run --env-file .env gradio implementations/2_frameworks/4_mcp/app.py ``` diff --git a/src/2_frameworks/4_mcp/app.py b/implementations/2_frameworks/4_mcp/app.py similarity index 77% rename from src/2_frameworks/4_mcp/app.py rename to implementations/2_frameworks/4_mcp/app.py index d3e71b91..8221c573 100644 --- a/src/2_frameworks/4_mcp/app.py +++ b/implementations/2_frameworks/4_mcp/app.py @@ -3,27 +3,44 @@ Log traces to LangFuse for observability and evaluation. 
""" -import asyncio import subprocess from typing import Any, AsyncGenerator import agents import gradio as gr from agents.mcp import MCPServerStdio, create_static_tool_filter -from dotenv import load_dotenv -from gradio.components.chatbot import ChatMessage -from langfuse import propagate_attributes - -from src.utils import ( +from aieng.agents import ( + get_or_create_agent_session, oai_agent_stream_to_gradio_messages, pretty_print, + register_async_cleanup, set_up_logging, ) -from src.utils.agent_session import get_or_create_session -from src.utils.client_manager import AsyncClientManager -from src.utils.gradio import COMMON_GRADIO_CONFIG -from src.utils.langfuse.oai_sdk_setup import setup_langfuse_tracer -from src.utils.langfuse.shared_client import langfuse_client +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.gradio import get_common_gradio_config +from aieng.agents.langfuse import langfuse_client, setup_langfuse_tracer +from dotenv import load_dotenv +from gradio.components.chatbot import ChatMessage +from langfuse import propagate_attributes + + +load_dotenv(verbose=True) + +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() + +if gr.NO_RELOAD: + # Set up LangFuse for tracing + setup_langfuse_tracer() + + # Initialize client manager + # This class initializes the OpenAI and Weaviate async clients, as well as the + # Weaviate knowledge base tool. The initialization is done once when the clients + # are first accessed, and the clients are reused for subsequent calls. 
+ client_manager = AsyncClientManager() + + # Register async cleanup to ensure clients are properly closed on program exit + register_async_cleanup(client_manager) async def _main( @@ -37,7 +54,7 @@ async def _main( # conversation history across multiple turns of a chat # This makes it possible to ask follow-up questions that refer to # previous turns in the conversation - session = get_or_create_session(history, session_state) + session = get_or_create_agent_session(history, session_state) # Get the absolute path to the current git repository, regardless of where # the script is run from @@ -90,29 +107,15 @@ async def _main( turn_messages.clear() -if __name__ == "__main__": - load_dotenv(verbose=True) - - set_up_logging() - setup_langfuse_tracer() - - # Initialize client manager - # This class initializes the OpenAI and Weaviate async clients, as well as the - # Weaviate knowledge base tool. The initialization is done once when the clients - # are first accessed, and the clients are reused for subsequent calls. 
- client_manager = AsyncClientManager() +demo = gr.ChatInterface( + _main, + **get_common_gradio_config(), + examples=[ + ["Summarize the last change in the repository."], + ["How many branches currently exist on the remote?"], + ], + title="2.4 OAI Agent SDK + Git MCP Server", +) - demo = gr.ChatInterface( - _main, - **COMMON_GRADIO_CONFIG, - examples=[ - ["Summarize the last change in the repository."], - ["How many branches currently exist on the remote?"], - ], - title="2.4 OAI Agent SDK + Git MCP Server", - ) - - try: - demo.launch(share=True) - finally: - asyncio.run(client_manager.close()) +if __name__ == "__main__": + demo.launch(share=True) diff --git a/src/2_frameworks/__init__.py b/implementations/2_frameworks/__init__.py similarity index 100% rename from src/2_frameworks/__init__.py rename to implementations/2_frameworks/__init__.py diff --git a/src/3_evals/1_llm_judge/README.md b/implementations/3_evals/1_llm_judge/README.md similarity index 96% rename from src/3_evals/1_llm_judge/README.md rename to implementations/3_evals/1_llm_judge/README.md index a6934137..3c65e279 100644 --- a/src/3_evals/1_llm_judge/README.md +++ b/implementations/3_evals/1_llm_judge/README.md @@ -7,12 +7,11 @@ Run in the following steps: - Create Langfuse "dataset" and upload test data to Langfuse - Run each agent variation on the test dataset, linking the traces to the dataset run. 
- ## Create and Populate Dataset ```bash uv run --env-file .env \ --m src.3_evals.1_llm_judge.upload_data \ +-m implementations.3_evals.1_llm_judge.upload_data \ --source_dataset hf://vector-institute/hotpotqa@d997ecf:train \ --langfuse_dataset_name search-dataset \ --limit 18 @@ -35,7 +34,7 @@ Example data: ```bash uv run --env-file .env \ --m src.3_evals.1_llm_judge.run_eval \ +-m implementations.3_evals.1_llm_judge.run_eval \ --langfuse_dataset_name search-dataset \ --run_name enwiki_weaviate ``` diff --git a/src/3_evals/1_llm_judge/run_eval.py b/implementations/3_evals/1_llm_judge/run_eval.py similarity index 94% rename from src/3_evals/1_llm_judge/run_eval.py rename to implementations/3_evals/1_llm_judge/run_eval.py index a6beb0b7..c04a7e8b 100644 --- a/src/3_evals/1_llm_judge/run_eval.py +++ b/implementations/3_evals/1_llm_judge/run_eval.py @@ -5,18 +5,18 @@ import agents import pydantic +from aieng.agents import gather_with_progress, set_up_logging, setup_langfuse_tracer +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.langfuse import flush_langfuse, langfuse_client from dotenv import load_dotenv from langfuse._client.datasets import DatasetItemClient from rich.progress import track -from src.utils import ( - gather_with_progress, - set_up_logging, - setup_langfuse_tracer, -) -from src.utils.client_manager import AsyncClientManager -from src.utils.langfuse.shared_client import flush_langfuse, langfuse_client +load_dotenv(verbose=True) + +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() SYSTEM_MESSAGE = """\ Answer the question using the search tool. 
\ @@ -187,9 +187,6 @@ async def _main() -> None: parser.add_argument("--limit", type=int) args = parser.parse_args() - load_dotenv(verbose=True) - set_up_logging() - setup_langfuse_tracer() client_manager = AsyncClientManager() diff --git a/src/3_evals/1_llm_judge/upload_data.py b/implementations/3_evals/1_llm_judge/upload_data.py similarity index 54% rename from src/3_evals/1_llm_judge/upload_data.py rename to implementations/3_evals/1_llm_judge/upload_data.py index b6ecbf7a..f10befb7 100644 --- a/src/3_evals/1_llm_judge/upload_data.py +++ b/implementations/3_evals/1_llm_judge/upload_data.py @@ -6,14 +6,14 @@ import argparse +from aieng.agents import Configs +from aieng.agents.data import get_dataset, get_dataset_url_hash +from aieng.agents.langfuse import langfuse_client, set_up_langfuse_otlp_env_vars from dotenv import load_dotenv from rich.progress import track -from src.utils.data import get_dataset, get_dataset_url_hash -from src.utils.env_vars import Configs -from src.utils.langfuse.otlp_env_setup import set_up_langfuse_otlp_env_vars -from src.utils.langfuse.shared_client import langfuse_client +load_dotenv(verbose=True) if __name__ == "__main__": parser = argparse.ArgumentParser() @@ -22,8 +22,6 @@ parser.add_argument("--limit", type=int) args = parser.parse_args() - load_dotenv(verbose=True) - configs = Configs() set_up_langfuse_otlp_env_vars() @@ -32,22 +30,28 @@ # Create a dataset in Langfuse assert langfuse_client.auth_check() - langfuse_client.create_dataset( - name=args.langfuse_dataset_name, - description=f"[{dataset_url_hash}] Data from {args.source_dataset}", - metadata={ - "url_hash": dataset_url_hash, - "source": args.source_dataset, - "type": "benchmark", - }, - ) + try: + langfuse_client.create_dataset( + name=args.langfuse_dataset_name, + description=f"[{dataset_url_hash}] Data from {args.source_dataset}", + metadata={ + "url_hash": dataset_url_hash, + "source": args.source_dataset, + "type": "benchmark", + }, + ) + except Exception as exc: + 
# We only continue if the dataset can be retrieved + try: + langfuse_client.get_dataset(args.langfuse_dataset_name) + print(f"Dataset {args.langfuse_dataset_name} already exists; continuing.") + except Exception as e: + raise exc from e df = get_dataset(args.source_dataset, limit=args.limit) for idx, row in track( - df.iterrows(), - total=len(df), - description="Uploading to Langfuse", + df.iterrows(), total=len(df), description="Uploading to Langfuse" ): langfuse_client.create_dataset_item( dataset_name=args.langfuse_dataset_name, diff --git a/src/3_evals/2_synthetic_data/README.md b/implementations/3_evals/2_synthetic_data/README.md similarity index 77% rename from src/3_evals/2_synthetic_data/README.md rename to implementations/3_evals/2_synthetic_data/README.md index 99a1b4e0..4b7aeb8f 100644 --- a/src/3_evals/2_synthetic_data/README.md +++ b/implementations/3_evals/2_synthetic_data/README.md @@ -1,9 +1,9 @@ # Generate synthetic data using Agent Pipeline ```bash -uv run -m src.3_evals.2_synthetic_data.synthesize_data \ +uv run --env-file .env -m implementations.3_evals.2_synthetic_data.synthesize_data \ --source_dataset hf://vector-institute/hotpotqa@d997ecf:train \ ---langfuse_dataset_name search-dataset-synthetic-20250609 \ +--langfuse_dataset_name search-dataset-synthetic \ --limit 18 ``` @@ -13,15 +13,15 @@ uv run -m src.3_evals.2_synthetic_data.synthesize_data \ # Baseline: "Real" dataset uv run \ --env-file .env \ --m src.3_evals.2_synthetic_data.annotate_diversity \ +-m implementations.3_evals.2_synthetic_data.annotate_diversity \ --langfuse_dataset_name search-dataset \ --run_name cosine_similarity_bge_m3 # Synthetic dataset uv run \ --env-file .env \ --m src.3_evals.2_synthetic_data.annotate_diversity \ ---langfuse_dataset_name search-dataset-synthetic-20250609 \ +-m implementations.3_evals.2_synthetic_data.annotate_diversity \ +--langfuse_dataset_name search-dataset-synthetic \ --run_name cosine_similarity_bge_m3 ``` @@ -46,7 +46,7 @@ Uploading 
scores... ━━━━━━━━━━━━━━━━━━━━ # synthetic, default temperature, etc. Items to process: 80 Embedding ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100% 0:00:09 0:00:00 -Cosine similarity of search-dataset-synthetic-20250609 +Cosine similarity of search-dataset-synthetic count 80.000000 mean 0.350789 std 0.027978 @@ -64,8 +64,8 @@ Uploading scores... ━━━━━━━━━━━━━━━━━━━━ ```bash uv run \ --env-file .env \ --m src.3_evals.1_llm_judge.run_eval \ ---langfuse_dataset_name search-dataset-synthetic-20250609 \ +-m implementations.3_evals.1_llm_judge.run_eval \ +--langfuse_dataset_name search-dataset-synthetic \ --run_name enwiki_weaviate \ --limit 18 ``` diff --git a/src/3_evals/2_synthetic_data/annotate_diversity.py b/implementations/3_evals/2_synthetic_data/annotate_diversity.py similarity index 95% rename from src/3_evals/2_synthetic_data/annotate_diversity.py rename to implementations/3_evals/2_synthetic_data/annotate_diversity.py index 2e9ed3d3..da29d55f 100644 --- a/src/3_evals/2_synthetic_data/annotate_diversity.py +++ b/implementations/3_evals/2_synthetic_data/annotate_diversity.py @@ -4,7 +4,7 @@ uv run \ --env-file .env \ --m src.3_evals.2_synthetic_data.annotate_diversity \ +-m implementations.3_evals.2_synthetic_data.annotate_diversity \ --langfuse_dataset_name ${DATASET_NAME} \ --run_name cosine_similarity_bge_m3_20250716 \ --limit 18 @@ -17,12 +17,12 @@ import numpy as np import pandas as pd import pydantic +from aieng.agents import Configs, gather_with_progress +from aieng.agents.data import create_batches +from aieng.agents.langfuse import flush_langfuse, langfuse_client from openai import AsyncOpenAI from rich.progress import track -from src.utils import Configs, create_batches, gather_with_progress -from src.utils.langfuse.shared_client import flush_langfuse, langfuse_client - if TYPE_CHECKING: from langfuse._client.datasets import DatasetItemClient diff --git a/src/3_evals/2_synthetic_data/gradio_visualize_diversity.py 
b/implementations/3_evals/2_synthetic_data/gradio_visualize_diversity.py similarity index 84% rename from src/3_evals/2_synthetic_data/gradio_visualize_diversity.py rename to implementations/3_evals/2_synthetic_data/gradio_visualize_diversity.py index 5f94a81b..34bf1cbd 100644 --- a/src/3_evals/2_synthetic_data/gradio_visualize_diversity.py +++ b/implementations/3_evals/2_synthetic_data/gradio_visualize_diversity.py @@ -4,7 +4,7 @@ uv run \ --env-file .env \ -gradio src/3_evals/2_synthetic_data/gradio_visualize_diversity.py +gradio implementations/3_evals/2_synthetic_data/gradio_visualize_diversity.py """ from typing import List @@ -12,14 +12,14 @@ import gradio as gr import numpy as np import plotly.express as px +from aieng.agents import Configs, gather_with_progress +from aieng.agents.data import create_batches +from aieng.agents.langfuse import langfuse_client from openai import AsyncOpenAI from plotly.graph_objs import Figure from sklearn.decomposition import PCA from sklearn.manifold import TSNE -from src.utils import Configs, create_batches, gather_with_progress -from src.utils.langfuse.shared_client import langfuse_client - def reduce_dimensions( embeddings: np.ndarray, method: str = "tsne", n_components: int = 2 @@ -123,17 +123,17 @@ async def get_projection_plot( ) -if __name__ == "__main__": - viewer = gr.Interface( - fn=get_projection_plot, - inputs=[ - gr.Textbox(label="Dataset name"), - gr.Radio(["tsne", "pca"], label="Dimensionality Reduction Method"), - gr.Number(value=18, label="Number of rows to plot", minimum=1), - ], - outputs=gr.Plot(label="2D Embedding Plot"), - title="3.2 Text Embedding Visualizer", - description="Select a method to visualize 256-D embeddings of text snippets.", - ) +demo = gr.Interface( + fn=get_projection_plot, + inputs=[ + gr.Textbox(label="Dataset name"), + gr.Radio(["tsne", "pca"], label="Dimensionality Reduction Method"), + gr.Number(value=18, label="Number of rows to plot", minimum=1), + ], + outputs=gr.Plot(label="2D 
Embedding Plot"), + title="3.2 Text Embedding Visualizer", + description="Select a method to visualize 256-D embeddings of text snippets.", +) - viewer.launch(share=True) +if __name__ == "__main__": + demo.launch(share=True) diff --git a/src/3_evals/2_synthetic_data/synthesize_data.py b/implementations/3_evals/2_synthetic_data/synthesize_data.py similarity index 84% rename from src/3_evals/2_synthetic_data/synthesize_data.py rename to implementations/3_evals/2_synthetic_data/synthesize_data.py index b5ad73fb..14cce0ee 100644 --- a/src/3_evals/2_synthetic_data/synthesize_data.py +++ b/implementations/3_evals/2_synthetic_data/synthesize_data.py @@ -13,22 +13,26 @@ import agents import pydantic -from dotenv import load_dotenv -from rich.progress import track - -from src.utils import ( +from aieng.agents import ( gather_with_progress, pretty_print, rate_limited, set_up_logging, setup_langfuse_tracer, ) -from src.utils.client_manager import AsyncClientManager -from src.utils.data import get_dataset, get_dataset_url_hash -from src.utils.langfuse.shared_client import langfuse_client -from src.utils.tools.news_events import NewsEvent, get_news_events +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.data import get_dataset, get_dataset_url_hash +from aieng.agents.langfuse import langfuse_client +from aieng.agents.tools import NewsEvent, get_news_events +from dotenv import load_dotenv +from rich.progress import track +load_dotenv(verbose=True) + +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() + SYSTEM_MESSAGE = """\ Example questions: \ {example_questions} @@ -113,9 +117,6 @@ async def generate_synthetic_test_cases( parser.add_argument("--max_concurrency", type=int, default=3) args = parser.parse_args() - load_dotenv(verbose=True) - set_up_logging() - setup_langfuse_tracer() generator = random.Random(0) @@ -124,15 +125,23 @@ async def generate_synthetic_test_cases( client_manager = AsyncClientManager() # 
Create langfuse dataset and upload. - langfuse_client.create_dataset( - name=args.langfuse_dataset_name, - description=f"[{dataset_name_hash}] Synthetic data based on {args.source_dataset}", - metadata={ - "name_hash": dataset_name_hash, - "reference_source": args.source_dataset, - "type": "synthetic_benchmark", - }, - ) + try: + langfuse_client.create_dataset( + name=args.langfuse_dataset_name, + description=f"[{dataset_name_hash}] Synthetic data based on {args.source_dataset}", + metadata={ + "name_hash": dataset_name_hash, + "reference_source": args.source_dataset, + "type": "synthetic_benchmark", + }, + ) + except Exception as exc: + # We only continue if the dataset can be retrieved + try: + langfuse_client.get_dataset(args.langfuse_dataset_name) + print(f"Dataset {args.langfuse_dataset_name} already exists; continuing.") + except Exception as e: + raise exc from e df = get_dataset(args.source_dataset, limit=90) rows_news_only = [row.to_dict() for _, row in df.iterrows()] diff --git a/src/3_evals/2_synthetic_data/synthesize_data_e2b.py b/implementations/3_evals/2_synthetic_data/synthesize_data_e2b.py similarity index 83% rename from src/3_evals/2_synthetic_data/synthesize_data_e2b.py rename to implementations/3_evals/2_synthetic_data/synthesize_data_e2b.py index 1f1d5dd9..f293a0e4 100644 --- a/src/3_evals/2_synthetic_data/synthesize_data_e2b.py +++ b/implementations/3_evals/2_synthetic_data/synthesize_data_e2b.py @@ -9,8 +9,8 @@ Example: ``` -uv run --env-file .env src/3_evals/2_synthetic_data/synthesize_data_e2b.py \ ---langfuse_dataset_name e2b-synthetic-20251113-1a \ +uv run --env-file .env implementations/3_evals/2_synthetic_data/synthesize_data_e2b.py \ +--langfuse_dataset_name e2b-synthetic \ --limit 36 \ --max_concurrency 20 ``` @@ -24,21 +24,24 @@ import agents import pydantic -from dotenv import load_dotenv -from rich.progress import track - -from src.utils import ( - CodeInterpreter, +from aieng.agents import ( gather_with_progress, pretty_print, 
rate_limited, set_up_logging, - setup_langfuse_tracer, ) -from src.utils.client_manager import AsyncClientManager -from src.utils.data import get_dataset_url_hash -from src.utils.langfuse.shared_client import langfuse_client +from aieng.agents.client_manager import AsyncClientManager +from aieng.agents.data import get_dataset_url_hash +from aieng.agents.langfuse import langfuse_client, setup_langfuse_tracer +from aieng.agents.tools import CodeInterpreter +from dotenv import load_dotenv +from rich.progress import track + +load_dotenv(verbose=True) + +# Set logging level and suppress some noisy logs from dependencies +set_up_logging() SYSTEM_MESSAGE = """\ Example questions: \ @@ -101,8 +104,7 @@ async def generate_synthetic_test_cases( input="Generate test question-answer pairs based on files under /data", ) structured_response = await agents.Runner.run( - structured_output_agent, - input=raw_response.final_output, + structured_output_agent, input=raw_response.final_output ) return structured_response.final_output_as(list[_SyntheticTestCase]) @@ -118,10 +120,6 @@ async def generate_synthetic_test_cases( parser.add_argument("--max_concurrency", type=int, default=3) args = parser.parse_args() - load_dotenv(verbose=True) - - set_up_logging() - client_manager = AsyncClientManager() setup_langfuse_tracer() @@ -129,7 +127,7 @@ async def generate_synthetic_test_cases( template_name=client_manager.configs.default_code_interpreter_template, local_files=[ Path("sandbox_content/"), - Path("tests/tool_tests/example_files/example_a.csv"), + Path("aieng-agents/tests/example_files/example_a.csv"), ], ) @@ -174,14 +172,19 @@ async def generate_synthetic_test_cases( dataset_name_hash = get_dataset_url_hash(args.langfuse_dataset_name) # Create langfuse dataset and upload. 
- langfuse_client.create_dataset( - name=args.langfuse_dataset_name, - description=f"[{dataset_name_hash}] Synthetic data", - metadata={ - "name_hash": dataset_name_hash, - "type": "synthetic_benchmark", - }, - ) + try: + langfuse_client.create_dataset( + name=args.langfuse_dataset_name, + description=f"[{dataset_name_hash}] Synthetic data", + metadata={"name_hash": dataset_name_hash, "type": "synthetic_benchmark"}, + ) + except Exception as exc: + # We only continue if the dataset can be retrieved + try: + langfuse_client.get_dataset(args.langfuse_dataset_name) + print(f"Dataset {args.langfuse_dataset_name} already exists; continuing.") + except Exception as e: + raise exc from e # Run generation async semaphore = asyncio.Semaphore(args.max_concurrency) diff --git a/src/3_evals/README.md b/implementations/3_evals/README.md similarity index 100% rename from src/3_evals/README.md rename to implementations/3_evals/README.md diff --git a/src/3_evals/__init__.py b/implementations/3_evals/__init__.py similarity index 100% rename from src/3_evals/__init__.py rename to implementations/3_evals/__init__.py diff --git a/pyproject.toml b/pyproject.toml index 29671b9c..184bf1e0 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,28 +1,16 @@ [project] -name = "agent-bootcamp-202507" +name = "agent-bootcamp" version = "0.1.0" -description = "Vector Institute Agent Bootcamp 202507" +description = "Vector Institute Agent Bootcamp" readme = "README.md" authors = [ {name = "Vector AI Engineering", email = "ai_engineering@vectorinstitute.ai"}] -license = "Apache-2.0" +license = "MIT" requires-python = ">=3.12" dependencies = [ - "aiohttp>=3.12.14", - "beautifulsoup4>=4.13.4", - "datasets>=4.4.0", - "e2b-code-interpreter>=2.3.0", - "gradio>=6.1.0", - "langfuse>=3.9.0", - "lxml>=6.0.0", - "nest-asyncio>=1.6.0", + "aieng-agents>=0.1.0", "numpy<2.3.0", - "openai>=2.6.0", - "openai-agents>=0.4.0", "plotly>=6.2.0", - "pydantic>=2.11.7", - "pydantic-ai-slim[logfire]>=0.3.7", 
"scikit-learn>=1.7.0", - "weaviate-client>=4.15.4", ] [build-system] @@ -30,7 +18,7 @@ requires = ["hatchling"] build-backend = "hatchling.build" [tool.hatch.build.targets.wheel] -packages = ["src"] +packages = ["implementations"] [dependency-groups] dev = [ @@ -44,13 +32,11 @@ dev = [ "nbqa>=1.9.1", "pip-audit>=2.7.3", "pre-commit>=4.1.0", - "pymupdf>=1.26.7", "pytest>=8.3.4", "pytest-asyncio>=1.2.0", "pytest-cov>=7.0.0", "pytest-mock>=3.14.0", "ruff>=0.12.2", - "transformers>=4.54.1", ] docs = [ "jinja2>=3.1.6", # Pinning version to address vulnerability GHSA-cpwx-vrp4-4pq7 @@ -61,17 +47,19 @@ docs = [ "ipykernel>=6.29.5", "ipython>=9.4.0", ] -web-search = [ - "google-cloud-firestore>=2.21.0", - "fastapi[standard]>=0.116.1", - "google-genai>=1.46.0", - "simplejson>=3.20.2", -] # Default dependency groups to be installed [tool.uv] default-groups = ["dev", "docs"] +[tool.uv.workspace] +members = [ + "aieng-agents", +] + +[tool.uv.sources] +aieng-agents = { workspace = true } + [tool.ruff] include = ["*.py", "pyproject.toml", "*.ipynb"] line-length = 88 @@ -119,8 +107,6 @@ ignore = [ [tool.ruff.lint.per-file-ignores] "__init__.py" = ["E402", "F401", "F403", "F811", "D104"] - - [tool.ruff.lint.pep8-naming] ignore-names = ["X*", "setUp"] @@ -140,5 +126,5 @@ markers = [ [tool.coverage] [tool.coverage.run] - source=["aieng_template"] - omit=["tests/*", "*__init__.py"] + source=["aieng-agents/aieng"] + omit=["aieng-agents/aieng/tests/*", "tests/*", "*__init__.py"] diff --git a/src/2_frameworks/1_react_rag/cli.py b/src/2_frameworks/1_react_rag/cli.py deleted file mode 100644 index f755b7d0..00000000 --- a/src/2_frameworks/1_react_rag/cli.py +++ /dev/null @@ -1,73 +0,0 @@ -"""Non-Interactive Example of OpenAI Agent SDK for Knowledge Retrieval.""" - -import asyncio -import logging - -from agents import ( - Agent, - OpenAIChatCompletionsModel, - RunConfig, - Runner, - function_tool, -) -from dotenv import load_dotenv - -from src.prompts import REACT_INSTRUCTIONS -from 
src.utils import pretty_print -from src.utils.client_manager import AsyncClientManager - - -async def _main(query: str) -> None: - wikipedia_agent = Agent( - name="Wikipedia Agent", - instructions=REACT_INSTRUCTIONS, - tools=[function_tool(client_manager.knowledgebase.search_knowledgebase)], - model=OpenAIChatCompletionsModel( - model=client_manager.configs.default_worker_model, - openai_client=client_manager.openai_client, - ), - ) - - response = await Runner.run( - wikipedia_agent, - input=query, - run_config=no_tracing_config, - ) - - for item in response.new_items: - pretty_print(item.raw_item) - print() - - pretty_print(response.final_output) - - # Uncomment the following for a basic "streaming" example - - # from src.utils import oai_agent_stream_to_gradio_messages - # result_stream = Runner.run_streamed( - # wikipedia_agent, input=query, run_config=no_tracing_config - # ) - # async for event in result_stream.stream_events(): - # event_parsed = oai_agent_stream_to_gradio_messages(event) - # if len(event_parsed) > 0: - # pretty_print(event_parsed) - - -if __name__ == "__main__": - load_dotenv(verbose=True) - - logging.basicConfig(level=logging.INFO) - - no_tracing_config = RunConfig(tracing_disabled=True) - - # Initialize client manager - # This class initializes the OpenAI and Weaviate async clients, as well as the - # Weaviate knowledge base tool. The initialization is done once when the clients - # are first accessed, and the clients are reused for subsequent calls. - client_manager = AsyncClientManager() - - query = ( - "At which university did the SVP Software Engineering" - " at Apple (as of June 2025) earn their engineering degree?" - ) - - asyncio.run(_main(query)) diff --git a/src/prompts.py b/src/prompts.py deleted file mode 100644 index 7dc95d46..00000000 --- a/src/prompts.py +++ /dev/null @@ -1,12 +0,0 @@ -"""Centralized location for all system prompts.""" - -REACT_INSTRUCTIONS = """\ -Answer the question using the search tool. 
\ -EACH TIME before invoking the function, you must explain your reasons for doing so. \ -Be sure to mention the sources in your response. \ -If the search tool did not return intended results, try again. \ -For best performance, divide complex queries into simpler sub-queries. \ -Do not make up information. \ -For facts that might change over time, you must use the search tool to retrieve the \ -most up-to-date information. -""" diff --git a/src/utils/__init__.py b/src/utils/__init__.py deleted file mode 100644 index 3e541840..00000000 --- a/src/utils/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -"""Shared toolings for reference implementations.""" - -from .async_utils import gather_with_progress, rate_limited -from .client_manager import AsyncClientManager -from .data.batching import create_batches -from .env_vars import Configs -from .gradio.messages import ( - gradio_messages_to_oai_chat, - oai_agent_items_to_gradio_messages, - oai_agent_stream_to_gradio_messages, -) -from .langfuse.oai_sdk_setup import setup_langfuse_tracer -from .logging import set_up_logging -from .pretty_printing import pretty_print -from .tools.code_interpreter import CodeInterpreter -from .tools.kb_weaviate import AsyncWeaviateKnowledgeBase, get_weaviate_async_client -from .trees import tree_filter diff --git a/src/utils/data/__init__.py b/src/utils/data/__init__.py deleted file mode 100644 index 99eee296..00000000 --- a/src/utils/data/__init__.py +++ /dev/null @@ -1,4 +0,0 @@ -from .load_dataset import get_dataset, get_dataset_url_hash - - -__all__ = ["get_dataset", "get_dataset_url_hash"] diff --git a/src/utils/gradio/__init__.py b/src/utils/gradio/__init__.py deleted file mode 100644 index f89efb6c..00000000 --- a/src/utils/gradio/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -import gradio as gr - - -COMMON_GRADIO_CONFIG = { - "chatbot": gr.Chatbot(height=600), - "textbox": gr.Textbox(lines=1, placeholder="Enter your prompt"), - # Additional input to maintain session state across multiple 
turns - # NOTE: Examples must be a list of lists when additional inputs are provided - "additional_inputs": gr.State(value={}, render=False), -} diff --git a/src/utils/langfuse/trace_id.py b/src/utils/langfuse/trace_id.py deleted file mode 100644 index e1b93fa4..00000000 --- a/src/utils/langfuse/trace_id.py +++ /dev/null @@ -1,11 +0,0 @@ -""" -Obtain trace_id, required for linking trace to dataset row. - -Full documentation: -langfuse.com/docs/integrations/openaiagentssdk/example-evaluating-openai-agents -running-the-agent-on-the-dataset -""" - - -def get_langfuse_trace_id(): - """Obtain "formatted" trace_id for LangFuse.""" diff --git a/src/utils/tools/__init__.py b/src/utils/tools/__init__.py deleted file mode 100644 index b1514d1c..00000000 --- a/src/utils/tools/__init__.py +++ /dev/null @@ -1,2 +0,0 @@ -from .kb_weaviate import AsyncWeaviateKnowledgeBase, get_weaviate_async_client -from .news_events import get_news_events diff --git a/src/utils/trees.py b/src/utils/trees.py deleted file mode 100644 index e5dade0d..00000000 --- a/src/utils/trees.py +++ /dev/null @@ -1,24 +0,0 @@ -"""Utils for handling nested dict.""" - -from typing import Any, Callable, TypeVar - - -Tree = TypeVar("Tree", bound=dict) - - -def tree_filter( - data: Tree, - criteria_fn: Callable[[Any], bool] = lambda x: x is not None, -) -> Tree: - """Keep only leaves for which criteria is True. - - Filters out None leaves if criteria is not specified. 
- """ - output: Tree = {} # type: ignore[reportAssignType] - for k, v in data.items(): - if isinstance(v, dict): - output[k] = tree_filter(v, criteria_fn=criteria_fn) - elif criteria_fn(v): - output[k] = v - - return output diff --git a/tests/README.md b/tests/README.md deleted file mode 100644 index 98f737e8..00000000 --- a/tests/README.md +++ /dev/null @@ -1,7 +0,0 @@ -# Unit tests - -```bash -uv run pytest -sv tests/tool_tests/test_weaviate.py -uv run pytest -sv tests/tool_tests/test_code_interpreter.py -uv run pytest -sv tests/tool_tests/test_integration.py -``` diff --git a/tests/tool_tests/test_integration.py b/tests/tool_tests/test_integration.py index 847c8f3e..167556e3 100644 --- a/tests/tool_tests/test_integration.py +++ b/tests/tool_tests/test_integration.py @@ -5,18 +5,16 @@ import pytest import pytest_asyncio -from dotenv import load_dotenv -from langfuse import get_client -from openai import AsyncOpenAI - -from src.utils import ( +from aieng.agents import Configs, pretty_print +from aieng.agents.langfuse import set_up_langfuse_otlp_env_vars +from aieng.agents.tools import ( AsyncWeaviateKnowledgeBase, - Configs, + GeminiGroundingWithGoogleSearch, get_weaviate_async_client, - pretty_print, ) -from src.utils.langfuse.otlp_env_setup import set_up_langfuse_otlp_env_vars -from src.utils.tools.gemini_grounding import GeminiGroundingWithGoogleSearch +from dotenv import load_dotenv +from langfuse import get_client +from openai import AsyncOpenAI load_dotenv(verbose=True) diff --git a/uv.lock b/uv.lock index 605b5631..f8d80a53 100644 --- a/uv.lock +++ b/uv.lock @@ -7,27 +7,21 @@ resolution-markers = [ "python_full_version < '3.13'", ] +[manifest] +members = [ + "agent-bootcamp", + "aieng-agents", +] + [[package]] -name = "agent-bootcamp-202507" +name = "agent-bootcamp" version = "0.1.0" source = { editable = "." 
} dependencies = [ - { name = "aiohttp" }, - { name = "beautifulsoup4" }, - { name = "datasets" }, - { name = "e2b-code-interpreter" }, - { name = "gradio" }, - { name = "langfuse" }, - { name = "lxml" }, - { name = "nest-asyncio" }, + { name = "aieng-agents" }, { name = "numpy" }, - { name = "openai" }, - { name = "openai-agents" }, { name = "plotly" }, - { name = "pydantic" }, - { name = "pydantic-ai-slim", extra = ["logfire"] }, { name = "scikit-learn" }, - { name = "weaviate-client" }, ] [package.dev-dependencies] @@ -42,13 +36,11 @@ dev = [ { name = "nbqa" }, { name = "pip-audit" }, { name = "pre-commit" }, - { name = "pymupdf" }, { name = "pytest" }, { name = "pytest-asyncio" }, { name = "pytest-cov" }, { name = "pytest-mock" }, { name = "ruff" }, - { name = "transformers" }, ] docs = [ { name = "ipykernel" }, @@ -59,31 +51,13 @@ docs = [ { name = "mkdocstrings" }, { name = "mkdocstrings-python" }, ] -web-search = [ - { name = "fastapi", extra = ["standard"] }, - { name = "google-cloud-firestore" }, - { name = "google-genai" }, - { name = "simplejson" }, -] [package.metadata] requires-dist = [ - { name = "aiohttp", specifier = ">=3.12.14" }, - { name = "beautifulsoup4", specifier = ">=4.13.4" }, - { name = "datasets", specifier = ">=4.4.0" }, - { name = "e2b-code-interpreter", specifier = ">=2.3.0" }, - { name = "gradio", specifier = ">=6.1.0" }, - { name = "langfuse", specifier = ">=3.9.0" }, - { name = "lxml", specifier = ">=6.0.0" }, - { name = "nest-asyncio", specifier = ">=1.6.0" }, + { name = "aieng-agents", editable = "aieng-agents" }, { name = "numpy", specifier = "<2.3.0" }, - { name = "openai", specifier = ">=2.6.0" }, - { name = "openai-agents", specifier = ">=0.4.0" }, { name = "plotly", specifier = ">=6.2.0" }, - { name = "pydantic", specifier = ">=2.11.7" }, - { name = "pydantic-ai-slim", extras = ["logfire"], specifier = ">=0.3.7" }, { name = "scikit-learn", specifier = ">=1.7.0" }, - { name = "weaviate-client", specifier = ">=4.15.4" }, ] 
[package.metadata.requires-dev] @@ -98,13 +72,11 @@ dev = [ { name = "nbqa", specifier = ">=1.9.1" }, { name = "pip-audit", specifier = ">=2.7.3" }, { name = "pre-commit", specifier = ">=4.1.0" }, - { name = "pymupdf", specifier = ">=1.26.7" }, { name = "pytest", specifier = ">=8.3.4" }, { name = "pytest-asyncio", specifier = ">=1.2.0" }, { name = "pytest-cov", specifier = ">=7.0.0" }, { name = "pytest-mock", specifier = ">=3.14.0" }, { name = "ruff", specifier = ">=0.12.2" }, - { name = "transformers", specifier = ">=4.54.1" }, ] docs = [ { name = "ipykernel", specifier = ">=6.29.5" }, @@ -115,11 +87,74 @@ docs = [ { name = "mkdocstrings", specifier = ">=0.24.1" }, { name = "mkdocstrings-python", specifier = ">=1.16.12" }, ] -web-search = [ + +[[package]] +name = "aieng-agents" +version = "0.1.0" +source = { editable = "aieng-agents" } +dependencies = [ + { name = "backoff" }, + { name = "beautifulsoup4" }, + { name = "click" }, + { name = "datasets" }, + { name = "e2b-code-interpreter" }, + { name = "fastapi", extra = ["standard"] }, + { name = "google-cloud-firestore" }, + { name = "google-genai" }, + { name = "gradio" }, + { name = "httpx" }, + { name = "langfuse" }, + { name = "lxml" }, + { name = "nest-asyncio" }, + { name = "openai" }, + { name = "openai-agents" }, + { name = "pandas" }, + { name = "pillow" }, + { name = "pydantic" }, + { name = "pydantic-ai-slim", extra = ["logfire"] }, + { name = "pymupdf" }, + { name = "simplejson" }, + { name = "transformers" }, + { name = "weaviate-client" }, +] + +[package.dev-dependencies] +dev = [ + { name = "pytest" }, + { name = "pytest-asyncio" }, +] + +[package.metadata] +requires-dist = [ + { name = "backoff", specifier = ">=2.2.1" }, + { name = "beautifulsoup4", specifier = ">=4.13.4" }, + { name = "click", specifier = ">=8.3.0" }, + { name = "datasets", specifier = ">=4.4.0" }, + { name = "e2b-code-interpreter", specifier = ">=2.3.0" }, { name = "fastapi", extras = ["standard"], specifier = ">=0.116.1" }, { 
name = "google-cloud-firestore", specifier = ">=2.21.0" }, { name = "google-genai", specifier = ">=1.46.0" }, + { name = "gradio", specifier = ">=6.7.0" }, + { name = "httpx", specifier = ">=0.28.1" }, + { name = "langfuse", specifier = ">=3.9.0" }, + { name = "lxml", specifier = ">=6.0.0" }, + { name = "nest-asyncio", specifier = ">=1.6.0" }, + { name = "openai", specifier = ">=2.6.0" }, + { name = "openai-agents", specifier = ">=0.4.0" }, + { name = "pandas", specifier = ">=2.3.3" }, + { name = "pillow", specifier = ">=12.1.1" }, + { name = "pydantic", specifier = ">=2.11.7" }, + { name = "pydantic-ai-slim", extras = ["logfire"], specifier = ">=0.3.7" }, + { name = "pymupdf", specifier = ">=1.26.7" }, { name = "simplejson", specifier = ">=3.20.2" }, + { name = "transformers", specifier = ">=4.54.1" }, + { name = "weaviate-client", specifier = ">=4.15.4" }, +] + +[package.metadata.requires-dev] +dev = [ + { name = "pytest", specifier = ">=8.3.4" }, + { name = "pytest-asyncio", specifier = ">=1.2.0" }, ] [[package]] @@ -441,14 +476,14 @@ wheels = [ [[package]] name = "authlib" -version = "1.6.6" +version = "1.6.9" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "cryptography" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/bb/9b/b1661026ff24bc641b76b78c5222d614776b0c085bcfdac9bd15a1cb4b35/authlib-1.6.6.tar.gz", hash = "sha256:45770e8e056d0f283451d9996fbb59b70d45722b45d854d58f32878d0a40c38e", size = 164894, upload-time = "2025-12-12T08:01:41.464Z" } +sdist = { url = "https://files.pythonhosted.org/packages/af/98/00d3dd826d46959ad8e32af2dbb2398868fd9fd0683c26e56d0789bd0e68/authlib-1.6.9.tar.gz", hash = "sha256:d8f2421e7e5980cc1ddb4e32d3f5fa659cfaf60d8eaf3281ebed192e4ab74f04", size = 165134, upload-time = "2026-03-02T07:44:01.998Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/54/51/321e821856452f7386c4e9df866f196720b1ad0c5ea1623ea7399969ae3b/authlib-1.6.6-py2.py3-none-any.whl", hash = 
"sha256:7d9e9bc535c13974313a87f53e8430eb6ea3d1cf6ae4f6efcd793f2e949143fd", size = 244005, upload-time = "2025-12-12T08:01:40.209Z" }, + { url = "https://files.pythonhosted.org/packages/53/23/b65f568ed0c22f1efacb744d2db1a33c8068f384b8c9b482b52ebdbc3ef6/authlib-1.6.9-py2.py3-none-any.whl", hash = "sha256:f08b4c14e08f0861dc18a32357b33fbcfd2ea86cfe3fe149484b4d764c4a0ac3", size = 244197, upload-time = "2026-03-02T07:44:00.307Z" }, ] [[package]] @@ -859,58 +894,55 @@ wheels = [ [[package]] name = "cryptography" -version = "46.0.3" +version = "46.0.5" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "cffi", marker = "platform_python_implementation != 'PyPy'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/9f/33/c00162f49c0e2fe8064a62cb92b93e50c74a72bc370ab92f86112b33ff62/cryptography-46.0.3.tar.gz", hash = "sha256:a8b17438104fed022ce745b362294d9ce35b4c2e45c1d958ad4a4b019285f4a1", size = 749258, upload-time = "2025-10-15T23:18:31.74Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1d/42/9c391dd801d6cf0d561b5890549d4b27bafcc53b39c31a817e69d87c625b/cryptography-46.0.3-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:109d4ddfadf17e8e7779c39f9b18111a09efb969a301a31e987416a0191ed93a", size = 7225004, upload-time = "2025-10-15T23:16:52.239Z" }, - { url = "https://files.pythonhosted.org/packages/1c/67/38769ca6b65f07461eb200e85fc1639b438bdc667be02cf7f2cd6a64601c/cryptography-46.0.3-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:09859af8466b69bc3c27bdf4f5d84a665e0f7ab5088412e9e2ec49758eca5cbc", size = 4296667, upload-time = "2025-10-15T23:16:54.369Z" }, - { url = "https://files.pythonhosted.org/packages/5c/49/498c86566a1d80e978b42f0d702795f69887005548c041636df6ae1ca64c/cryptography-46.0.3-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:01ca9ff2885f3acc98c29f1860552e37f6d7c7d013d7334ff2a9de43a449315d", size = 4450807, upload-time = "2025-10-15T23:16:56.414Z" }, 
- { url = "https://files.pythonhosted.org/packages/4b/0a/863a3604112174c8624a2ac3c038662d9e59970c7f926acdcfaed8d61142/cryptography-46.0.3-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:6eae65d4c3d33da080cff9c4ab1f711b15c1d9760809dad6ea763f3812d254cb", size = 4299615, upload-time = "2025-10-15T23:16:58.442Z" }, - { url = "https://files.pythonhosted.org/packages/64/02/b73a533f6b64a69f3cd3872acb6ebc12aef924d8d103133bb3ea750dc703/cryptography-46.0.3-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5bf0ed4490068a2e72ac03d786693adeb909981cc596425d09032d372bcc849", size = 4016800, upload-time = "2025-10-15T23:17:00.378Z" }, - { url = "https://files.pythonhosted.org/packages/25/d5/16e41afbfa450cde85a3b7ec599bebefaef16b5c6ba4ec49a3532336ed72/cryptography-46.0.3-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:5ecfccd2329e37e9b7112a888e76d9feca2347f12f37918facbb893d7bb88ee8", size = 4984707, upload-time = "2025-10-15T23:17:01.98Z" }, - { url = "https://files.pythonhosted.org/packages/c9/56/e7e69b427c3878352c2fb9b450bd0e19ed552753491d39d7d0a2f5226d41/cryptography-46.0.3-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:a2c0cd47381a3229c403062f764160d57d4d175e022c1df84e168c6251a22eec", size = 4482541, upload-time = "2025-10-15T23:17:04.078Z" }, - { url = "https://files.pythonhosted.org/packages/78/f6/50736d40d97e8483172f1bb6e698895b92a223dba513b0ca6f06b2365339/cryptography-46.0.3-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:549e234ff32571b1f4076ac269fcce7a808d3bf98b76c8dd560e42dbc66d7d91", size = 4299464, upload-time = "2025-10-15T23:17:05.483Z" }, - { url = "https://files.pythonhosted.org/packages/00/de/d8e26b1a855f19d9994a19c702fa2e93b0456beccbcfe437eda00e0701f2/cryptography-46.0.3-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:c0a7bb1a68a5d3471880e264621346c48665b3bf1c3759d682fc0864c540bd9e", size = 4950838, upload-time = "2025-10-15T23:17:07.425Z" }, - { url = 
"https://files.pythonhosted.org/packages/8f/29/798fc4ec461a1c9e9f735f2fc58741b0daae30688f41b2497dcbc9ed1355/cryptography-46.0.3-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:10b01676fc208c3e6feeb25a8b83d81767e8059e1fe86e1dc62d10a3018fa926", size = 4481596, upload-time = "2025-10-15T23:17:09.343Z" }, - { url = "https://files.pythonhosted.org/packages/15/8d/03cd48b20a573adfff7652b76271078e3045b9f49387920e7f1f631d125e/cryptography-46.0.3-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:0abf1ffd6e57c67e92af68330d05760b7b7efb243aab8377e583284dbab72c71", size = 4426782, upload-time = "2025-10-15T23:17:11.22Z" }, - { url = "https://files.pythonhosted.org/packages/fa/b1/ebacbfe53317d55cf33165bda24c86523497a6881f339f9aae5c2e13e57b/cryptography-46.0.3-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a04bee9ab6a4da801eb9b51f1b708a1b5b5c9eb48c03f74198464c66f0d344ac", size = 4698381, upload-time = "2025-10-15T23:17:12.829Z" }, - { url = "https://files.pythonhosted.org/packages/96/92/8a6a9525893325fc057a01f654d7efc2c64b9de90413adcf605a85744ff4/cryptography-46.0.3-cp311-abi3-win32.whl", hash = "sha256:f260d0d41e9b4da1ed1e0f1ce571f97fe370b152ab18778e9e8f67d6af432018", size = 3055988, upload-time = "2025-10-15T23:17:14.65Z" }, - { url = "https://files.pythonhosted.org/packages/7e/bf/80fbf45253ea585a1e492a6a17efcb93467701fa79e71550a430c5e60df0/cryptography-46.0.3-cp311-abi3-win_amd64.whl", hash = "sha256:a9a3008438615669153eb86b26b61e09993921ebdd75385ddd748702c5adfddb", size = 3514451, upload-time = "2025-10-15T23:17:16.142Z" }, - { url = "https://files.pythonhosted.org/packages/2e/af/9b302da4c87b0beb9db4e756386a7c6c5b8003cd0e742277888d352ae91d/cryptography-46.0.3-cp311-abi3-win_arm64.whl", hash = "sha256:5d7f93296ee28f68447397bf5198428c9aeeab45705a55d53a6343455dcb2c3c", size = 2928007, upload-time = "2025-10-15T23:17:18.04Z" }, - { url = 
"https://files.pythonhosted.org/packages/f5/e2/a510aa736755bffa9d2f75029c229111a1d02f8ecd5de03078f4c18d91a3/cryptography-46.0.3-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:00a5e7e87938e5ff9ff5447ab086a5706a957137e6e433841e9d24f38a065217", size = 7158012, upload-time = "2025-10-15T23:17:19.982Z" }, - { url = "https://files.pythonhosted.org/packages/73/dc/9aa866fbdbb95b02e7f9d086f1fccfeebf8953509b87e3f28fff927ff8a0/cryptography-46.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c8daeb2d2174beb4575b77482320303f3d39b8e81153da4f0fb08eb5fe86a6c5", size = 4288728, upload-time = "2025-10-15T23:17:21.527Z" }, - { url = "https://files.pythonhosted.org/packages/c5/fd/bc1daf8230eaa075184cbbf5f8cd00ba9db4fd32d63fb83da4671b72ed8a/cryptography-46.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:39b6755623145ad5eff1dab323f4eae2a32a77a7abef2c5089a04a3d04366715", size = 4435078, upload-time = "2025-10-15T23:17:23.042Z" }, - { url = "https://files.pythonhosted.org/packages/82/98/d3bd5407ce4c60017f8ff9e63ffee4200ab3e23fe05b765cab805a7db008/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:db391fa7c66df6762ee3f00c95a89e6d428f4d60e7abc8328f4fe155b5ac6e54", size = 4293460, upload-time = "2025-10-15T23:17:24.885Z" }, - { url = "https://files.pythonhosted.org/packages/26/e9/e23e7900983c2b8af7a08098db406cf989d7f09caea7897e347598d4cd5b/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:78a97cf6a8839a48c49271cdcbd5cf37ca2c1d6b7fdd86cc864f302b5e9bf459", size = 3995237, upload-time = "2025-10-15T23:17:26.449Z" }, - { url = "https://files.pythonhosted.org/packages/91/15/af68c509d4a138cfe299d0d7ddb14afba15233223ebd933b4bbdbc7155d3/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:dfb781ff7eaa91a6f7fd41776ec37c5853c795d3b358d4896fdbb5df168af422", size = 4967344, upload-time = "2025-10-15T23:17:28.06Z" }, - { url = 
"https://files.pythonhosted.org/packages/ca/e3/8643d077c53868b681af077edf6b3cb58288b5423610f21c62aadcbe99f4/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:6f61efb26e76c45c4a227835ddeae96d83624fb0d29eb5df5b96e14ed1a0afb7", size = 4466564, upload-time = "2025-10-15T23:17:29.665Z" }, - { url = "https://files.pythonhosted.org/packages/0e/43/c1e8726fa59c236ff477ff2b5dc071e54b21e5a1e51aa2cee1676f1c986f/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:23b1a8f26e43f47ceb6d6a43115f33a5a37d57df4ea0ca295b780ae8546e8044", size = 4292415, upload-time = "2025-10-15T23:17:31.686Z" }, - { url = "https://files.pythonhosted.org/packages/42/f9/2f8fefdb1aee8a8e3256a0568cffc4e6d517b256a2fe97a029b3f1b9fe7e/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:b419ae593c86b87014b9be7396b385491ad7f320bde96826d0dd174459e54665", size = 4931457, upload-time = "2025-10-15T23:17:33.478Z" }, - { url = "https://files.pythonhosted.org/packages/79/30/9b54127a9a778ccd6d27c3da7563e9f2d341826075ceab89ae3b41bf5be2/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:50fc3343ac490c6b08c0cf0d704e881d0d660be923fd3076db3e932007e726e3", size = 4466074, upload-time = "2025-10-15T23:17:35.158Z" }, - { url = "https://files.pythonhosted.org/packages/ac/68/b4f4a10928e26c941b1b6a179143af9f4d27d88fe84a6a3c53592d2e76bf/cryptography-46.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:22d7e97932f511d6b0b04f2bfd818d73dcd5928db509460aaf48384778eb6d20", size = 4420569, upload-time = "2025-10-15T23:17:37.188Z" }, - { url = "https://files.pythonhosted.org/packages/a3/49/3746dab4c0d1979888f125226357d3262a6dd40e114ac29e3d2abdf1ec55/cryptography-46.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:d55f3dffadd674514ad19451161118fd010988540cee43d8bc20675e775925de", size = 4681941, upload-time = "2025-10-15T23:17:39.236Z" }, - { url = 
"https://files.pythonhosted.org/packages/fd/30/27654c1dbaf7e4a3531fa1fc77986d04aefa4d6d78259a62c9dc13d7ad36/cryptography-46.0.3-cp314-cp314t-win32.whl", hash = "sha256:8a6e050cb6164d3f830453754094c086ff2d0b2f3a897a1d9820f6139a1f0914", size = 3022339, upload-time = "2025-10-15T23:17:40.888Z" }, - { url = "https://files.pythonhosted.org/packages/f6/30/640f34ccd4d2a1bc88367b54b926b781b5a018d65f404d409aba76a84b1c/cryptography-46.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:760f83faa07f8b64e9c33fc963d790a2edb24efb479e3520c14a45741cd9b2db", size = 3494315, upload-time = "2025-10-15T23:17:42.769Z" }, - { url = "https://files.pythonhosted.org/packages/ba/8b/88cc7e3bd0a8e7b861f26981f7b820e1f46aa9d26cc482d0feba0ecb4919/cryptography-46.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:516ea134e703e9fe26bcd1277a4b59ad30586ea90c365a87781d7887a646fe21", size = 2919331, upload-time = "2025-10-15T23:17:44.468Z" }, - { url = "https://files.pythonhosted.org/packages/fd/23/45fe7f376a7df8daf6da3556603b36f53475a99ce4faacb6ba2cf3d82021/cryptography-46.0.3-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:cb3d760a6117f621261d662bccc8ef5bc32ca673e037c83fbe565324f5c46936", size = 7218248, upload-time = "2025-10-15T23:17:46.294Z" }, - { url = "https://files.pythonhosted.org/packages/27/32/b68d27471372737054cbd34c84981f9edbc24fe67ca225d389799614e27f/cryptography-46.0.3-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:4b7387121ac7d15e550f5cb4a43aef2559ed759c35df7336c402bb8275ac9683", size = 4294089, upload-time = "2025-10-15T23:17:48.269Z" }, - { url = "https://files.pythonhosted.org/packages/26/42/fa8389d4478368743e24e61eea78846a0006caffaf72ea24a15159215a14/cryptography-46.0.3-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:15ab9b093e8f09daab0f2159bb7e47532596075139dd74365da52ecc9cb46c5d", size = 4440029, upload-time = "2025-10-15T23:17:49.837Z" }, - { url = 
"https://files.pythonhosted.org/packages/5f/eb/f483db0ec5ac040824f269e93dd2bd8a21ecd1027e77ad7bdf6914f2fd80/cryptography-46.0.3-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:46acf53b40ea38f9c6c229599a4a13f0d46a6c3fa9ef19fc1a124d62e338dfa0", size = 4297222, upload-time = "2025-10-15T23:17:51.357Z" }, - { url = "https://files.pythonhosted.org/packages/fd/cf/da9502c4e1912cb1da3807ea3618a6829bee8207456fbbeebc361ec38ba3/cryptography-46.0.3-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:10ca84c4668d066a9878890047f03546f3ae0a6b8b39b697457b7757aaf18dbc", size = 4012280, upload-time = "2025-10-15T23:17:52.964Z" }, - { url = "https://files.pythonhosted.org/packages/6b/8f/9adb86b93330e0df8b3dcf03eae67c33ba89958fc2e03862ef1ac2b42465/cryptography-46.0.3-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:36e627112085bb3b81b19fed209c05ce2a52ee8b15d161b7c643a7d5a88491f3", size = 4978958, upload-time = "2025-10-15T23:17:54.965Z" }, - { url = "https://files.pythonhosted.org/packages/d1/a0/5fa77988289c34bdb9f913f5606ecc9ada1adb5ae870bd0d1054a7021cc4/cryptography-46.0.3-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1000713389b75c449a6e979ffc7dcc8ac90b437048766cef052d4d30b8220971", size = 4473714, upload-time = "2025-10-15T23:17:56.754Z" }, - { url = "https://files.pythonhosted.org/packages/14/e5/fc82d72a58d41c393697aa18c9abe5ae1214ff6f2a5c18ac470f92777895/cryptography-46.0.3-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:b02cf04496f6576afffef5ddd04a0cb7d49cf6be16a9059d793a30b035f6b6ac", size = 4296970, upload-time = "2025-10-15T23:17:58.588Z" }, - { url = "https://files.pythonhosted.org/packages/78/06/5663ed35438d0b09056973994f1aec467492b33bd31da36e468b01ec1097/cryptography-46.0.3-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:71e842ec9bc7abf543b47cf86b9a743baa95f4677d22baa4c7d5c69e49e9bc04", size = 4940236, upload-time = "2025-10-15T23:18:00.897Z" }, - { url = 
"https://files.pythonhosted.org/packages/fc/59/873633f3f2dcd8a053b8dd1d38f783043b5fce589c0f6988bf55ef57e43e/cryptography-46.0.3-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:402b58fc32614f00980b66d6e56a5b4118e6cb362ae8f3fda141ba4689bd4506", size = 4472642, upload-time = "2025-10-15T23:18:02.749Z" }, - { url = "https://files.pythonhosted.org/packages/3d/39/8e71f3930e40f6877737d6f69248cf74d4e34b886a3967d32f919cc50d3b/cryptography-46.0.3-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ef639cb3372f69ec44915fafcd6698b6cc78fbe0c2ea41be867f6ed612811963", size = 4423126, upload-time = "2025-10-15T23:18:04.85Z" }, - { url = "https://files.pythonhosted.org/packages/cd/c7/f65027c2810e14c3e7268353b1681932b87e5a48e65505d8cc17c99e36ae/cryptography-46.0.3-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3b51b8ca4f1c6453d8829e1eb7299499ca7f313900dd4d89a24b8b87c0a780d4", size = 4686573, upload-time = "2025-10-15T23:18:06.908Z" }, - { url = "https://files.pythonhosted.org/packages/0a/6e/1c8331ddf91ca4730ab3086a0f1be19c65510a33b5a441cb334e7a2d2560/cryptography-46.0.3-cp38-abi3-win32.whl", hash = "sha256:6276eb85ef938dc035d59b87c8a7dc559a232f954962520137529d77b18ff1df", size = 3036695, upload-time = "2025-10-15T23:18:08.672Z" }, - { url = "https://files.pythonhosted.org/packages/90/45/b0d691df20633eff80955a0fc7695ff9051ffce8b69741444bd9ed7bd0db/cryptography-46.0.3-cp38-abi3-win_amd64.whl", hash = "sha256:416260257577718c05135c55958b674000baef9a1c7d9e8f306ec60d71db850f", size = 3501720, upload-time = "2025-10-15T23:18:10.632Z" }, - { url = "https://files.pythonhosted.org/packages/e8/cb/2da4cc83f5edb9c3257d09e1e7ab7b23f049c7962cae8d842bbef0a9cec9/cryptography-46.0.3-cp38-abi3-win_arm64.whl", hash = "sha256:d89c3468de4cdc4f08a57e214384d0471911a3830fcdaf7a8cc587e42a866372", size = 2918740, upload-time = "2025-10-15T23:18:12.277Z" }, +sdist = { url = 
"https://files.pythonhosted.org/packages/60/04/ee2a9e8542e4fa2773b81771ff8349ff19cdd56b7258a0cc442639052edb/cryptography-46.0.5.tar.gz", hash = "sha256:abace499247268e3757271b2f1e244b36b06f8515cf27c4d49468fc9eb16e93d", size = 750064, upload-time = "2026-02-10T19:18:38.255Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f7/81/b0bb27f2ba931a65409c6b8a8b358a7f03c0e46eceacddff55f7c84b1f3b/cryptography-46.0.5-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:351695ada9ea9618b3500b490ad54c739860883df6c1f555e088eaf25b1bbaad", size = 7176289, upload-time = "2026-02-10T19:17:08.274Z" }, + { url = "https://files.pythonhosted.org/packages/ff/9e/6b4397a3e3d15123de3b1806ef342522393d50736c13b20ec4c9ea6693a6/cryptography-46.0.5-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c18ff11e86df2e28854939acde2d003f7984f721eba450b56a200ad90eeb0e6b", size = 4275637, upload-time = "2026-02-10T19:17:10.53Z" }, + { url = "https://files.pythonhosted.org/packages/63/e7/471ab61099a3920b0c77852ea3f0ea611c9702f651600397ac567848b897/cryptography-46.0.5-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4d7e3d356b8cd4ea5aff04f129d5f66ebdc7b6f8eae802b93739ed520c47c79b", size = 4424742, upload-time = "2026-02-10T19:17:12.388Z" }, + { url = "https://files.pythonhosted.org/packages/37/53/a18500f270342d66bf7e4d9f091114e31e5ee9e7375a5aba2e85a91e0044/cryptography-46.0.5-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:50bfb6925eff619c9c023b967d5b77a54e04256c4281b0e21336a130cd7fc263", size = 4277528, upload-time = "2026-02-10T19:17:13.853Z" }, + { url = "https://files.pythonhosted.org/packages/22/29/c2e812ebc38c57b40e7c583895e73c8c5adb4d1e4a0cc4c5a4fdab2b1acc/cryptography-46.0.5-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:803812e111e75d1aa73690d2facc295eaefd4439be1023fefc4995eaea2af90d", size = 4947993, upload-time = "2026-02-10T19:17:15.618Z" }, + { url = 
"https://files.pythonhosted.org/packages/6b/e7/237155ae19a9023de7e30ec64e5d99a9431a567407ac21170a046d22a5a3/cryptography-46.0.5-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3ee190460e2fbe447175cda91b88b84ae8322a104fc27766ad09428754a618ed", size = 4456855, upload-time = "2026-02-10T19:17:17.221Z" }, + { url = "https://files.pythonhosted.org/packages/2d/87/fc628a7ad85b81206738abbd213b07702bcbdada1dd43f72236ef3cffbb5/cryptography-46.0.5-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:f145bba11b878005c496e93e257c1e88f154d278d2638e6450d17e0f31e558d2", size = 3984635, upload-time = "2026-02-10T19:17:18.792Z" }, + { url = "https://files.pythonhosted.org/packages/84/29/65b55622bde135aedf4565dc509d99b560ee4095e56989e815f8fd2aa910/cryptography-46.0.5-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:e9251e3be159d1020c4030bd2e5f84d6a43fe54b6c19c12f51cde9542a2817b2", size = 4277038, upload-time = "2026-02-10T19:17:20.256Z" }, + { url = "https://files.pythonhosted.org/packages/bc/36/45e76c68d7311432741faf1fbf7fac8a196a0a735ca21f504c75d37e2558/cryptography-46.0.5-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:47fb8a66058b80e509c47118ef8a75d14c455e81ac369050f20ba0d23e77fee0", size = 4912181, upload-time = "2026-02-10T19:17:21.825Z" }, + { url = "https://files.pythonhosted.org/packages/6d/1a/c1ba8fead184d6e3d5afcf03d569acac5ad063f3ac9fb7258af158f7e378/cryptography-46.0.5-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:4c3341037c136030cb46e4b1e17b7418ea4cbd9dd207e4a6f3b2b24e0d4ac731", size = 4456482, upload-time = "2026-02-10T19:17:25.133Z" }, + { url = "https://files.pythonhosted.org/packages/f9/e5/3fb22e37f66827ced3b902cf895e6a6bc1d095b5b26be26bd13c441fdf19/cryptography-46.0.5-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:890bcb4abd5a2d3f852196437129eb3667d62630333aacc13dfd470fad3aaa82", size = 4405497, upload-time = "2026-02-10T19:17:26.66Z" }, + { url = 
"https://files.pythonhosted.org/packages/1a/df/9d58bb32b1121a8a2f27383fabae4d63080c7ca60b9b5c88be742be04ee7/cryptography-46.0.5-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:80a8d7bfdf38f87ca30a5391c0c9ce4ed2926918e017c29ddf643d0ed2778ea1", size = 4667819, upload-time = "2026-02-10T19:17:28.569Z" }, + { url = "https://files.pythonhosted.org/packages/ea/ed/325d2a490c5e94038cdb0117da9397ece1f11201f425c4e9c57fe5b9f08b/cryptography-46.0.5-cp311-abi3-win32.whl", hash = "sha256:60ee7e19e95104d4c03871d7d7dfb3d22ef8a9b9c6778c94e1c8fcc8365afd48", size = 3028230, upload-time = "2026-02-10T19:17:30.518Z" }, + { url = "https://files.pythonhosted.org/packages/e9/5a/ac0f49e48063ab4255d9e3b79f5def51697fce1a95ea1370f03dc9db76f6/cryptography-46.0.5-cp311-abi3-win_amd64.whl", hash = "sha256:38946c54b16c885c72c4f59846be9743d699eee2b69b6988e0a00a01f46a61a4", size = 3480909, upload-time = "2026-02-10T19:17:32.083Z" }, + { url = "https://files.pythonhosted.org/packages/00/13/3d278bfa7a15a96b9dc22db5a12ad1e48a9eb3d40e1827ef66a5df75d0d0/cryptography-46.0.5-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:94a76daa32eb78d61339aff7952ea819b1734b46f73646a07decb40e5b3448e2", size = 7119287, upload-time = "2026-02-10T19:17:33.801Z" }, + { url = "https://files.pythonhosted.org/packages/67/c8/581a6702e14f0898a0848105cbefd20c058099e2c2d22ef4e476dfec75d7/cryptography-46.0.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:5be7bf2fb40769e05739dd0046e7b26f9d4670badc7b032d6ce4db64dddc0678", size = 4265728, upload-time = "2026-02-10T19:17:35.569Z" }, + { url = "https://files.pythonhosted.org/packages/dd/4a/ba1a65ce8fc65435e5a849558379896c957870dd64fecea97b1ad5f46a37/cryptography-46.0.5-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fe346b143ff9685e40192a4960938545c699054ba11d4f9029f94751e3f71d87", size = 4408287, upload-time = "2026-02-10T19:17:36.938Z" }, + { url = 
"https://files.pythonhosted.org/packages/f8/67/8ffdbf7b65ed1ac224d1c2df3943553766914a8ca718747ee3871da6107e/cryptography-46.0.5-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:c69fd885df7d089548a42d5ec05be26050ebcd2283d89b3d30676eb32ff87dee", size = 4270291, upload-time = "2026-02-10T19:17:38.748Z" }, + { url = "https://files.pythonhosted.org/packages/f8/e5/f52377ee93bc2f2bba55a41a886fd208c15276ffbd2569f2ddc89d50e2c5/cryptography-46.0.5-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:8293f3dea7fc929ef7240796ba231413afa7b68ce38fd21da2995549f5961981", size = 4927539, upload-time = "2026-02-10T19:17:40.241Z" }, + { url = "https://files.pythonhosted.org/packages/3b/02/cfe39181b02419bbbbcf3abdd16c1c5c8541f03ca8bda240debc467d5a12/cryptography-46.0.5-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:1abfdb89b41c3be0365328a410baa9df3ff8a9110fb75e7b52e66803ddabc9a9", size = 4442199, upload-time = "2026-02-10T19:17:41.789Z" }, + { url = "https://files.pythonhosted.org/packages/c0/96/2fcaeb4873e536cf71421a388a6c11b5bc846e986b2b069c79363dc1648e/cryptography-46.0.5-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:d66e421495fdb797610a08f43b05269e0a5ea7f5e652a89bfd5a7d3c1dee3648", size = 3960131, upload-time = "2026-02-10T19:17:43.379Z" }, + { url = "https://files.pythonhosted.org/packages/d8/d2/b27631f401ddd644e94c5cf33c9a4069f72011821cf3dc7309546b0642a0/cryptography-46.0.5-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:4e817a8920bfbcff8940ecfd60f23d01836408242b30f1a708d93198393a80b4", size = 4270072, upload-time = "2026-02-10T19:17:45.481Z" }, + { url = "https://files.pythonhosted.org/packages/f4/a7/60d32b0370dae0b4ebe55ffa10e8599a2a59935b5ece1b9f06edb73abdeb/cryptography-46.0.5-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:68f68d13f2e1cb95163fa3b4db4bf9a159a418f5f6e7242564fc75fcae667fd0", size = 4892170, upload-time = "2026-02-10T19:17:46.997Z" }, + { url = 
"https://files.pythonhosted.org/packages/d2/b9/cf73ddf8ef1164330eb0b199a589103c363afa0cf794218c24d524a58eab/cryptography-46.0.5-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:a3d1fae9863299076f05cb8a778c467578262fae09f9dc0ee9b12eb4268ce663", size = 4441741, upload-time = "2026-02-10T19:17:48.661Z" }, + { url = "https://files.pythonhosted.org/packages/5f/eb/eee00b28c84c726fe8fa0158c65afe312d9c3b78d9d01daf700f1f6e37ff/cryptography-46.0.5-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:c4143987a42a2397f2fc3b4d7e3a7d313fbe684f67ff443999e803dd75a76826", size = 4396728, upload-time = "2026-02-10T19:17:50.058Z" }, + { url = "https://files.pythonhosted.org/packages/65/f4/6bc1a9ed5aef7145045114b75b77c2a8261b4d38717bd8dea111a63c3442/cryptography-46.0.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:7d731d4b107030987fd61a7f8ab512b25b53cef8f233a97379ede116f30eb67d", size = 4652001, upload-time = "2026-02-10T19:17:51.54Z" }, + { url = "https://files.pythonhosted.org/packages/86/ef/5d00ef966ddd71ac2e6951d278884a84a40ffbd88948ef0e294b214ae9e4/cryptography-46.0.5-cp314-cp314t-win32.whl", hash = "sha256:c3bcce8521d785d510b2aad26ae2c966092b7daa8f45dd8f44734a104dc0bc1a", size = 3003637, upload-time = "2026-02-10T19:17:52.997Z" }, + { url = "https://files.pythonhosted.org/packages/b7/57/f3f4160123da6d098db78350fdfd9705057aad21de7388eacb2401dceab9/cryptography-46.0.5-cp314-cp314t-win_amd64.whl", hash = "sha256:4d8ae8659ab18c65ced284993c2265910f6c9e650189d4e3f68445ef82a810e4", size = 3469487, upload-time = "2026-02-10T19:17:54.549Z" }, + { url = "https://files.pythonhosted.org/packages/e2/fa/a66aa722105ad6a458bebd64086ca2b72cdd361fed31763d20390f6f1389/cryptography-46.0.5-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:4108d4c09fbbf2789d0c926eb4152ae1760d5a2d97612b92d508d96c861e4d31", size = 7170514, upload-time = "2026-02-10T19:17:56.267Z" }, + { url = 
"https://files.pythonhosted.org/packages/0f/04/c85bdeab78c8bc77b701bf0d9bdcf514c044e18a46dcff330df5448631b0/cryptography-46.0.5-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7d1f30a86d2757199cb2d56e48cce14deddf1f9c95f1ef1b64ee91ea43fe2e18", size = 4275349, upload-time = "2026-02-10T19:17:58.419Z" }, + { url = "https://files.pythonhosted.org/packages/5c/32/9b87132a2f91ee7f5223b091dc963055503e9b442c98fc0b8a5ca765fab0/cryptography-46.0.5-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:039917b0dc418bb9f6edce8a906572d69e74bd330b0b3fea4f79dab7f8ddd235", size = 4420667, upload-time = "2026-02-10T19:18:00.619Z" }, + { url = "https://files.pythonhosted.org/packages/a1/a6/a7cb7010bec4b7c5692ca6f024150371b295ee1c108bdc1c400e4c44562b/cryptography-46.0.5-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:ba2a27ff02f48193fc4daeadf8ad2590516fa3d0adeeb34336b96f7fa64c1e3a", size = 4276980, upload-time = "2026-02-10T19:18:02.379Z" }, + { url = "https://files.pythonhosted.org/packages/8e/7c/c4f45e0eeff9b91e3f12dbd0e165fcf2a38847288fcfd889deea99fb7b6d/cryptography-46.0.5-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:61aa400dce22cb001a98014f647dc21cda08f7915ceb95df0c9eaf84b4b6af76", size = 4939143, upload-time = "2026-02-10T19:18:03.964Z" }, + { url = "https://files.pythonhosted.org/packages/37/19/e1b8f964a834eddb44fa1b9a9976f4e414cbb7aa62809b6760c8803d22d1/cryptography-46.0.5-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3ce58ba46e1bc2aac4f7d9290223cead56743fa6ab94a5d53292ffaac6a91614", size = 4453674, upload-time = "2026-02-10T19:18:05.588Z" }, + { url = "https://files.pythonhosted.org/packages/db/ed/db15d3956f65264ca204625597c410d420e26530c4e2943e05a0d2f24d51/cryptography-46.0.5-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:420d0e909050490d04359e7fdb5ed7e667ca5c3c402b809ae2563d7e66a92229", size = 3978801, upload-time = "2026-02-10T19:18:07.167Z" }, + { url = 
"https://files.pythonhosted.org/packages/41/e2/df40a31d82df0a70a0daf69791f91dbb70e47644c58581d654879b382d11/cryptography-46.0.5-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:582f5fcd2afa31622f317f80426a027f30dc792e9c80ffee87b993200ea115f1", size = 4276755, upload-time = "2026-02-10T19:18:09.813Z" }, + { url = "https://files.pythonhosted.org/packages/33/45/726809d1176959f4a896b86907b98ff4391a8aa29c0aaaf9450a8a10630e/cryptography-46.0.5-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:bfd56bb4b37ed4f330b82402f6f435845a5f5648edf1ad497da51a8452d5d62d", size = 4901539, upload-time = "2026-02-10T19:18:11.263Z" }, + { url = "https://files.pythonhosted.org/packages/99/0f/a3076874e9c88ecb2ecc31382f6e7c21b428ede6f55aafa1aa272613e3cd/cryptography-46.0.5-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:a3d507bb6a513ca96ba84443226af944b0f7f47dcc9a399d110cd6146481d24c", size = 4452794, upload-time = "2026-02-10T19:18:12.914Z" }, + { url = "https://files.pythonhosted.org/packages/02/ef/ffeb542d3683d24194a38f66ca17c0a4b8bf10631feef44a7ef64e631b1a/cryptography-46.0.5-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9f16fbdf4da055efb21c22d81b89f155f02ba420558db21288b3d0035bafd5f4", size = 4404160, upload-time = "2026-02-10T19:18:14.375Z" }, + { url = "https://files.pythonhosted.org/packages/96/93/682d2b43c1d5f1406ed048f377c0fc9fc8f7b0447a478d5c65ab3d3a66eb/cryptography-46.0.5-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:ced80795227d70549a411a4ab66e8ce307899fad2220ce5ab2f296e687eacde9", size = 4667123, upload-time = "2026-02-10T19:18:15.886Z" }, + { url = "https://files.pythonhosted.org/packages/45/2d/9c5f2926cb5300a8eefc3f4f0b3f3df39db7f7ce40c8365444c49363cbda/cryptography-46.0.5-cp38-abi3-win32.whl", hash = "sha256:02f547fce831f5096c9a567fd41bc12ca8f11df260959ecc7c3202555cc47a72", size = 3010220, upload-time = "2026-02-10T19:18:17.361Z" }, + { url = 
"https://files.pythonhosted.org/packages/48/ef/0c2f4a8e31018a986949d34a01115dd057bf536905dca38897bacd21fac3/cryptography-46.0.5-cp38-abi3-win_amd64.whl", hash = "sha256:556e106ee01aa13484ce9b0239bca667be5004efb0aabbed28d353df86445595", size = 3467050, upload-time = "2026-02-10T19:18:18.899Z" }, ] [[package]] @@ -1519,7 +1551,7 @@ grpc = [ [[package]] name = "gradio" -version = "6.1.0" +version = "6.9.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "aiofiles" }, @@ -1542,6 +1574,7 @@ dependencies = [ { name = "pydantic" }, { name = "pydub" }, { name = "python-multipart" }, + { name = "pytz" }, { name = "pyyaml" }, { name = "safehttpx" }, { name = "semantic-version" }, @@ -1551,14 +1584,14 @@ dependencies = [ { name = "typing-extensions" }, { name = "uvicorn" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/4b/cb/ce9c99e4026c7daefef2fe6736207a9189571ddefc1277438103e3e413f2/gradio-6.1.0.tar.gz", hash = "sha256:fe9f6757d53ce7840b487a6921151d8c3410f7de6e2152a4407c5eded9ce023a", size = 37852914, upload-time = "2025-12-09T19:31:53.996Z" } +sdist = { url = "https://files.pythonhosted.org/packages/bd/83/29bdbf94b212512e3c775482d390f5b699a72d71a2c431dea367a6e45a37/gradio-6.9.0.tar.gz", hash = "sha256:593e60e33233f3586452ebfa9f741817c5ae849a98cc70945f3ccb8dc895eb22", size = 57904480, upload-time = "2026-03-06T17:44:26.025Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/ae/00/592f02d2f8a815fc3370f3cda70fb2116d6ec31cf3fe33c87fd34d0a1778/gradio-6.1.0-py3-none-any.whl", hash = "sha256:528f17d75c8206da77a4646955678df8a786145b7bdfcba61d14b2fb3cb94b98", size = 22967810, upload-time = "2025-12-09T19:31:51.335Z" }, + { url = "https://files.pythonhosted.org/packages/b3/8b/dc357ab966544e4dc898a2fee326d755c5f54da82af71a1a802e3476e78e/gradio-6.9.0-py3-none-any.whl", hash = "sha256:c173dd330c9247002a42222c85d76c0ecee65437eff808084e360862e7bbd24f", size = 42940853, upload-time = "2026-03-06T17:44:22.009Z" }, ] [[package]] name = 
"gradio-client" -version = "2.0.1" +version = "2.3.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "fsspec" }, @@ -1567,9 +1600,9 @@ dependencies = [ { name = "packaging" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/4e/cc/b0f04b1c9bf79c7ae9840b9945f5fbd93355719684f83032837695ab1eaf/gradio_client-2.0.1.tar.gz", hash = "sha256:087eb50652370747c0ce66cd0ae79ecb49f9682188d5348e279d44602cbc2814", size = 54792, upload-time = "2025-12-02T01:57:58.685Z" } +sdist = { url = "https://files.pythonhosted.org/packages/97/d2/de2037f5eff13a5145cdf6982fd34c9735f0806e8a2ee5d4bfe9a7d25a54/gradio_client-2.3.0.tar.gz", hash = "sha256:1c700dc60e65bae4386ba7cf3732b9f9d5bcf5fb8eb451df3944fe092d7d9a29", size = 57552, upload-time = "2026-03-06T17:44:38.247Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/1b/11/d680ecf4073bd1cacfe9dea57fa95660e4ea2d1fff3125dbaaa902cc9095/gradio_client-2.0.1-py3-none-any.whl", hash = "sha256:6322eecb5963a07703306c0b048bb98518063d05ca99a65fe384417188af8c63", size = 55439, upload-time = "2025-12-02T01:57:57.551Z" }, + { url = "https://files.pythonhosted.org/packages/99/6a/41752781399811afbf8ac858f63c20eff354ed35169daa39604aefced4e8/gradio_client-2.3.0-py3-none-any.whl", hash = "sha256:9ec51a927888fc188e123a0ac5ad341d9265b325539a399554d1fc2604942e74", size = 58531, upload-time = "2026-03-06T17:44:36.961Z" }, ] [[package]] @@ -2903,7 +2936,7 @@ wheels = [ [[package]] name = "nbconvert" -version = "7.16.6" +version = "7.17.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "beautifulsoup4" }, @@ -2921,9 +2954,9 @@ dependencies = [ { name = "pygments" }, { name = "traitlets" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/a3/59/f28e15fc47ffb73af68a8d9b47367a8630d76e97ae85ad18271b9db96fdf/nbconvert-7.16.6.tar.gz", hash = "sha256:576a7e37c6480da7b8465eefa66c17844243816ce1ccc372633c6b71c3c0f582", size = 857715, upload-time = 
"2025-01-28T09:29:14.724Z" } +sdist = { url = "https://files.pythonhosted.org/packages/38/47/81f886b699450d0569f7bc551df2b1673d18df7ff25cc0c21ca36ed8a5ff/nbconvert-7.17.0.tar.gz", hash = "sha256:1b2696f1b5be12309f6c7d707c24af604b87dfaf6d950794c7b07acab96dda78", size = 862855, upload-time = "2026-01-29T16:37:48.478Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/cc/9a/cd673b2f773a12c992f41309ef81b99da1690426bd2f96957a7ade0d3ed7/nbconvert-7.16.6-py3-none-any.whl", hash = "sha256:1375a7b67e0c2883678c48e506dc320febb57685e5ee67faa51b18a90f3a712b", size = 258525, upload-time = "2025-01-28T09:29:12.551Z" }, + { url = "https://files.pythonhosted.org/packages/0d/4b/8d5f796a792f8a25f6925a96032f098789f448571eb92011df1ae59e8ea8/nbconvert-7.17.0-py3-none-any.whl", hash = "sha256:4f99a63b337b9a23504347afdab24a11faa7d86b405e5c8f9881cd313336d518", size = 261510, upload-time = "2026-01-29T16:37:46.322Z" }, ] [[package]] @@ -3201,55 +3234,55 @@ wheels = [ [[package]] name = "orjson" -version = "3.11.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/c6/fe/ed708782d6709cc60eb4c2d8a361a440661f74134675c72990f2c48c785f/orjson-3.11.4.tar.gz", hash = "sha256:39485f4ab4c9b30a3943cfe99e1a213c4776fb69e8abd68f66b83d5a0b0fdc6d", size = 5945188, upload-time = "2025-10-24T15:50:38.027Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/63/51/6b556192a04595b93e277a9ff71cd0cc06c21a7df98bcce5963fa0f5e36f/orjson-3.11.4-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:d4371de39319d05d3f482f372720b841c841b52f5385bd99c61ed69d55d9ab50", size = 243571, upload-time = "2025-10-24T15:49:10.008Z" }, - { url = "https://files.pythonhosted.org/packages/1c/2c/2602392ddf2601d538ff11848b98621cd465d1a1ceb9db9e8043181f2f7b/orjson-3.11.4-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:e41fd3b3cac850eaae78232f37325ed7d7436e11c471246b87b2cd294ec94853", size = 128891, upload-time 
= "2025-10-24T15:49:11.297Z" }, - { url = "https://files.pythonhosted.org/packages/4e/47/bf85dcf95f7a3a12bf223394a4f849430acd82633848d52def09fa3f46ad/orjson-3.11.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:600e0e9ca042878c7fdf189cf1b028fe2c1418cc9195f6cb9824eb6ed99cb938", size = 130137, upload-time = "2025-10-24T15:49:12.544Z" }, - { url = "https://files.pythonhosted.org/packages/b4/4d/a0cb31007f3ab6f1fd2a1b17057c7c349bc2baf8921a85c0180cc7be8011/orjson-3.11.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7bbf9b333f1568ef5da42bc96e18bf30fd7f8d54e9ae066d711056add508e415", size = 129152, upload-time = "2025-10-24T15:49:13.754Z" }, - { url = "https://files.pythonhosted.org/packages/f7/ef/2811def7ce3d8576b19e3929fff8f8f0d44bc5eb2e0fdecb2e6e6cc6c720/orjson-3.11.4-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4806363144bb6e7297b8e95870e78d30a649fdc4e23fc84daa80c8ebd366ce44", size = 136834, upload-time = "2025-10-24T15:49:15.307Z" }, - { url = "https://files.pythonhosted.org/packages/00/d4/9aee9e54f1809cec8ed5abd9bc31e8a9631d19460e3b8470145d25140106/orjson-3.11.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad355e8308493f527d41154e9053b86a5be892b3b359a5c6d5d95cda23601cb2", size = 137519, upload-time = "2025-10-24T15:49:16.557Z" }, - { url = "https://files.pythonhosted.org/packages/db/ea/67bfdb5465d5679e8ae8d68c11753aaf4f47e3e7264bad66dc2f2249e643/orjson-3.11.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c8a7517482667fb9f0ff1b2f16fe5829296ed7a655d04d68cd9711a4d8a4e708", size = 136749, upload-time = "2025-10-24T15:49:17.796Z" }, - { url = "https://files.pythonhosted.org/packages/01/7e/62517dddcfce6d53a39543cd74d0dccfcbdf53967017c58af68822100272/orjson-3.11.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:97eb5942c7395a171cbfecc4ef6701fc3c403e762194683772df4c54cfbb2210", size = 136325, 
upload-time = "2025-10-24T15:49:19.347Z" }, - { url = "https://files.pythonhosted.org/packages/18/ae/40516739f99ab4c7ec3aaa5cc242d341fcb03a45d89edeeaabc5f69cb2cf/orjson-3.11.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:149d95d5e018bdd822e3f38c103b1a7c91f88d38a88aada5c4e9b3a73a244241", size = 140204, upload-time = "2025-10-24T15:49:20.545Z" }, - { url = "https://files.pythonhosted.org/packages/82/18/ff5734365623a8916e3a4037fcef1cd1782bfc14cf0992afe7940c5320bf/orjson-3.11.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:624f3951181eb46fc47dea3d221554e98784c823e7069edb5dbd0dc826ac909b", size = 406242, upload-time = "2025-10-24T15:49:21.884Z" }, - { url = "https://files.pythonhosted.org/packages/e1/43/96436041f0a0c8c8deca6a05ebeaf529bf1de04839f93ac5e7c479807aec/orjson-3.11.4-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:03bfa548cf35e3f8b3a96c4e8e41f753c686ff3d8e182ce275b1751deddab58c", size = 150013, upload-time = "2025-10-24T15:49:23.185Z" }, - { url = "https://files.pythonhosted.org/packages/1b/48/78302d98423ed8780479a1e682b9aecb869e8404545d999d34fa486e573e/orjson-3.11.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:525021896afef44a68148f6ed8a8bf8375553d6066c7f48537657f64823565b9", size = 139951, upload-time = "2025-10-24T15:49:24.428Z" }, - { url = "https://files.pythonhosted.org/packages/4a/7b/ad613fdcdaa812f075ec0875143c3d37f8654457d2af17703905425981bf/orjson-3.11.4-cp312-cp312-win32.whl", hash = "sha256:b58430396687ce0f7d9eeb3dd47761ca7d8fda8e9eb92b3077a7a353a75efefa", size = 136049, upload-time = "2025-10-24T15:49:25.973Z" }, - { url = "https://files.pythonhosted.org/packages/b9/3c/9cf47c3ff5f39b8350fb21ba65d789b6a1129d4cbb3033ba36c8a9023520/orjson-3.11.4-cp312-cp312-win_amd64.whl", hash = "sha256:c6dbf422894e1e3c80a177133c0dda260f81428f9de16d61041949f6a2e5c140", size = 131461, upload-time = "2025-10-24T15:49:27.259Z" }, - { url = 
"https://files.pythonhosted.org/packages/c6/3b/e2425f61e5825dc5b08c2a5a2b3af387eaaca22a12b9c8c01504f8614c36/orjson-3.11.4-cp312-cp312-win_arm64.whl", hash = "sha256:d38d2bc06d6415852224fcc9c0bfa834c25431e466dc319f0edd56cca81aa96e", size = 126167, upload-time = "2025-10-24T15:49:28.511Z" }, - { url = "https://files.pythonhosted.org/packages/23/15/c52aa7112006b0f3d6180386c3a46ae057f932ab3425bc6f6ac50431cca1/orjson-3.11.4-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:2d6737d0e616a6e053c8b4acc9eccea6b6cce078533666f32d140e4f85002534", size = 243525, upload-time = "2025-10-24T15:49:29.737Z" }, - { url = "https://files.pythonhosted.org/packages/ec/38/05340734c33b933fd114f161f25a04e651b0c7c33ab95e9416ade5cb44b8/orjson-3.11.4-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:afb14052690aa328cc118a8e09f07c651d301a72e44920b887c519b313d892ff", size = 128871, upload-time = "2025-10-24T15:49:31.109Z" }, - { url = "https://files.pythonhosted.org/packages/55/b9/ae8d34899ff0c012039b5a7cb96a389b2476e917733294e498586b45472d/orjson-3.11.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38aa9e65c591febb1b0aed8da4d469eba239d434c218562df179885c94e1a3ad", size = 130055, upload-time = "2025-10-24T15:49:33.382Z" }, - { url = "https://files.pythonhosted.org/packages/33/aa/6346dd5073730451bee3681d901e3c337e7ec17342fb79659ec9794fc023/orjson-3.11.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f2cf4dfaf9163b0728d061bebc1e08631875c51cd30bf47cb9e3293bfbd7dcd5", size = 129061, upload-time = "2025-10-24T15:49:34.935Z" }, - { url = "https://files.pythonhosted.org/packages/39/e4/8eea51598f66a6c853c380979912d17ec510e8e66b280d968602e680b942/orjson-3.11.4-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:89216ff3dfdde0e4070932e126320a1752c9d9a758d6a32ec54b3b9334991a6a", size = 136541, upload-time = "2025-10-24T15:49:36.923Z" }, - { url = 
"https://files.pythonhosted.org/packages/9a/47/cb8c654fa9adcc60e99580e17c32b9e633290e6239a99efa6b885aba9dbc/orjson-3.11.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9daa26ca8e97fae0ce8aa5d80606ef8f7914e9b129b6b5df9104266f764ce436", size = 137535, upload-time = "2025-10-24T15:49:38.307Z" }, - { url = "https://files.pythonhosted.org/packages/43/92/04b8cc5c2b729f3437ee013ce14a60ab3d3001465d95c184758f19362f23/orjson-3.11.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5c8b2769dc31883c44a9cd126560327767f848eb95f99c36c9932f51090bfce9", size = 136703, upload-time = "2025-10-24T15:49:40.795Z" }, - { url = "https://files.pythonhosted.org/packages/aa/fd/d0733fcb9086b8be4ebcfcda2d0312865d17d0d9884378b7cffb29d0763f/orjson-3.11.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1469d254b9884f984026bd9b0fa5bbab477a4bfe558bba6848086f6d43eb5e73", size = 136293, upload-time = "2025-10-24T15:49:42.347Z" }, - { url = "https://files.pythonhosted.org/packages/c2/d7/3c5514e806837c210492d72ae30ccf050ce3f940f45bf085bab272699ef4/orjson-3.11.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:68e44722541983614e37117209a194e8c3ad07838ccb3127d96863c95ec7f1e0", size = 140131, upload-time = "2025-10-24T15:49:43.638Z" }, - { url = "https://files.pythonhosted.org/packages/9c/dd/ba9d32a53207babf65bd510ac4d0faaa818bd0df9a9c6f472fe7c254f2e3/orjson-3.11.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:8e7805fda9672c12be2f22ae124dcd7b03928d6c197544fe12174b86553f3196", size = 406164, upload-time = "2025-10-24T15:49:45.498Z" }, - { url = "https://files.pythonhosted.org/packages/8e/f9/f68ad68f4af7c7bde57cd514eaa2c785e500477a8bc8f834838eb696a685/orjson-3.11.4-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:04b69c14615fb4434ab867bf6f38b2d649f6f300af30a6705397e895f7aec67a", size = 149859, upload-time = "2025-10-24T15:49:46.981Z" }, - { url = 
"https://files.pythonhosted.org/packages/b6/d2/7f847761d0c26818395b3d6b21fb6bc2305d94612a35b0a30eae65a22728/orjson-3.11.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:639c3735b8ae7f970066930e58cf0ed39a852d417c24acd4a25fc0b3da3c39a6", size = 139926, upload-time = "2025-10-24T15:49:48.321Z" }, - { url = "https://files.pythonhosted.org/packages/9f/37/acd14b12dc62db9a0e1d12386271b8661faae270b22492580d5258808975/orjson-3.11.4-cp313-cp313-win32.whl", hash = "sha256:6c13879c0d2964335491463302a6ca5ad98105fc5db3565499dcb80b1b4bd839", size = 136007, upload-time = "2025-10-24T15:49:49.938Z" }, - { url = "https://files.pythonhosted.org/packages/c0/a9/967be009ddf0a1fffd7a67de9c36656b28c763659ef91352acc02cbe364c/orjson-3.11.4-cp313-cp313-win_amd64.whl", hash = "sha256:09bf242a4af98732db9f9a1ec57ca2604848e16f132e3f72edfd3c5c96de009a", size = 131314, upload-time = "2025-10-24T15:49:51.248Z" }, - { url = "https://files.pythonhosted.org/packages/cb/db/399abd6950fbd94ce125cb8cd1a968def95174792e127b0642781e040ed4/orjson-3.11.4-cp313-cp313-win_arm64.whl", hash = "sha256:a85f0adf63319d6c1ba06fb0dbf997fced64a01179cf17939a6caca662bf92de", size = 126152, upload-time = "2025-10-24T15:49:52.922Z" }, - { url = "https://files.pythonhosted.org/packages/25/e3/54ff63c093cc1697e758e4fceb53164dd2661a7d1bcd522260ba09f54533/orjson-3.11.4-cp314-cp314-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:42d43a1f552be1a112af0b21c10a5f553983c2a0938d2bbb8ecd8bc9fb572803", size = 243501, upload-time = "2025-10-24T15:49:54.288Z" }, - { url = "https://files.pythonhosted.org/packages/ac/7d/e2d1076ed2e8e0ae9badca65bf7ef22710f93887b29eaa37f09850604e09/orjson-3.11.4-cp314-cp314-macosx_15_0_arm64.whl", hash = "sha256:26a20f3fbc6c7ff2cb8e89c4c5897762c9d88cf37330c6a117312365d6781d54", size = 128862, upload-time = "2025-10-24T15:49:55.961Z" }, - { url = 
"https://files.pythonhosted.org/packages/9f/37/ca2eb40b90621faddfa9517dfe96e25f5ae4d8057a7c0cdd613c17e07b2c/orjson-3.11.4-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6e3f20be9048941c7ffa8fc523ccbd17f82e24df1549d1d1fe9317712d19938e", size = 130047, upload-time = "2025-10-24T15:49:57.406Z" }, - { url = "https://files.pythonhosted.org/packages/c7/62/1021ed35a1f2bad9040f05fa4cc4f9893410df0ba3eaa323ccf899b1c90a/orjson-3.11.4-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aac364c758dc87a52e68e349924d7e4ded348dedff553889e4d9f22f74785316", size = 129073, upload-time = "2025-10-24T15:49:58.782Z" }, - { url = "https://files.pythonhosted.org/packages/e8/3f/f84d966ec2a6fd5f73b1a707e7cd876813422ae4bf9f0145c55c9c6a0f57/orjson-3.11.4-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d5c54a6d76e3d741dcc3f2707f8eeb9ba2a791d3adbf18f900219b62942803b1", size = 136597, upload-time = "2025-10-24T15:50:00.12Z" }, - { url = "https://files.pythonhosted.org/packages/32/78/4fa0aeca65ee82bbabb49e055bd03fa4edea33f7c080c5c7b9601661ef72/orjson-3.11.4-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f28485bdca8617b79d44627f5fb04336897041dfd9fa66d383a49d09d86798bc", size = 137515, upload-time = "2025-10-24T15:50:01.57Z" }, - { url = "https://files.pythonhosted.org/packages/c1/9d/0c102e26e7fde40c4c98470796d050a2ec1953897e2c8ab0cb95b0759fa2/orjson-3.11.4-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bfc2a484cad3585e4ba61985a6062a4c2ed5c7925db6d39f1fa267c9d166487f", size = 136703, upload-time = "2025-10-24T15:50:02.944Z" }, - { url = "https://files.pythonhosted.org/packages/df/ac/2de7188705b4cdfaf0b6c97d2f7849c17d2003232f6e70df98602173f788/orjson-3.11.4-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e34dbd508cb91c54f9c9788923daca129fe5b55c5b4eebe713bf5ed3791280cf", size = 136311, upload-time = "2025-10-24T15:50:04.441Z" }, - { url = 
"https://files.pythonhosted.org/packages/e0/52/847fcd1a98407154e944feeb12e3b4d487a0e264c40191fb44d1269cbaa1/orjson-3.11.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:b13c478fa413d4b4ee606ec8e11c3b2e52683a640b006bb586b3041c2ca5f606", size = 140127, upload-time = "2025-10-24T15:50:07.398Z" }, - { url = "https://files.pythonhosted.org/packages/c1/ae/21d208f58bdb847dd4d0d9407e2929862561841baa22bdab7aea10ca088e/orjson-3.11.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:724ca721ecc8a831b319dcd72cfa370cc380db0bf94537f08f7edd0a7d4e1780", size = 406201, upload-time = "2025-10-24T15:50:08.796Z" }, - { url = "https://files.pythonhosted.org/packages/8d/55/0789d6de386c8366059db098a628e2ad8798069e94409b0d8935934cbcb9/orjson-3.11.4-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:977c393f2e44845ce1b540e19a786e9643221b3323dae190668a98672d43fb23", size = 149872, upload-time = "2025-10-24T15:50:10.234Z" }, - { url = "https://files.pythonhosted.org/packages/cc/1d/7ff81ea23310e086c17b41d78a72270d9de04481e6113dbe2ac19118f7fb/orjson-3.11.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:1e539e382cf46edec157ad66b0b0872a90d829a6b71f17cb633d6c160a223155", size = 139931, upload-time = "2025-10-24T15:50:11.623Z" }, - { url = "https://files.pythonhosted.org/packages/77/92/25b886252c50ed64be68c937b562b2f2333b45afe72d53d719e46a565a50/orjson-3.11.4-cp314-cp314-win32.whl", hash = "sha256:d63076d625babab9db5e7836118bdfa086e60f37d8a174194ae720161eb12394", size = 136065, upload-time = "2025-10-24T15:50:13.025Z" }, - { url = "https://files.pythonhosted.org/packages/63/b8/718eecf0bb7e9d64e4956afaafd23db9f04c776d445f59fe94f54bdae8f0/orjson-3.11.4-cp314-cp314-win_amd64.whl", hash = "sha256:0a54d6635fa3aaa438ae32e8570b9f0de36f3f6562c308d2a2a452e8b0592db1", size = 131310, upload-time = "2025-10-24T15:50:14.46Z" }, - { url = "https://files.pythonhosted.org/packages/1a/bf/def5e25d4d8bfce296a9a7c8248109bf58622c21618b590678f945a2c59c/orjson-3.11.4-cp314-cp314-win_arm64.whl", hash 
= "sha256:78b999999039db3cf58f6d230f524f04f75f129ba3d1ca2ed121f8657e575d3d", size = 126151, upload-time = "2025-10-24T15:50:15.878Z" }, +version = "3.11.5" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/04/b8/333fdb27840f3bf04022d21b654a35f58e15407183aeb16f3b41aa053446/orjson-3.11.5.tar.gz", hash = "sha256:82393ab47b4fe44ffd0a7659fa9cfaacc717eb617c93cde83795f14af5c2e9d5", size = 5972347, upload-time = "2025-12-06T15:55:39.458Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ef/a4/8052a029029b096a78955eadd68ab594ce2197e24ec50e6b6d2ab3f4e33b/orjson-3.11.5-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:334e5b4bff9ad101237c2d799d9fd45737752929753bf4faf4b207335a416b7d", size = 245347, upload-time = "2025-12-06T15:54:22.061Z" }, + { url = "https://files.pythonhosted.org/packages/64/67/574a7732bd9d9d79ac620c8790b4cfe0717a3d5a6eb2b539e6e8995e24a0/orjson-3.11.5-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:ff770589960a86eae279f5d8aa536196ebda8273a2a07db2a54e82b93bc86626", size = 129435, upload-time = "2025-12-06T15:54:23.615Z" }, + { url = "https://files.pythonhosted.org/packages/52/8d/544e77d7a29d90cf4d9eecd0ae801c688e7f3d1adfa2ebae5e1e94d38ab9/orjson-3.11.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed24250e55efbcb0b35bed7caaec8cedf858ab2f9f2201f17b8938c618c8ca6f", size = 132074, upload-time = "2025-12-06T15:54:24.694Z" }, + { url = "https://files.pythonhosted.org/packages/6e/57/b9f5b5b6fbff9c26f77e785baf56ae8460ef74acdb3eae4931c25b8f5ba9/orjson-3.11.5-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a66d7769e98a08a12a139049aac2f0ca3adae989817f8c43337455fbc7669b85", size = 130520, upload-time = "2025-12-06T15:54:26.185Z" }, + { url = 
"https://files.pythonhosted.org/packages/f6/6d/d34970bf9eb33f9ec7c979a262cad86076814859e54eb9a059a52f6dc13d/orjson-3.11.5-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:86cfc555bfd5794d24c6a1903e558b50644e5e68e6471d66502ce5cb5fdef3f9", size = 136209, upload-time = "2025-12-06T15:54:27.264Z" }, + { url = "https://files.pythonhosted.org/packages/e7/39/bc373b63cc0e117a105ea12e57280f83ae52fdee426890d57412432d63b3/orjson-3.11.5-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a230065027bc2a025e944f9d4714976a81e7ecfa940923283bca7bbc1f10f626", size = 139837, upload-time = "2025-12-06T15:54:28.75Z" }, + { url = "https://files.pythonhosted.org/packages/cb/aa/7c4818c8d7d324da220f4f1af55c343956003aa4d1ce1857bdc1d396ba69/orjson-3.11.5-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b29d36b60e606df01959c4b982729c8845c69d1963f88686608be9ced96dbfaa", size = 137307, upload-time = "2025-12-06T15:54:29.856Z" }, + { url = "https://files.pythonhosted.org/packages/46/bf/0993b5a056759ba65145effe3a79dd5a939d4a070eaa5da2ee3180fbb13f/orjson-3.11.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c74099c6b230d4261fdc3169d50efc09abf38ace1a42ea2f9994b1d79153d477", size = 139020, upload-time = "2025-12-06T15:54:31.024Z" }, + { url = "https://files.pythonhosted.org/packages/65/e8/83a6c95db3039e504eda60fc388f9faedbb4f6472f5aba7084e06552d9aa/orjson-3.11.5-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:e697d06ad57dd0c7a737771d470eedc18e68dfdefcdd3b7de7f33dfda5b6212e", size = 141099, upload-time = "2025-12-06T15:54:32.196Z" }, + { url = "https://files.pythonhosted.org/packages/b9/b4/24fdc024abfce31c2f6812973b0a693688037ece5dc64b7a60c1ce69e2f2/orjson-3.11.5-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:e08ca8a6c851e95aaecc32bc44a5aa75d0ad26af8cdac7c77e4ed93acf3d5b69", size = 413540, upload-time = "2025-12-06T15:54:33.361Z" }, + { url = 
"https://files.pythonhosted.org/packages/d9/37/01c0ec95d55ed0c11e4cae3e10427e479bba40c77312b63e1f9665e0737d/orjson-3.11.5-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:e8b5f96c05fce7d0218df3fdfeb962d6b8cfff7e3e20264306b46dd8b217c0f3", size = 151530, upload-time = "2025-12-06T15:54:34.6Z" }, + { url = "https://files.pythonhosted.org/packages/f9/d4/f9ebc57182705bb4bbe63f5bbe14af43722a2533135e1d2fb7affa0c355d/orjson-3.11.5-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ddbfdb5099b3e6ba6d6ea818f61997bb66de14b411357d24c4612cf1ebad08ca", size = 141863, upload-time = "2025-12-06T15:54:35.801Z" }, + { url = "https://files.pythonhosted.org/packages/0d/04/02102b8d19fdcb009d72d622bb5781e8f3fae1646bf3e18c53d1bc8115b5/orjson-3.11.5-cp312-cp312-win32.whl", hash = "sha256:9172578c4eb09dbfcf1657d43198de59b6cef4054de385365060ed50c458ac98", size = 135255, upload-time = "2025-12-06T15:54:37.209Z" }, + { url = "https://files.pythonhosted.org/packages/d4/fb/f05646c43d5450492cb387de5549f6de90a71001682c17882d9f66476af5/orjson-3.11.5-cp312-cp312-win_amd64.whl", hash = "sha256:2b91126e7b470ff2e75746f6f6ee32b9ab67b7a93c8ba1d15d3a0caaf16ec875", size = 133252, upload-time = "2025-12-06T15:54:38.401Z" }, + { url = "https://files.pythonhosted.org/packages/dc/a6/7b8c0b26ba18c793533ac1cd145e131e46fcf43952aa94c109b5b913c1f0/orjson-3.11.5-cp312-cp312-win_arm64.whl", hash = "sha256:acbc5fac7e06777555b0722b8ad5f574739e99ffe99467ed63da98f97f9ca0fe", size = 126777, upload-time = "2025-12-06T15:54:39.515Z" }, + { url = "https://files.pythonhosted.org/packages/10/43/61a77040ce59f1569edf38f0b9faadc90c8cf7e9bec2e0df51d0132c6bb7/orjson-3.11.5-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:3b01799262081a4c47c035dd77c1301d40f568f77cc7ec1bb7db5d63b0a01629", size = 245271, upload-time = "2025-12-06T15:54:40.878Z" }, + { url = 
"https://files.pythonhosted.org/packages/55/f9/0f79be617388227866d50edd2fd320cb8fb94dc1501184bb1620981a0aba/orjson-3.11.5-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:61de247948108484779f57a9f406e4c84d636fa5a59e411e6352484985e8a7c3", size = 129422, upload-time = "2025-12-06T15:54:42.403Z" }, + { url = "https://files.pythonhosted.org/packages/77/42/f1bf1549b432d4a78bfa95735b79b5dac75b65b5bb815bba86ad406ead0a/orjson-3.11.5-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:894aea2e63d4f24a7f04a1908307c738d0dce992e9249e744b8f4e8dd9197f39", size = 132060, upload-time = "2025-12-06T15:54:43.531Z" }, + { url = "https://files.pythonhosted.org/packages/25/49/825aa6b929f1a6ed244c78acd7b22c1481fd7e5fda047dc8bf4c1a807eb6/orjson-3.11.5-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ddc21521598dbe369d83d4d40338e23d4101dad21dae0e79fa20465dbace019f", size = 130391, upload-time = "2025-12-06T15:54:45.059Z" }, + { url = "https://files.pythonhosted.org/packages/42/ec/de55391858b49e16e1aa8f0bbbb7e5997b7345d8e984a2dec3746d13065b/orjson-3.11.5-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7cce16ae2f5fb2c53c3eafdd1706cb7b6530a67cc1c17abe8ec747f5cd7c0c51", size = 135964, upload-time = "2025-12-06T15:54:46.576Z" }, + { url = "https://files.pythonhosted.org/packages/1c/40/820bc63121d2d28818556a2d0a09384a9f0262407cf9fa305e091a8048df/orjson-3.11.5-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e46c762d9f0e1cfb4ccc8515de7f349abbc95b59cb5a2bd68df5973fdef913f8", size = 139817, upload-time = "2025-12-06T15:54:48.084Z" }, + { url = "https://files.pythonhosted.org/packages/09/c7/3a445ca9a84a0d59d26365fd8898ff52bdfcdcb825bcc6519830371d2364/orjson-3.11.5-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d7345c759276b798ccd6d77a87136029e71e66a8bbf2d2755cbdde1d82e78706", size = 137336, upload-time = "2025-12-06T15:54:49.426Z" }, + { url = 
"https://files.pythonhosted.org/packages/9a/b3/dc0d3771f2e5d1f13368f56b339c6782f955c6a20b50465a91acb79fe961/orjson-3.11.5-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75bc2e59e6a2ac1dd28901d07115abdebc4563b5b07dd612bf64260a201b1c7f", size = 138993, upload-time = "2025-12-06T15:54:50.939Z" }, + { url = "https://files.pythonhosted.org/packages/d1/a2/65267e959de6abe23444659b6e19c888f242bf7725ff927e2292776f6b89/orjson-3.11.5-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:54aae9b654554c3b4edd61896b978568c6daa16af96fa4681c9b5babd469f863", size = 141070, upload-time = "2025-12-06T15:54:52.414Z" }, + { url = "https://files.pythonhosted.org/packages/63/c9/da44a321b288727a322c6ab17e1754195708786a04f4f9d2220a5076a649/orjson-3.11.5-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:4bdd8d164a871c4ec773f9de0f6fe8769c2d6727879c37a9666ba4183b7f8228", size = 413505, upload-time = "2025-12-06T15:54:53.67Z" }, + { url = "https://files.pythonhosted.org/packages/7f/17/68dc14fa7000eefb3d4d6d7326a190c99bb65e319f02747ef3ebf2452f12/orjson-3.11.5-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:a261fef929bcf98a60713bf5e95ad067cea16ae345d9a35034e73c3990e927d2", size = 151342, upload-time = "2025-12-06T15:54:55.113Z" }, + { url = "https://files.pythonhosted.org/packages/c4/c5/ccee774b67225bed630a57478529fc026eda33d94fe4c0eac8fe58d4aa52/orjson-3.11.5-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:c028a394c766693c5c9909dec76b24f37e6a1b91999e8d0c0d5feecbe93c3e05", size = 141823, upload-time = "2025-12-06T15:54:56.331Z" }, + { url = "https://files.pythonhosted.org/packages/67/80/5d00e4155d0cd7390ae2087130637671da713959bb558db9bac5e6f6b042/orjson-3.11.5-cp313-cp313-win32.whl", hash = "sha256:2cc79aaad1dfabe1bd2d50ee09814a1253164b3da4c00a78c458d82d04b3bdef", size = 135236, upload-time = "2025-12-06T15:54:57.507Z" }, + { url = 
"https://files.pythonhosted.org/packages/95/fe/792cc06a84808dbdc20ac6eab6811c53091b42f8e51ecebf14b540e9cfe4/orjson-3.11.5-cp313-cp313-win_amd64.whl", hash = "sha256:ff7877d376add4e16b274e35a3f58b7f37b362abf4aa31863dadacdd20e3a583", size = 133167, upload-time = "2025-12-06T15:54:58.71Z" }, + { url = "https://files.pythonhosted.org/packages/46/2c/d158bd8b50e3b1cfdcf406a7e463f6ffe3f0d167b99634717acdaf5e299f/orjson-3.11.5-cp313-cp313-win_arm64.whl", hash = "sha256:59ac72ea775c88b163ba8d21b0177628bd015c5dd060647bbab6e22da3aad287", size = 126712, upload-time = "2025-12-06T15:54:59.892Z" }, + { url = "https://files.pythonhosted.org/packages/c2/60/77d7b839e317ead7bb225d55bb50f7ea75f47afc489c81199befc5435b50/orjson-3.11.5-cp314-cp314-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:e446a8ea0a4c366ceafc7d97067bfd55292969143b57e3c846d87fc701e797a0", size = 245252, upload-time = "2025-12-06T15:55:01.127Z" }, + { url = "https://files.pythonhosted.org/packages/f1/aa/d4639163b400f8044cef0fb9aa51b0337be0da3a27187a20d1166e742370/orjson-3.11.5-cp314-cp314-macosx_15_0_arm64.whl", hash = "sha256:53deb5addae9c22bbe3739298f5f2196afa881ea75944e7720681c7080909a81", size = 129419, upload-time = "2025-12-06T15:55:02.723Z" }, + { url = "https://files.pythonhosted.org/packages/30/94/9eabf94f2e11c671111139edf5ec410d2f21e6feee717804f7e8872d883f/orjson-3.11.5-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:82cd00d49d6063d2b8791da5d4f9d20539c5951f965e45ccf4e96d33505ce68f", size = 132050, upload-time = "2025-12-06T15:55:03.918Z" }, + { url = "https://files.pythonhosted.org/packages/3d/c8/ca10f5c5322f341ea9a9f1097e140be17a88f88d1cfdd29df522970d9744/orjson-3.11.5-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3fd15f9fc8c203aeceff4fda211157fad114dde66e92e24097b3647a08f4ee9e", size = 130370, upload-time = "2025-12-06T15:55:05.173Z" }, + { url = 
"https://files.pythonhosted.org/packages/25/d4/e96824476d361ee2edd5c6290ceb8d7edf88d81148a6ce172fc00278ca7f/orjson-3.11.5-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9df95000fbe6777bf9820ae82ab7578e8662051bb5f83d71a28992f539d2cda7", size = 136012, upload-time = "2025-12-06T15:55:06.402Z" }, + { url = "https://files.pythonhosted.org/packages/85/8e/9bc3423308c425c588903f2d103cfcfe2539e07a25d6522900645a6f257f/orjson-3.11.5-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:92a8d676748fca47ade5bc3da7430ed7767afe51b2f8100e3cd65e151c0eaceb", size = 139809, upload-time = "2025-12-06T15:55:07.656Z" }, + { url = "https://files.pythonhosted.org/packages/e9/3c/b404e94e0b02a232b957c54643ce68d0268dacb67ac33ffdee24008c8b27/orjson-3.11.5-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:aa0f513be38b40234c77975e68805506cad5d57b3dfd8fe3baa7f4f4051e15b4", size = 137332, upload-time = "2025-12-06T15:55:08.961Z" }, + { url = "https://files.pythonhosted.org/packages/51/30/cc2d69d5ce0ad9b84811cdf4a0cd5362ac27205a921da524ff42f26d65e0/orjson-3.11.5-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa1863e75b92891f553b7922ce4ee10ed06db061e104f2b7815de80cdcb135ad", size = 138983, upload-time = "2025-12-06T15:55:10.595Z" }, + { url = "https://files.pythonhosted.org/packages/0e/87/de3223944a3e297d4707d2fe3b1ffb71437550e165eaf0ca8bbe43ccbcb1/orjson-3.11.5-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:d4be86b58e9ea262617b8ca6251a2f0d63cc132a6da4b5fcc8e0a4128782c829", size = 141069, upload-time = "2025-12-06T15:55:11.832Z" }, + { url = "https://files.pythonhosted.org/packages/65/30/81d5087ae74be33bcae3ff2d80f5ccaa4a8fedc6d39bf65a427a95b8977f/orjson-3.11.5-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:b923c1c13fa02084eb38c9c065afd860a5cff58026813319a06949c3af5732ac", size = 413491, upload-time = "2025-12-06T15:55:13.314Z" }, + { url = 
"https://files.pythonhosted.org/packages/d0/6f/f6058c21e2fc1efaf918986dbc2da5cd38044f1a2d4b7b91ad17c4acf786/orjson-3.11.5-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:1b6bd351202b2cd987f35a13b5e16471cf4d952b42a73c391cc537974c43ef6d", size = 151375, upload-time = "2025-12-06T15:55:14.715Z" }, + { url = "https://files.pythonhosted.org/packages/54/92/c6921f17d45e110892899a7a563a925b2273d929959ce2ad89e2525b885b/orjson-3.11.5-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:bb150d529637d541e6af06bbe3d02f5498d628b7f98267ff87647584293ab439", size = 141850, upload-time = "2025-12-06T15:55:15.94Z" }, + { url = "https://files.pythonhosted.org/packages/88/86/cdecb0140a05e1a477b81f24739da93b25070ee01ce7f7242f44a6437594/orjson-3.11.5-cp314-cp314-win32.whl", hash = "sha256:9cc1e55c884921434a84a0c3dd2699eb9f92e7b441d7f53f3941079ec6ce7499", size = 135278, upload-time = "2025-12-06T15:55:17.202Z" }, + { url = "https://files.pythonhosted.org/packages/e4/97/b638d69b1e947d24f6109216997e38922d54dcdcdb1b11c18d7efd2d3c59/orjson-3.11.5-cp314-cp314-win_amd64.whl", hash = "sha256:a4f3cb2d874e03bc7767c8f88adaa1a9a05cecea3712649c3b58589ec7317310", size = 133170, upload-time = "2025-12-06T15:55:18.468Z" }, + { url = "https://files.pythonhosted.org/packages/8f/dd/f4fff4a6fe601b4f8f3ba3aa6da8ac33d17d124491a3b804c662a70e1636/orjson-3.11.5-cp314-cp314-win_arm64.whl", hash = "sha256:38b22f476c351f9a1c43e5b07d8b5a02eb24a6ab8e75f700f7d479d4568346a5", size = 126713, upload-time = "2025-12-06T15:55:19.738Z" }, ] [[package]] @@ -3367,77 +3400,80 @@ wheels = [ [[package]] name = "pillow" -version = "11.3.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f3/0d/d0d6dea55cd152ce3d6767bb38a8fc10e33796ba4ba210cbab9354b6d238/pillow-11.3.0.tar.gz", hash = "sha256:3828ee7586cd0b2091b6209e5ad53e20d0649bbe87164a459d0676e035e8f523", size = 47113069, upload-time = "2025-07-01T09:16:30.666Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/40/fe/1bc9b3ee13f68487a99ac9529968035cca2f0a51ec36892060edcc51d06a/pillow-11.3.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdae223722da47b024b867c1ea0be64e0df702c5e0a60e27daad39bf960dd1e4", size = 5278800, upload-time = "2025-07-01T09:14:17.648Z" }, - { url = "https://files.pythonhosted.org/packages/2c/32/7e2ac19b5713657384cec55f89065fb306b06af008cfd87e572035b27119/pillow-11.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:921bd305b10e82b4d1f5e802b6850677f965d8394203d182f078873851dada69", size = 4686296, upload-time = "2025-07-01T09:14:19.828Z" }, - { url = "https://files.pythonhosted.org/packages/8e/1e/b9e12bbe6e4c2220effebc09ea0923a07a6da1e1f1bfbc8d7d29a01ce32b/pillow-11.3.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:eb76541cba2f958032d79d143b98a3a6b3ea87f0959bbe256c0b5e416599fd5d", size = 5871726, upload-time = "2025-07-03T13:10:04.448Z" }, - { url = "https://files.pythonhosted.org/packages/8d/33/e9200d2bd7ba00dc3ddb78df1198a6e80d7669cce6c2bdbeb2530a74ec58/pillow-11.3.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:67172f2944ebba3d4a7b54f2e95c786a3a50c21b88456329314caaa28cda70f6", size = 7644652, upload-time = "2025-07-03T13:10:10.391Z" }, - { url = "https://files.pythonhosted.org/packages/41/f1/6f2427a26fc683e00d985bc391bdd76d8dd4e92fac33d841127eb8fb2313/pillow-11.3.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97f07ed9f56a3b9b5f49d3661dc9607484e85c67e27f3e8be2c7d28ca032fec7", size = 5977787, upload-time = "2025-07-01T09:14:21.63Z" }, - { url = "https://files.pythonhosted.org/packages/e4/c9/06dd4a38974e24f932ff5f98ea3c546ce3f8c995d3f0985f8e5ba48bba19/pillow-11.3.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:676b2815362456b5b3216b4fd5bd89d362100dc6f4945154ff172e206a22c024", size = 6645236, upload-time = "2025-07-01T09:14:23.321Z" }, - { url = 
"https://files.pythonhosted.org/packages/40/e7/848f69fb79843b3d91241bad658e9c14f39a32f71a301bcd1d139416d1be/pillow-11.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3e184b2f26ff146363dd07bde8b711833d7b0202e27d13540bfe2e35a323a809", size = 6086950, upload-time = "2025-07-01T09:14:25.237Z" }, - { url = "https://files.pythonhosted.org/packages/0b/1a/7cff92e695a2a29ac1958c2a0fe4c0b2393b60aac13b04a4fe2735cad52d/pillow-11.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6be31e3fc9a621e071bc17bb7de63b85cbe0bfae91bb0363c893cbe67247780d", size = 6723358, upload-time = "2025-07-01T09:14:27.053Z" }, - { url = "https://files.pythonhosted.org/packages/26/7d/73699ad77895f69edff76b0f332acc3d497f22f5d75e5360f78cbcaff248/pillow-11.3.0-cp312-cp312-win32.whl", hash = "sha256:7b161756381f0918e05e7cb8a371fff367e807770f8fe92ecb20d905d0e1c149", size = 6275079, upload-time = "2025-07-01T09:14:30.104Z" }, - { url = "https://files.pythonhosted.org/packages/8c/ce/e7dfc873bdd9828f3b6e5c2bbb74e47a98ec23cc5c74fc4e54462f0d9204/pillow-11.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:a6444696fce635783440b7f7a9fc24b3ad10a9ea3f0ab66c5905be1c19ccf17d", size = 6986324, upload-time = "2025-07-01T09:14:31.899Z" }, - { url = "https://files.pythonhosted.org/packages/16/8f/b13447d1bf0b1f7467ce7d86f6e6edf66c0ad7cf44cf5c87a37f9bed9936/pillow-11.3.0-cp312-cp312-win_arm64.whl", hash = "sha256:2aceea54f957dd4448264f9bf40875da0415c83eb85f55069d89c0ed436e3542", size = 2423067, upload-time = "2025-07-01T09:14:33.709Z" }, - { url = "https://files.pythonhosted.org/packages/1e/93/0952f2ed8db3a5a4c7a11f91965d6184ebc8cd7cbb7941a260d5f018cd2d/pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:1c627742b539bba4309df89171356fcb3cc5a9178355b2727d1b74a6cf155fbd", size = 2128328, upload-time = "2025-07-01T09:14:35.276Z" }, - { url = 
"https://files.pythonhosted.org/packages/4b/e8/100c3d114b1a0bf4042f27e0f87d2f25e857e838034e98ca98fe7b8c0a9c/pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:30b7c02f3899d10f13d7a48163c8969e4e653f8b43416d23d13d1bbfdc93b9f8", size = 2170652, upload-time = "2025-07-01T09:14:37.203Z" }, - { url = "https://files.pythonhosted.org/packages/aa/86/3f758a28a6e381758545f7cdb4942e1cb79abd271bea932998fc0db93cb6/pillow-11.3.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:7859a4cc7c9295f5838015d8cc0a9c215b77e43d07a25e460f35cf516df8626f", size = 2227443, upload-time = "2025-07-01T09:14:39.344Z" }, - { url = "https://files.pythonhosted.org/packages/01/f4/91d5b3ffa718df2f53b0dc109877993e511f4fd055d7e9508682e8aba092/pillow-11.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ec1ee50470b0d050984394423d96325b744d55c701a439d2bd66089bff963d3c", size = 5278474, upload-time = "2025-07-01T09:14:41.843Z" }, - { url = "https://files.pythonhosted.org/packages/f9/0e/37d7d3eca6c879fbd9dba21268427dffda1ab00d4eb05b32923d4fbe3b12/pillow-11.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7db51d222548ccfd274e4572fdbf3e810a5e66b00608862f947b163e613b67dd", size = 4686038, upload-time = "2025-07-01T09:14:44.008Z" }, - { url = "https://files.pythonhosted.org/packages/ff/b0/3426e5c7f6565e752d81221af9d3676fdbb4f352317ceafd42899aaf5d8a/pillow-11.3.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2d6fcc902a24ac74495df63faad1884282239265c6839a0a6416d33faedfae7e", size = 5864407, upload-time = "2025-07-03T13:10:15.628Z" }, - { url = "https://files.pythonhosted.org/packages/fc/c1/c6c423134229f2a221ee53f838d4be9d82bab86f7e2f8e75e47b6bf6cd77/pillow-11.3.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f0f5d8f4a08090c6d6d578351a2b91acf519a54986c055af27e7a93feae6d3f1", size = 7639094, upload-time = "2025-07-03T13:10:21.857Z" }, - { url = 
"https://files.pythonhosted.org/packages/ba/c9/09e6746630fe6372c67c648ff9deae52a2bc20897d51fa293571977ceb5d/pillow-11.3.0-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c37d8ba9411d6003bba9e518db0db0c58a680ab9fe5179f040b0463644bc9805", size = 5973503, upload-time = "2025-07-01T09:14:45.698Z" }, - { url = "https://files.pythonhosted.org/packages/d5/1c/a2a29649c0b1983d3ef57ee87a66487fdeb45132df66ab30dd37f7dbe162/pillow-11.3.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:13f87d581e71d9189ab21fe0efb5a23e9f28552d5be6979e84001d3b8505abe8", size = 6642574, upload-time = "2025-07-01T09:14:47.415Z" }, - { url = "https://files.pythonhosted.org/packages/36/de/d5cc31cc4b055b6c6fd990e3e7f0f8aaf36229a2698501bcb0cdf67c7146/pillow-11.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:023f6d2d11784a465f09fd09a34b150ea4672e85fb3d05931d89f373ab14abb2", size = 6084060, upload-time = "2025-07-01T09:14:49.636Z" }, - { url = "https://files.pythonhosted.org/packages/d5/ea/502d938cbaeec836ac28a9b730193716f0114c41325db428e6b280513f09/pillow-11.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:45dfc51ac5975b938e9809451c51734124e73b04d0f0ac621649821a63852e7b", size = 6721407, upload-time = "2025-07-01T09:14:51.962Z" }, - { url = "https://files.pythonhosted.org/packages/45/9c/9c5e2a73f125f6cbc59cc7087c8f2d649a7ae453f83bd0362ff7c9e2aee2/pillow-11.3.0-cp313-cp313-win32.whl", hash = "sha256:a4d336baed65d50d37b88ca5b60c0fa9d81e3a87d4a7930d3880d1624d5b31f3", size = 6273841, upload-time = "2025-07-01T09:14:54.142Z" }, - { url = "https://files.pythonhosted.org/packages/23/85/397c73524e0cd212067e0c969aa245b01d50183439550d24d9f55781b776/pillow-11.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:0bce5c4fd0921f99d2e858dc4d4d64193407e1b99478bc5cacecba2311abde51", size = 6978450, upload-time = "2025-07-01T09:14:56.436Z" }, - { url = 
"https://files.pythonhosted.org/packages/17/d2/622f4547f69cd173955194b78e4d19ca4935a1b0f03a302d655c9f6aae65/pillow-11.3.0-cp313-cp313-win_arm64.whl", hash = "sha256:1904e1264881f682f02b7f8167935cce37bc97db457f8e7849dc3a6a52b99580", size = 2423055, upload-time = "2025-07-01T09:14:58.072Z" }, - { url = "https://files.pythonhosted.org/packages/dd/80/a8a2ac21dda2e82480852978416cfacd439a4b490a501a288ecf4fe2532d/pillow-11.3.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4c834a3921375c48ee6b9624061076bc0a32a60b5532b322cc0ea64e639dd50e", size = 5281110, upload-time = "2025-07-01T09:14:59.79Z" }, - { url = "https://files.pythonhosted.org/packages/44/d6/b79754ca790f315918732e18f82a8146d33bcd7f4494380457ea89eb883d/pillow-11.3.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:5e05688ccef30ea69b9317a9ead994b93975104a677a36a8ed8106be9260aa6d", size = 4689547, upload-time = "2025-07-01T09:15:01.648Z" }, - { url = "https://files.pythonhosted.org/packages/49/20/716b8717d331150cb00f7fdd78169c01e8e0c219732a78b0e59b6bdb2fd6/pillow-11.3.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1019b04af07fc0163e2810167918cb5add8d74674b6267616021ab558dc98ced", size = 5901554, upload-time = "2025-07-03T13:10:27.018Z" }, - { url = "https://files.pythonhosted.org/packages/74/cf/a9f3a2514a65bb071075063a96f0a5cf949c2f2fce683c15ccc83b1c1cab/pillow-11.3.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f944255db153ebb2b19c51fe85dd99ef0ce494123f21b9db4877ffdfc5590c7c", size = 7669132, upload-time = "2025-07-03T13:10:33.01Z" }, - { url = "https://files.pythonhosted.org/packages/98/3c/da78805cbdbee9cb43efe8261dd7cc0b4b93f2ac79b676c03159e9db2187/pillow-11.3.0-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1f85acb69adf2aaee8b7da124efebbdb959a104db34d3a2cb0f3793dbae422a8", size = 6005001, upload-time = "2025-07-01T09:15:03.365Z" }, - { url = 
"https://files.pythonhosted.org/packages/6c/fa/ce044b91faecf30e635321351bba32bab5a7e034c60187fe9698191aef4f/pillow-11.3.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:05f6ecbeff5005399bb48d198f098a9b4b6bdf27b8487c7f38ca16eeb070cd59", size = 6668814, upload-time = "2025-07-01T09:15:05.655Z" }, - { url = "https://files.pythonhosted.org/packages/7b/51/90f9291406d09bf93686434f9183aba27b831c10c87746ff49f127ee80cb/pillow-11.3.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a7bc6e6fd0395bc052f16b1a8670859964dbd7003bd0af2ff08342eb6e442cfe", size = 6113124, upload-time = "2025-07-01T09:15:07.358Z" }, - { url = "https://files.pythonhosted.org/packages/cd/5a/6fec59b1dfb619234f7636d4157d11fb4e196caeee220232a8d2ec48488d/pillow-11.3.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:83e1b0161c9d148125083a35c1c5a89db5b7054834fd4387499e06552035236c", size = 6747186, upload-time = "2025-07-01T09:15:09.317Z" }, - { url = "https://files.pythonhosted.org/packages/49/6b/00187a044f98255225f172de653941e61da37104a9ea60e4f6887717e2b5/pillow-11.3.0-cp313-cp313t-win32.whl", hash = "sha256:2a3117c06b8fb646639dce83694f2f9eac405472713fcb1ae887469c0d4f6788", size = 6277546, upload-time = "2025-07-01T09:15:11.311Z" }, - { url = "https://files.pythonhosted.org/packages/e8/5c/6caaba7e261c0d75bab23be79f1d06b5ad2a2ae49f028ccec801b0e853d6/pillow-11.3.0-cp313-cp313t-win_amd64.whl", hash = "sha256:857844335c95bea93fb39e0fa2726b4d9d758850b34075a7e3ff4f4fa3aa3b31", size = 6985102, upload-time = "2025-07-01T09:15:13.164Z" }, - { url = "https://files.pythonhosted.org/packages/f3/7e/b623008460c09a0cb38263c93b828c666493caee2eb34ff67f778b87e58c/pillow-11.3.0-cp313-cp313t-win_arm64.whl", hash = "sha256:8797edc41f3e8536ae4b10897ee2f637235c94f27404cac7297f7b607dd0716e", size = 2424803, upload-time = "2025-07-01T09:15:15.695Z" }, - { url = 
"https://files.pythonhosted.org/packages/73/f4/04905af42837292ed86cb1b1dabe03dce1edc008ef14c473c5c7e1443c5d/pillow-11.3.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d9da3df5f9ea2a89b81bb6087177fb1f4d1c7146d583a3fe5c672c0d94e55e12", size = 5278520, upload-time = "2025-07-01T09:15:17.429Z" }, - { url = "https://files.pythonhosted.org/packages/41/b0/33d79e377a336247df6348a54e6d2a2b85d644ca202555e3faa0cf811ecc/pillow-11.3.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0b275ff9b04df7b640c59ec5a3cb113eefd3795a8df80bac69646ef699c6981a", size = 4686116, upload-time = "2025-07-01T09:15:19.423Z" }, - { url = "https://files.pythonhosted.org/packages/49/2d/ed8bc0ab219ae8768f529597d9509d184fe8a6c4741a6864fea334d25f3f/pillow-11.3.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0743841cabd3dba6a83f38a92672cccbd69af56e3e91777b0ee7f4dba4385632", size = 5864597, upload-time = "2025-07-03T13:10:38.404Z" }, - { url = "https://files.pythonhosted.org/packages/b5/3d/b932bb4225c80b58dfadaca9d42d08d0b7064d2d1791b6a237f87f661834/pillow-11.3.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2465a69cf967b8b49ee1b96d76718cd98c4e925414ead59fdf75cf0fd07df673", size = 7638246, upload-time = "2025-07-03T13:10:44.987Z" }, - { url = "https://files.pythonhosted.org/packages/09/b5/0487044b7c096f1b48f0d7ad416472c02e0e4bf6919541b111efd3cae690/pillow-11.3.0-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41742638139424703b4d01665b807c6468e23e699e8e90cffefe291c5832b027", size = 5973336, upload-time = "2025-07-01T09:15:21.237Z" }, - { url = "https://files.pythonhosted.org/packages/a8/2d/524f9318f6cbfcc79fbc004801ea6b607ec3f843977652fdee4857a7568b/pillow-11.3.0-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:93efb0b4de7e340d99057415c749175e24c8864302369e05914682ba642e5d77", size = 6642699, upload-time = "2025-07-01T09:15:23.186Z" }, - { url = 
"https://files.pythonhosted.org/packages/6f/d2/a9a4f280c6aefedce1e8f615baaa5474e0701d86dd6f1dede66726462bbd/pillow-11.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7966e38dcd0fa11ca390aed7c6f20454443581d758242023cf36fcb319b1a874", size = 6083789, upload-time = "2025-07-01T09:15:25.1Z" }, - { url = "https://files.pythonhosted.org/packages/fe/54/86b0cd9dbb683a9d5e960b66c7379e821a19be4ac5810e2e5a715c09a0c0/pillow-11.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:98a9afa7b9007c67ed84c57c9e0ad86a6000da96eaa638e4f8abe5b65ff83f0a", size = 6720386, upload-time = "2025-07-01T09:15:27.378Z" }, - { url = "https://files.pythonhosted.org/packages/e7/95/88efcaf384c3588e24259c4203b909cbe3e3c2d887af9e938c2022c9dd48/pillow-11.3.0-cp314-cp314-win32.whl", hash = "sha256:02a723e6bf909e7cea0dac1b0e0310be9d7650cd66222a5f1c571455c0a45214", size = 6370911, upload-time = "2025-07-01T09:15:29.294Z" }, - { url = "https://files.pythonhosted.org/packages/2e/cc/934e5820850ec5eb107e7b1a72dd278140731c669f396110ebc326f2a503/pillow-11.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:a418486160228f64dd9e9efcd132679b7a02a5f22c982c78b6fc7dab3fefb635", size = 7117383, upload-time = "2025-07-01T09:15:31.128Z" }, - { url = "https://files.pythonhosted.org/packages/d6/e9/9c0a616a71da2a5d163aa37405e8aced9a906d574b4a214bede134e731bc/pillow-11.3.0-cp314-cp314-win_arm64.whl", hash = "sha256:155658efb5e044669c08896c0c44231c5e9abcaadbc5cd3648df2f7c0b96b9a6", size = 2511385, upload-time = "2025-07-01T09:15:33.328Z" }, - { url = "https://files.pythonhosted.org/packages/1a/33/c88376898aff369658b225262cd4f2659b13e8178e7534df9e6e1fa289f6/pillow-11.3.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:59a03cdf019efbfeeed910bf79c7c93255c3d54bc45898ac2a4140071b02b4ae", size = 5281129, upload-time = "2025-07-01T09:15:35.194Z" }, - { url = "https://files.pythonhosted.org/packages/1f/70/d376247fb36f1844b42910911c83a02d5544ebd2a8bad9efcc0f707ea774/pillow-11.3.0-cp314-cp314t-macosx_11_0_arm64.whl", 
hash = "sha256:f8a5827f84d973d8636e9dc5764af4f0cf2318d26744b3d902931701b0d46653", size = 4689580, upload-time = "2025-07-01T09:15:37.114Z" }, - { url = "https://files.pythonhosted.org/packages/eb/1c/537e930496149fbac69efd2fc4329035bbe2e5475b4165439e3be9cb183b/pillow-11.3.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ee92f2fd10f4adc4b43d07ec5e779932b4eb3dbfbc34790ada5a6669bc095aa6", size = 5902860, upload-time = "2025-07-03T13:10:50.248Z" }, - { url = "https://files.pythonhosted.org/packages/bd/57/80f53264954dcefeebcf9dae6e3eb1daea1b488f0be8b8fef12f79a3eb10/pillow-11.3.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c96d333dcf42d01f47b37e0979b6bd73ec91eae18614864622d9b87bbd5bbf36", size = 7670694, upload-time = "2025-07-03T13:10:56.432Z" }, - { url = "https://files.pythonhosted.org/packages/70/ff/4727d3b71a8578b4587d9c276e90efad2d6fe0335fd76742a6da08132e8c/pillow-11.3.0-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4c96f993ab8c98460cd0c001447bff6194403e8b1d7e149ade5f00594918128b", size = 6005888, upload-time = "2025-07-01T09:15:39.436Z" }, - { url = "https://files.pythonhosted.org/packages/05/ae/716592277934f85d3be51d7256f3636672d7b1abfafdc42cf3f8cbd4b4c8/pillow-11.3.0-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:41342b64afeba938edb034d122b2dda5db2139b9a4af999729ba8818e0056477", size = 6670330, upload-time = "2025-07-01T09:15:41.269Z" }, - { url = "https://files.pythonhosted.org/packages/e7/bb/7fe6cddcc8827b01b1a9766f5fdeb7418680744f9082035bdbabecf1d57f/pillow-11.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:068d9c39a2d1b358eb9f245ce7ab1b5c3246c7c8c7d9ba58cfa5b43146c06e50", size = 6114089, upload-time = "2025-07-01T09:15:43.13Z" }, - { url = "https://files.pythonhosted.org/packages/8b/f5/06bfaa444c8e80f1a8e4bff98da9c83b37b5be3b1deaa43d27a0db37ef84/pillow-11.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = 
"sha256:a1bc6ba083b145187f648b667e05a2534ecc4b9f2784c2cbe3089e44868f2b9b", size = 6748206, upload-time = "2025-07-01T09:15:44.937Z" }, - { url = "https://files.pythonhosted.org/packages/f0/77/bc6f92a3e8e6e46c0ca78abfffec0037845800ea38c73483760362804c41/pillow-11.3.0-cp314-cp314t-win32.whl", hash = "sha256:118ca10c0d60b06d006be10a501fd6bbdfef559251ed31b794668ed569c87e12", size = 6377370, upload-time = "2025-07-01T09:15:46.673Z" }, - { url = "https://files.pythonhosted.org/packages/4a/82/3a721f7d69dca802befb8af08b7c79ebcab461007ce1c18bd91a5d5896f9/pillow-11.3.0-cp314-cp314t-win_amd64.whl", hash = "sha256:8924748b688aa210d79883357d102cd64690e56b923a186f35a82cbc10f997db", size = 7121500, upload-time = "2025-07-01T09:15:48.512Z" }, - { url = "https://files.pythonhosted.org/packages/89/c7/5572fa4a3f45740eaab6ae86fcdf7195b55beac1371ac8c619d880cfe948/pillow-11.3.0-cp314-cp314t-win_arm64.whl", hash = "sha256:79ea0d14d3ebad43ec77ad5272e6ff9bba5b679ef73375ea760261207fa8e0aa", size = 2512835, upload-time = "2025-07-01T09:15:50.399Z" }, +version = "12.1.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/1f/42/5c74462b4fd957fcd7b13b04fb3205ff8349236ea74c7c375766d6c82288/pillow-12.1.1.tar.gz", hash = "sha256:9ad8fa5937ab05218e2b6a4cff30295ad35afd2f83ac592e68c0d871bb0fdbc4", size = 46980264, upload-time = "2026-02-11T04:23:07.146Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/07/d3/8df65da0d4df36b094351dce696f2989bec731d4f10e743b1c5f4da4d3bf/pillow-12.1.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:ab323b787d6e18b3d91a72fc99b1a2c28651e4358749842b8f8dfacd28ef2052", size = 5262803, upload-time = "2026-02-11T04:20:47.653Z" }, + { url = "https://files.pythonhosted.org/packages/d6/71/5026395b290ff404b836e636f51d7297e6c83beceaa87c592718747e670f/pillow-12.1.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:adebb5bee0f0af4909c30db0d890c773d1a92ffe83da908e2e9e720f8edf3984", size = 4657601, 
upload-time = "2026-02-11T04:20:49.328Z" }, + { url = "https://files.pythonhosted.org/packages/b1/2e/1001613d941c67442f745aff0f7cc66dd8df9a9c084eb497e6a543ee6f7e/pillow-12.1.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:bb66b7cc26f50977108790e2456b7921e773f23db5630261102233eb355a3b79", size = 6234995, upload-time = "2026-02-11T04:20:51.032Z" }, + { url = "https://files.pythonhosted.org/packages/07/26/246ab11455b2549b9233dbd44d358d033a2f780fa9007b61a913c5b2d24e/pillow-12.1.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:aee2810642b2898bb187ced9b349e95d2a7272930796e022efaf12e99dccd293", size = 8045012, upload-time = "2026-02-11T04:20:52.882Z" }, + { url = "https://files.pythonhosted.org/packages/b2/8b/07587069c27be7535ac1fe33874e32de118fbd34e2a73b7f83436a88368c/pillow-12.1.1-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a0b1cd6232e2b618adcc54d9882e4e662a089d5768cd188f7c245b4c8c44a397", size = 6349638, upload-time = "2026-02-11T04:20:54.444Z" }, + { url = "https://files.pythonhosted.org/packages/ff/79/6df7b2ee763d619cda2fb4fea498e5f79d984dae304d45a8999b80d6cf5c/pillow-12.1.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7aac39bcf8d4770d089588a2e1dd111cbaa42df5a94be3114222057d68336bd0", size = 7041540, upload-time = "2026-02-11T04:20:55.97Z" }, + { url = "https://files.pythonhosted.org/packages/2c/5e/2ba19e7e7236d7529f4d873bdaf317a318896bac289abebd4bb00ef247f0/pillow-12.1.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:ab174cd7d29a62dd139c44bf74b698039328f45cb03b4596c43473a46656b2f3", size = 6462613, upload-time = "2026-02-11T04:20:57.542Z" }, + { url = "https://files.pythonhosted.org/packages/03/03/31216ec124bb5c3dacd74ce8efff4cc7f52643653bad4825f8f08c697743/pillow-12.1.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:339ffdcb7cbeaa08221cd401d517d4b1fe7a9ed5d400e4a8039719238620ca35", size = 7166745, upload-time = 
"2026-02-11T04:20:59.196Z" }, + { url = "https://files.pythonhosted.org/packages/1f/e7/7c4552d80052337eb28653b617eafdef39adfb137c49dd7e831b8dc13bc5/pillow-12.1.1-cp312-cp312-win32.whl", hash = "sha256:5d1f9575a12bed9e9eedd9a4972834b08c97a352bd17955ccdebfeca5913fa0a", size = 6328823, upload-time = "2026-02-11T04:21:01.385Z" }, + { url = "https://files.pythonhosted.org/packages/3d/17/688626d192d7261bbbf98846fc98995726bddc2c945344b65bec3a29d731/pillow-12.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:21329ec8c96c6e979cd0dfd29406c40c1d52521a90544463057d2aaa937d66a6", size = 7033367, upload-time = "2026-02-11T04:21:03.536Z" }, + { url = "https://files.pythonhosted.org/packages/ed/fe/a0ef1f73f939b0eca03ee2c108d0043a87468664770612602c63266a43c4/pillow-12.1.1-cp312-cp312-win_arm64.whl", hash = "sha256:af9a332e572978f0218686636610555ae3defd1633597be015ed50289a03c523", size = 2453811, upload-time = "2026-02-11T04:21:05.116Z" }, + { url = "https://files.pythonhosted.org/packages/d5/11/6db24d4bd7685583caeae54b7009584e38da3c3d4488ed4cd25b439de486/pillow-12.1.1-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:d242e8ac078781f1de88bf823d70c1a9b3c7950a44cdf4b7c012e22ccbcd8e4e", size = 4062689, upload-time = "2026-02-11T04:21:06.804Z" }, + { url = "https://files.pythonhosted.org/packages/33/c0/ce6d3b1fe190f0021203e0d9b5b99e57843e345f15f9ef22fcd43842fd21/pillow-12.1.1-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:02f84dfad02693676692746df05b89cf25597560db2857363a208e393429f5e9", size = 4138535, upload-time = "2026-02-11T04:21:08.452Z" }, + { url = "https://files.pythonhosted.org/packages/a0/c6/d5eb6a4fb32a3f9c21a8c7613ec706534ea1cf9f4b3663e99f0d83f6fca8/pillow-12.1.1-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:e65498daf4b583091ccbb2556c7000abf0f3349fcd57ef7adc9a84a394ed29f6", size = 3601364, upload-time = "2026-02-11T04:21:10.194Z" }, + { url = 
"https://files.pythonhosted.org/packages/14/a1/16c4b823838ba4c9c52c0e6bbda903a3fe5a1bdbf1b8eb4fff7156f3e318/pillow-12.1.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:6c6db3b84c87d48d0088943bf33440e0c42370b99b1c2a7989216f7b42eede60", size = 5262561, upload-time = "2026-02-11T04:21:11.742Z" }, + { url = "https://files.pythonhosted.org/packages/bb/ad/ad9dc98ff24f485008aa5cdedaf1a219876f6f6c42a4626c08bc4e80b120/pillow-12.1.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:8b7e5304e34942bf62e15184219a7b5ad4ff7f3bb5cca4d984f37df1a0e1aee2", size = 4657460, upload-time = "2026-02-11T04:21:13.786Z" }, + { url = "https://files.pythonhosted.org/packages/9e/1b/f1a4ea9a895b5732152789326202a82464d5254759fbacae4deea3069334/pillow-12.1.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:18e5bddd742a44b7e6b1e773ab5db102bd7a94c32555ba656e76d319d19c3850", size = 6232698, upload-time = "2026-02-11T04:21:15.949Z" }, + { url = "https://files.pythonhosted.org/packages/95/f4/86f51b8745070daf21fd2e5b1fe0eb35d4db9ca26e6d58366562fb56a743/pillow-12.1.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc44ef1f3de4f45b50ccf9136999d71abb99dca7706bc75d222ed350b9fd2289", size = 8041706, upload-time = "2026-02-11T04:21:17.723Z" }, + { url = "https://files.pythonhosted.org/packages/29/9b/d6ecd956bb1266dd1045e995cce9b8d77759e740953a1c9aad9502a0461e/pillow-12.1.1-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5a8eb7ed8d4198bccbd07058416eeec51686b498e784eda166395a23eb99138e", size = 6346621, upload-time = "2026-02-11T04:21:19.547Z" }, + { url = "https://files.pythonhosted.org/packages/71/24/538bff45bde96535d7d998c6fed1a751c75ac7c53c37c90dc2601b243893/pillow-12.1.1-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:47b94983da0c642de92ced1702c5b6c292a84bd3a8e1d1702ff923f183594717", size = 7038069, upload-time = "2026-02-11T04:21:21.378Z" }, + { url = 
"https://files.pythonhosted.org/packages/94/0e/58cb1a6bc48f746bc4cb3adb8cabff73e2742c92b3bf7a220b7cf69b9177/pillow-12.1.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:518a48c2aab7ce596d3bf79d0e275661b846e86e4d0e7dec34712c30fe07f02a", size = 6460040, upload-time = "2026-02-11T04:21:23.148Z" }, + { url = "https://files.pythonhosted.org/packages/6c/57/9045cb3ff11eeb6c1adce3b2d60d7d299d7b273a2e6c8381a524abfdc474/pillow-12.1.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a550ae29b95c6dc13cf69e2c9dc5747f814c54eeb2e32d683e5e93af56caa029", size = 7164523, upload-time = "2026-02-11T04:21:25.01Z" }, + { url = "https://files.pythonhosted.org/packages/73/f2/9be9cb99f2175f0d4dbadd6616ce1bf068ee54a28277ea1bf1fbf729c250/pillow-12.1.1-cp313-cp313-win32.whl", hash = "sha256:a003d7422449f6d1e3a34e3dd4110c22148336918ddbfc6a32581cd54b2e0b2b", size = 6332552, upload-time = "2026-02-11T04:21:27.238Z" }, + { url = "https://files.pythonhosted.org/packages/3f/eb/b0834ad8b583d7d9d42b80becff092082a1c3c156bb582590fcc973f1c7c/pillow-12.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:344cf1e3dab3be4b1fa08e449323d98a2a3f819ad20f4b22e77a0ede31f0faa1", size = 7040108, upload-time = "2026-02-11T04:21:29.462Z" }, + { url = "https://files.pythonhosted.org/packages/d5/7d/fc09634e2aabdd0feabaff4a32f4a7d97789223e7c2042fd805ea4b4d2c2/pillow-12.1.1-cp313-cp313-win_arm64.whl", hash = "sha256:5c0dd1636633e7e6a0afe7bf6a51a14992b7f8e60de5789018ebbdfae55b040a", size = 2453712, upload-time = "2026-02-11T04:21:31.072Z" }, + { url = "https://files.pythonhosted.org/packages/19/2a/b9d62794fc8a0dd14c1943df68347badbd5511103e0d04c035ffe5cf2255/pillow-12.1.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0330d233c1a0ead844fc097a7d16c0abff4c12e856c0b325f231820fee1f39da", size = 5264880, upload-time = "2026-02-11T04:21:32.865Z" }, + { url = "https://files.pythonhosted.org/packages/26/9d/e03d857d1347fa5ed9247e123fcd2a97b6220e15e9cb73ca0a8d91702c6e/pillow-12.1.1-cp313-cp313t-macosx_11_0_arm64.whl", 
hash = "sha256:5dae5f21afb91322f2ff791895ddd8889e5e947ff59f71b46041c8ce6db790bc", size = 4660616, upload-time = "2026-02-11T04:21:34.97Z" }, + { url = "https://files.pythonhosted.org/packages/f7/ec/8a6d22afd02570d30954e043f09c32772bfe143ba9285e2fdb11284952cd/pillow-12.1.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2e0c664be47252947d870ac0d327fea7e63985a08794758aa8af5b6cb6ec0c9c", size = 6269008, upload-time = "2026-02-11T04:21:36.623Z" }, + { url = "https://files.pythonhosted.org/packages/3d/1d/6d875422c9f28a4a361f495a5f68d9de4a66941dc2c619103ca335fa6446/pillow-12.1.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:691ab2ac363b8217f7d31b3497108fb1f50faab2f75dfb03284ec2f217e87bf8", size = 8073226, upload-time = "2026-02-11T04:21:38.585Z" }, + { url = "https://files.pythonhosted.org/packages/a1/cd/134b0b6ee5eda6dc09e25e24b40fdafe11a520bc725c1d0bbaa5e00bf95b/pillow-12.1.1-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e9e8064fb1cc019296958595f6db671fba95209e3ceb0c4734c9baf97de04b20", size = 6380136, upload-time = "2026-02-11T04:21:40.562Z" }, + { url = "https://files.pythonhosted.org/packages/7a/a9/7628f013f18f001c1b98d8fffe3452f306a70dc6aba7d931019e0492f45e/pillow-12.1.1-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:472a8d7ded663e6162dafdf20015c486a7009483ca671cece7a9279b512fcb13", size = 7067129, upload-time = "2026-02-11T04:21:42.521Z" }, + { url = "https://files.pythonhosted.org/packages/1e/f8/66ab30a2193b277785601e82ee2d49f68ea575d9637e5e234faaa98efa4c/pillow-12.1.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:89b54027a766529136a06cfebeecb3a04900397a3590fd252160b888479517bf", size = 6491807, upload-time = "2026-02-11T04:21:44.22Z" }, + { url = "https://files.pythonhosted.org/packages/da/0b/a877a6627dc8318fdb84e357c5e1a758c0941ab1ddffdafd231983788579/pillow-12.1.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = 
"sha256:86172b0831b82ce4f7877f280055892b31179e1576aa00d0df3bb1bbf8c3e524", size = 7190954, upload-time = "2026-02-11T04:21:46.114Z" }, + { url = "https://files.pythonhosted.org/packages/83/43/6f732ff85743cf746b1361b91665d9f5155e1483817f693f8d57ea93147f/pillow-12.1.1-cp313-cp313t-win32.whl", hash = "sha256:44ce27545b6efcf0fdbdceb31c9a5bdea9333e664cda58a7e674bb74608b3986", size = 6336441, upload-time = "2026-02-11T04:21:48.22Z" }, + { url = "https://files.pythonhosted.org/packages/3b/44/e865ef3986611bb75bfabdf94a590016ea327833f434558801122979cd0e/pillow-12.1.1-cp313-cp313t-win_amd64.whl", hash = "sha256:a285e3eb7a5a45a2ff504e31f4a8d1b12ef62e84e5411c6804a42197c1cf586c", size = 7045383, upload-time = "2026-02-11T04:21:50.015Z" }, + { url = "https://files.pythonhosted.org/packages/a8/c6/f4fb24268d0c6908b9f04143697ea18b0379490cb74ba9e8d41b898bd005/pillow-12.1.1-cp313-cp313t-win_arm64.whl", hash = "sha256:cc7d296b5ea4d29e6570dabeaed58d31c3fea35a633a69679fb03d7664f43fb3", size = 2456104, upload-time = "2026-02-11T04:21:51.633Z" }, + { url = "https://files.pythonhosted.org/packages/03/d0/bebb3ffbf31c5a8e97241476c4cf8b9828954693ce6744b4a2326af3e16b/pillow-12.1.1-cp314-cp314-ios_13_0_arm64_iphoneos.whl", hash = "sha256:417423db963cb4be8bac3fc1204fe61610f6abeed1580a7a2cbb2fbda20f12af", size = 4062652, upload-time = "2026-02-11T04:21:53.19Z" }, + { url = "https://files.pythonhosted.org/packages/2d/c0/0e16fb0addda4851445c28f8350d8c512f09de27bbb0d6d0bbf8b6709605/pillow-12.1.1-cp314-cp314-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:b957b71c6b2387610f556a7eb0828afbe40b4a98036fc0d2acfa5a44a0c2036f", size = 4138823, upload-time = "2026-02-11T04:22:03.088Z" }, + { url = "https://files.pythonhosted.org/packages/6b/fb/6170ec655d6f6bb6630a013dd7cf7bc218423d7b5fa9071bf63dc32175ae/pillow-12.1.1-cp314-cp314-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:097690ba1f2efdeb165a20469d59d8bb03c55fb6621eb2041a060ae8ea3e9642", size = 3601143, upload-time = "2026-02-11T04:22:04.909Z" 
}, + { url = "https://files.pythonhosted.org/packages/59/04/dc5c3f297510ba9a6837cbb318b87dd2b8f73eb41a43cc63767f65cb599c/pillow-12.1.1-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:2815a87ab27848db0321fb78c7f0b2c8649dee134b7f2b80c6a45c6831d75ccd", size = 5266254, upload-time = "2026-02-11T04:22:07.656Z" }, + { url = "https://files.pythonhosted.org/packages/05/30/5db1236b0d6313f03ebf97f5e17cda9ca060f524b2fcc875149a8360b21c/pillow-12.1.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:f7ed2c6543bad5a7d5530eb9e78c53132f93dfa44a28492db88b41cdab885202", size = 4657499, upload-time = "2026-02-11T04:22:09.613Z" }, + { url = "https://files.pythonhosted.org/packages/6f/18/008d2ca0eb612e81968e8be0bbae5051efba24d52debf930126d7eaacbba/pillow-12.1.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:652a2c9ccfb556235b2b501a3a7cf3742148cd22e04b5625c5fe057ea3e3191f", size = 6232137, upload-time = "2026-02-11T04:22:11.434Z" }, + { url = "https://files.pythonhosted.org/packages/70/f1/f14d5b8eeb4b2cd62b9f9f847eb6605f103df89ef619ac68f92f748614ea/pillow-12.1.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d6e4571eedf43af33d0fc233a382a76e849badbccdf1ac438841308652a08e1f", size = 8042721, upload-time = "2026-02-11T04:22:13.321Z" }, + { url = "https://files.pythonhosted.org/packages/5a/d6/17824509146e4babbdabf04d8171491fa9d776f7061ff6e727522df9bd03/pillow-12.1.1-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b574c51cf7d5d62e9be37ba446224b59a2da26dc4c1bb2ecbe936a4fb1a7cb7f", size = 6347798, upload-time = "2026-02-11T04:22:15.449Z" }, + { url = "https://files.pythonhosted.org/packages/d1/ee/c85a38a9ab92037a75615aba572c85ea51e605265036e00c5b67dfafbfe2/pillow-12.1.1-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a37691702ed687799de29a518d63d4682d9016932db66d4e90c345831b02fb4e", size = 7039315, upload-time = "2026-02-11T04:22:17.24Z" }, + { url = 
"https://files.pythonhosted.org/packages/ec/f3/bc8ccc6e08a148290d7523bde4d9a0d6c981db34631390dc6e6ec34cacf6/pillow-12.1.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f95c00d5d6700b2b890479664a06e754974848afaae5e21beb4d83c106923fd0", size = 6462360, upload-time = "2026-02-11T04:22:19.111Z" }, + { url = "https://files.pythonhosted.org/packages/f6/ab/69a42656adb1d0665ab051eec58a41f169ad295cf81ad45406963105408f/pillow-12.1.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:559b38da23606e68681337ad74622c4dbba02254fc9cb4488a305dd5975c7eeb", size = 7165438, upload-time = "2026-02-11T04:22:21.041Z" }, + { url = "https://files.pythonhosted.org/packages/02/46/81f7aa8941873f0f01d4b55cc543b0a3d03ec2ee30d617a0448bf6bd6dec/pillow-12.1.1-cp314-cp314-win32.whl", hash = "sha256:03edcc34d688572014ff223c125a3f77fb08091e4607e7745002fc214070b35f", size = 6431503, upload-time = "2026-02-11T04:22:22.833Z" }, + { url = "https://files.pythonhosted.org/packages/40/72/4c245f7d1044b67affc7f134a09ea619d4895333d35322b775b928180044/pillow-12.1.1-cp314-cp314-win_amd64.whl", hash = "sha256:50480dcd74fa63b8e78235957d302d98d98d82ccbfac4c7e12108ba9ecbdba15", size = 7176748, upload-time = "2026-02-11T04:22:24.64Z" }, + { url = "https://files.pythonhosted.org/packages/e4/ad/8a87bdbe038c5c698736e3348af5c2194ffb872ea52f11894c95f9305435/pillow-12.1.1-cp314-cp314-win_arm64.whl", hash = "sha256:5cb1785d97b0c3d1d1a16bc1d710c4a0049daefc4935f3a8f31f827f4d3d2e7f", size = 2544314, upload-time = "2026-02-11T04:22:26.685Z" }, + { url = "https://files.pythonhosted.org/packages/6c/9d/efd18493f9de13b87ede7c47e69184b9e859e4427225ea962e32e56a49bc/pillow-12.1.1-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:1f90cff8aa76835cba5769f0b3121a22bd4eb9e6884cfe338216e557a9a548b8", size = 5268612, upload-time = "2026-02-11T04:22:29.884Z" }, + { url = "https://files.pythonhosted.org/packages/f8/f1/4f42eb2b388eb2ffc660dcb7f7b556c1015c53ebd5f7f754965ef997585b/pillow-12.1.1-cp314-cp314t-macosx_11_0_arm64.whl", 
hash = "sha256:1f1be78ce9466a7ee64bfda57bdba0f7cc499d9794d518b854816c41bf0aa4e9", size = 4660567, upload-time = "2026-02-11T04:22:31.799Z" }, + { url = "https://files.pythonhosted.org/packages/01/54/df6ef130fa43e4b82e32624a7b821a2be1c5653a5fdad8469687a7db4e00/pillow-12.1.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:42fc1f4677106188ad9a55562bbade416f8b55456f522430fadab3cef7cd4e60", size = 6269951, upload-time = "2026-02-11T04:22:33.921Z" }, + { url = "https://files.pythonhosted.org/packages/a9/48/618752d06cc44bb4aae8ce0cd4e6426871929ed7b46215638088270d9b34/pillow-12.1.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:98edb152429ab62a1818039744d8fbb3ccab98a7c29fc3d5fcef158f3f1f68b7", size = 8074769, upload-time = "2026-02-11T04:22:35.877Z" }, + { url = "https://files.pythonhosted.org/packages/c3/bd/f1d71eb39a72fa088d938655afba3e00b38018d052752f435838961127d8/pillow-12.1.1-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d470ab1178551dd17fdba0fef463359c41aaa613cdcd7ff8373f54be629f9f8f", size = 6381358, upload-time = "2026-02-11T04:22:37.698Z" }, + { url = "https://files.pythonhosted.org/packages/64/ef/c784e20b96674ed36a5af839305f55616f8b4f8aa8eeccf8531a6e312243/pillow-12.1.1-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6408a7b064595afcab0a49393a413732a35788f2a5092fdc6266952ed67de586", size = 7068558, upload-time = "2026-02-11T04:22:39.597Z" }, + { url = "https://files.pythonhosted.org/packages/73/cb/8059688b74422ae61278202c4e1ad992e8a2e7375227be0a21c6b87ca8d5/pillow-12.1.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5d8c41325b382c07799a3682c1c258469ea2ff97103c53717b7893862d0c98ce", size = 6493028, upload-time = "2026-02-11T04:22:42.73Z" }, + { url = "https://files.pythonhosted.org/packages/c6/da/e3c008ed7d2dd1f905b15949325934510b9d1931e5df999bb15972756818/pillow-12.1.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = 
"sha256:c7697918b5be27424e9ce568193efd13d925c4481dd364e43f5dff72d33e10f8", size = 7191940, upload-time = "2026-02-11T04:22:44.543Z" }, + { url = "https://files.pythonhosted.org/packages/01/4a/9202e8d11714c1fc5951f2e1ef362f2d7fbc595e1f6717971d5dd750e969/pillow-12.1.1-cp314-cp314t-win32.whl", hash = "sha256:d2912fd8114fc5545aa3a4b5576512f64c55a03f3ebcca4c10194d593d43ea36", size = 6438736, upload-time = "2026-02-11T04:22:46.347Z" }, + { url = "https://files.pythonhosted.org/packages/f3/ca/cbce2327eb9885476b3957b2e82eb12c866a8b16ad77392864ad601022ce/pillow-12.1.1-cp314-cp314t-win_amd64.whl", hash = "sha256:4ceb838d4bd9dab43e06c363cab2eebf63846d6a4aeaea283bbdfd8f1a8ed58b", size = 7182894, upload-time = "2026-02-11T04:22:48.114Z" }, + { url = "https://files.pythonhosted.org/packages/ec/d2/de599c95ba0a973b94410477f8bf0b6f0b5e67360eb89bcb1ad365258beb/pillow-12.1.1-cp314-cp314t-win_arm64.whl", hash = "sha256:7b03048319bfc6170e93bd60728a1af51d3dd7704935feb228c4d4faab35d334", size = 2546446, upload-time = "2026-02-11T04:22:50.342Z" }, ] [[package]] name = "pip" -version = "25.3" +version = "26.0.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/fe/6e/74a3f0179a4a73a53d66ce57fdb4de0080a8baa1de0063de206d6167acc2/pip-25.3.tar.gz", hash = "sha256:8d0538dbbd7babbd207f261ed969c65de439f6bc9e5dbd3b3b9a77f25d95f343", size = 1803014, upload-time = "2025-10-25T00:55:41.394Z" } +sdist = { url = "https://files.pythonhosted.org/packages/48/83/0d7d4e9efe3344b8e2fe25d93be44f64b65364d3c8d7bc6dc90198d5422e/pip-26.0.1.tar.gz", hash = "sha256:c4037d8a277c89b320abe636d59f91e6d0922d08a05b60e85e53b296613346d8", size = 1812747, upload-time = "2026-02-05T02:20:18.702Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/44/3c/d717024885424591d5376220b5e836c2d5293ce2011523c9de23ff7bf068/pip-25.3-py3-none-any.whl", hash = "sha256:9655943313a94722b7774661c21049070f6bbb0a1516bf02f7c8d5d9201514cd", size = 1778622, upload-time = 
"2025-10-25T00:55:39.247Z" }, + { url = "https://files.pythonhosted.org/packages/de/f0/c81e05b613866b76d2d1066490adf1a3dbc4ee9d9c839961c3fc8a6997af/pip-26.0.1-py3-none-any.whl", hash = "sha256:bdb1b08f4274833d62c1aa29e20907365a2ceb950410df15fc9521bad440122b", size = 1787723, upload-time = "2026-02-05T02:20:16.416Z" }, ] [[package]] @@ -3651,17 +3687,17 @@ wheels = [ [[package]] name = "protobuf" -version = "6.33.0" +version = "6.33.5" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/19/ff/64a6c8f420818bb873713988ca5492cba3a7946be57e027ac63495157d97/protobuf-6.33.0.tar.gz", hash = "sha256:140303d5c8d2037730c548f8c7b93b20bb1dc301be280c378b82b8894589c954", size = 443463, upload-time = "2025-10-15T20:39:52.159Z" } +sdist = { url = "https://files.pythonhosted.org/packages/ba/25/7c72c307aafc96fa87062aa6291d9f7c94836e43214d43722e86037aac02/protobuf-6.33.5.tar.gz", hash = "sha256:6ddcac2a081f8b7b9642c09406bc6a4290128fce5f471cddd165960bb9119e5c", size = 444465, upload-time = "2026-01-29T21:51:33.494Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/7e/ee/52b3fa8feb6db4a833dfea4943e175ce645144532e8a90f72571ad85df4e/protobuf-6.33.0-cp310-abi3-win32.whl", hash = "sha256:d6101ded078042a8f17959eccd9236fb7a9ca20d3b0098bbcb91533a5680d035", size = 425593, upload-time = "2025-10-15T20:39:40.29Z" }, - { url = "https://files.pythonhosted.org/packages/7b/c6/7a465f1825872c55e0341ff4a80198743f73b69ce5d43ab18043699d1d81/protobuf-6.33.0-cp310-abi3-win_amd64.whl", hash = "sha256:9a031d10f703f03768f2743a1c403af050b6ae1f3480e9c140f39c45f81b13ee", size = 436882, upload-time = "2025-10-15T20:39:42.841Z" }, - { url = "https://files.pythonhosted.org/packages/e1/a9/b6eee662a6951b9c3640e8e452ab3e09f117d99fc10baa32d1581a0d4099/protobuf-6.33.0-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:905b07a65f1a4b72412314082c7dbfae91a9e8b68a0cc1577515f8df58ecf455", size = 427521, upload-time = "2025-10-15T20:39:43.803Z" }, - { 
url = "https://files.pythonhosted.org/packages/10/35/16d31e0f92c6d2f0e77c2a3ba93185130ea13053dd16200a57434c882f2b/protobuf-6.33.0-cp39-abi3-manylinux2014_aarch64.whl", hash = "sha256:e0697ece353e6239b90ee43a9231318302ad8353c70e6e45499fa52396debf90", size = 324445, upload-time = "2025-10-15T20:39:44.932Z" }, - { url = "https://files.pythonhosted.org/packages/e6/eb/2a981a13e35cda8b75b5585aaffae2eb904f8f351bdd3870769692acbd8a/protobuf-6.33.0-cp39-abi3-manylinux2014_s390x.whl", hash = "sha256:e0a1715e4f27355afd9570f3ea369735afc853a6c3951a6afe1f80d8569ad298", size = 339159, upload-time = "2025-10-15T20:39:46.186Z" }, - { url = "https://files.pythonhosted.org/packages/21/51/0b1cbad62074439b867b4e04cc09b93f6699d78fd191bed2bbb44562e077/protobuf-6.33.0-cp39-abi3-manylinux2014_x86_64.whl", hash = "sha256:35be49fd3f4fefa4e6e2aacc35e8b837d6703c37a2168a55ac21e9b1bc7559ef", size = 323172, upload-time = "2025-10-15T20:39:47.465Z" }, - { url = "https://files.pythonhosted.org/packages/07/d1/0a28c21707807c6aacd5dc9c3704b2aa1effbf37adebd8caeaf68b17a636/protobuf-6.33.0-py3-none-any.whl", hash = "sha256:25c9e1963c6734448ea2d308cfa610e692b801304ba0908d7bfa564ac5132995", size = 170477, upload-time = "2025-10-15T20:39:51.311Z" }, + { url = "https://files.pythonhosted.org/packages/b1/79/af92d0a8369732b027e6d6084251dd8e782c685c72da161bd4a2e00fbabb/protobuf-6.33.5-cp310-abi3-win32.whl", hash = "sha256:d71b040839446bac0f4d162e758bea99c8251161dae9d0983a3b88dee345153b", size = 425769, upload-time = "2026-01-29T21:51:21.751Z" }, + { url = "https://files.pythonhosted.org/packages/55/75/bb9bc917d10e9ee13dee8607eb9ab963b7cf8be607c46e7862c748aa2af7/protobuf-6.33.5-cp310-abi3-win_amd64.whl", hash = "sha256:3093804752167bcab3998bec9f1048baae6e29505adaf1afd14a37bddede533c", size = 437118, upload-time = "2026-01-29T21:51:24.022Z" }, + { url = "https://files.pythonhosted.org/packages/a2/6b/e48dfc1191bc5b52950246275bf4089773e91cb5ba3592621723cdddca62/protobuf-6.33.5-cp39-abi3-macosx_10_9_universal2.whl", 
hash = "sha256:a5cb85982d95d906df1e2210e58f8e4f1e3cdc088e52c921a041f9c9a0386de5", size = 427766, upload-time = "2026-01-29T21:51:25.413Z" }, + { url = "https://files.pythonhosted.org/packages/4e/b1/c79468184310de09d75095ed1314b839eb2f72df71097db9d1404a1b2717/protobuf-6.33.5-cp39-abi3-manylinux2014_aarch64.whl", hash = "sha256:9b71e0281f36f179d00cbcb119cb19dec4d14a81393e5ea220f64b286173e190", size = 324638, upload-time = "2026-01-29T21:51:26.423Z" }, + { url = "https://files.pythonhosted.org/packages/c5/f5/65d838092fd01c44d16037953fd4c2cc851e783de9b8f02b27ec4ffd906f/protobuf-6.33.5-cp39-abi3-manylinux2014_s390x.whl", hash = "sha256:8afa18e1d6d20af15b417e728e9f60f3aa108ee76f23c3b2c07a2c3b546d3afd", size = 339411, upload-time = "2026-01-29T21:51:27.446Z" }, + { url = "https://files.pythonhosted.org/packages/9b/53/a9443aa3ca9ba8724fdfa02dd1887c1bcd8e89556b715cfbacca6b63dbec/protobuf-6.33.5-cp39-abi3-manylinux2014_x86_64.whl", hash = "sha256:cbf16ba3350fb7b889fca858fb215967792dc125b35c7976ca4818bee3521cf0", size = 323465, upload-time = "2026-01-29T21:51:28.925Z" }, + { url = "https://files.pythonhosted.org/packages/57/bf/2086963c69bdac3d7cff1cc7ff79b8ce5ea0bec6797a017e1be338a46248/protobuf-6.33.5-py3-none-any.whl", hash = "sha256:69915a973dd0f60f31a08b8318b73eab2bd6a392c79184b3612226b0a3f8ec02", size = 170687, upload-time = "2026-01-29T21:51:32.557Z" }, ] [[package]] @@ -4072,11 +4108,11 @@ wheels = [ [[package]] name = "python-multipart" -version = "0.0.20" +version = "0.0.22" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f3/87/f44d7c9f274c7ee665a29b885ec97089ec5dc034c7f3fafa03da9e39a09e/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13", size = 37158, upload-time = "2024-12-16T19:45:46.972Z" } +sdist = { url = 
"https://files.pythonhosted.org/packages/94/01/979e98d542a70714b0cb2b6728ed0b7c46792b695e3eaec3e20711271ca3/python_multipart-0.0.22.tar.gz", hash = "sha256:7340bef99a7e0032613f56dc36027b959fd3b30a787ed62d310e951f7c3a3a58", size = 37612, upload-time = "2026-01-25T10:15:56.219Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546, upload-time = "2024-12-16T19:45:44.423Z" }, + { url = "https://files.pythonhosted.org/packages/1b/d0/397f9626e711ff749a95d96b7af99b9c566a9bb5129b8e4c10fc4d100304/python_multipart-0.0.22-py3-none-any.whl", hash = "sha256:2b2cd894c83d21bf49d702499531c7bafd057d730c201782048f7945d82de155", size = 24579, upload-time = "2026-01-25T10:15:54.811Z" }, ] [[package]]