Merged
26 changes: 23 additions & 3 deletions .github/workflows/test-suite.yaml
@@ -32,7 +32,26 @@ jobs:
uv run ruff check
uv run ruff format --check

# Step 2: Build (only after linting passes)
# Step 2a: Proto sub-project unit tests (runs in parallel with build)
proto-unit-tests:
runs-on: ubuntu-22.04
needs: lint-check
steps:
- uses: actions/checkout@v4

- name: Set up uv
uses: astral-sh/setup-uv@v6
with:
enable-cache: true
cache-dependency-glob: "otdf-python-proto/uv.lock"

- name: Run otdf-python-proto unit tests
working-directory: otdf-python-proto
run: |
uv sync --frozen --group dev
uv run pytest --tb=short -v tests/

# Step 2b: Build (only after linting passes)
build:
runs-on: ubuntu-22.04
needs: lint-check
@@ -99,21 +118,22 @@ jobs:

report:
runs-on: ubuntu-22.04
needs: [lint-check, build, unit-tests, integration-tests]
needs: [lint-check, proto-unit-tests, build, unit-tests, integration-tests]
if: always()
outputs:
success: ${{ steps.check.outputs.success }}
steps:
- name: Check all jobs succeeded
id: check
run: |
if [[ "${{ needs.lint-check.result }}" == "success" && "${{ needs.build.result }}" == "success" && "${{ needs.unit-tests.result }}" == "success" && "${{ needs.integration-tests.result }}" == "success" ]]; then
if [[ "${{ needs.lint-check.result }}" == "success" && "${{ needs.proto-unit-tests.result }}" == "success" && "${{ needs.build.result }}" == "success" && "${{ needs.unit-tests.result }}" == "success" && "${{ needs.integration-tests.result }}" == "success" ]]; then
echo "success=true" >> $GITHUB_OUTPUT
echo "✅ All tests passed!"
else
echo "success=false" >> $GITHUB_OUTPUT
echo "❌ Some tests failed:"
echo " Lint Check: ${{ needs.lint-check.result }}"
echo " Proto Unit Tests: ${{ needs.proto-unit-tests.result }}"
echo " Build: ${{ needs.build.result }}"
echo " Unit Tests: ${{ needs.unit-tests.result }}"
echo " Integration Tests: ${{ needs.integration-tests.result }}"
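The `report` job's bash condition above requires every upstream job to report `success`. A minimal sketch of that aggregation logic, mirrored in Python for clarity (the job names come from the workflow; the function name and dictionary shape are illustrative):

```python
# Job names taken from the workflow's `needs` list; everything else
# here is an illustrative model of the report job's success check.
REQUIRED_JOBS = [
    "lint-check",
    "proto-unit-tests",
    "build",
    "unit-tests",
    "integration-tests",
]


def all_jobs_succeeded(results: dict[str, str]) -> bool:
    """Return True only if every required job reported 'success'."""
    return all(results.get(job) == "success" for job in REQUIRED_JOBS)


if __name__ == "__main__":
    results = {job: "success" for job in REQUIRED_JOBS}
    print(all_jobs_succeeded(results))  # True
    results["proto-unit-tests"] = "failure"
    print(all_jobs_succeeded(results))  # False
```

A missing entry counts as failure, matching the workflow, where a skipped or cancelled job yields a result other than `success`.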
2 changes: 1 addition & 1 deletion docs/DEVELOPING.md
@@ -92,4 +92,4 @@ cd otdf-python-proto
uv run python scripts/generate_connect_proto.py
```

See [`otdf-python-proto/README.md`](../otdf-python-proto/README.md) and [`PROTOBUF_SETUP.md`](./PROTOBUF_SETUP.md) for details.
See [`otdf-python-proto/README.md`](../otdf-python-proto/README.md) and [`CONNECT_RPC.md`](./CONNECT_RPC.md) for details.
2 changes: 1 addition & 1 deletion otdf-python-proto/README.md
@@ -170,7 +170,7 @@ If you're migrating from traditional gRPC clients to Connect RPC:
Install buf: `brew install bufbuild/buf/buf`

### "protoc-gen-connect_python not found"
Install with compiler support: `uv add connect-python[compiler]`
Run the setup script: `uv run python scripts/setup_connect_rpc.py`

### Import errors after generation
Ensure `__init__.py` files exist in otdf_python_proto directories
4 changes: 4 additions & 0 deletions otdf-python-proto/pyproject.toml
@@ -23,8 +23,12 @@ build-backend = "hatchling.build"
[dependency-groups]
dev = [
"mypy-protobuf>=3.6.0",
"pytest>=8.0.0",
]

[tool.pytest.ini_options]
testpaths = ["tests"]

[tool.hatch.build.targets.wheel]
packages = ["src/otdf_python_proto"]

30 changes: 13 additions & 17 deletions otdf-python-proto/scripts/build_connect_proto.sh
@@ -44,28 +44,28 @@ fi

echo "✓ uv is available"

# Install dependencies if needed
# Install dependencies
echo "Installing/updating dependencies..."
cd "$PROTO_GEN_DIR"
uv sync --dev
uv sync
Comment on lines +47 to +50 (Contributor, severity: medium):

This script runs uv sync, and then if connectrpc is not found, it advises the user to run setup_connect_rpc.py, which also just runs uv sync. This is redundant and potentially confusing.

To improve clarity, I suggest removing the uv sync from this build script. The script should focus on generation and assume dependencies are already installed. The existing check for connect-python will then correctly guide the user to run the setup script if needed.

Suggested change:
-# Install dependencies
-echo "Installing/updating dependencies..."
-cd "$PROTO_GEN_DIR"
-uv sync --dev
-uv sync
+cd "$PROTO_GEN_DIR"


# Check if connect-python is available
if ! uv run python -c "import connectrpc" 2>/dev/null; then
echo "Installing connect-python[compiler]..."
uv add "connect-python[compiler]>=0.4.2"
echo "Error: connect-python is not in the installed dependencies."
echo "It may need to be added first. Run: uv run python scripts/setup_connect_rpc.py"
echo "Then re-run this script."
exit 1
fi

echo "✓ connect-python is available"

# Clean up previous generated files
OUTPUT_DIR="$PROTO_GEN_DIR/src/otdf_python_proto"
echo "Cleaning up previous generated files..."
if [[ -d "generated" ]]; then
rm -rf generated/*
if [[ -d "$OUTPUT_DIR" ]]; then
rm -rf "${OUTPUT_DIR:?}"/*
fi

# Create generated directory
mkdir -p generated

# Run the generation
echo "Generating Connect RPC protobuf files..."
uv run python scripts/generate_connect_proto.py "$@"
@@ -75,16 +75,12 @@ if [[ $? -eq 0 ]]; then
echo "✓ Connect RPC generation complete!"
echo ""
echo "Generated files:"
echo " - generated/*_pb2.py (Protobuf message classes)"
echo " - generated/*_pb2.pyi (Type stubs)"
echo " - generated/*_connect.py (Connect RPC clients)"
echo " - src/otdf_python_proto/**/*_pb2.py (Protobuf message classes)"
echo " - src/otdf_python_proto/**/*_pb2.pyi (Type stubs)"
echo " - src/otdf_python_proto/**/*_connect.py (Connect RPC clients)"
echo ""
echo "Legacy gRPC files (if generated):"
echo " - generated/legacy_grpc/*_pb2_grpc.py (gRPC stubs)"
echo ""
echo "Usage examples:"
echo " cd .."
echo " python examples/connect_rpc_client_example.py"
echo " - src/otdf_python_proto/legacy_grpc/**/*_pb2_grpc.py (gRPC stubs)"
echo ""
echo "For more information, see:"
echo " - docs/CONNECT_RPC.md"
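The revised build script fails fast instead of mutating dependencies: it probes for the package via `uv run python -c "import connectrpc"`. The same availability check can be done from Python without actually importing the module, as a sketch (the helper name is mine; only the `connectrpc` module name comes from the script):

```python
import importlib.util


def module_available(name: str) -> bool:
    """Return True if `name` can be imported, without importing it."""
    return importlib.util.find_spec(name) is not None


if __name__ == "__main__":
    # The build script gates on `connectrpc`; stdlib names are probed here
    # since connect-python may not be installed in every environment.
    print(module_available("json"))                     # True
    print(module_available("definitely_missing_mod"))   # False
```

`find_spec` consults the import machinery only, so the check has no side effects even for modules with heavy import-time initialization.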
51 changes: 29 additions & 22 deletions otdf-python-proto/scripts/generate_connect_proto.py
@@ -8,6 +8,9 @@
4. Optionally generates legacy gRPC clients for backward compatibility
"""

import argparse
import re
import shutil
import subprocess
import sys
from pathlib import Path
@@ -43,9 +46,9 @@ def check_dependencies() -> bool:
return True


def copy_opentdf_proto_files(proto_gen_dir: Path) -> bool:
def copy_opentdf_proto_files(proto_gen_dir: Path, git_tag: str | None = None) -> bool:
"""Clone OpenTDF platform repository and copy all proto files."""
GIT_TAG = "service/v0.7.2"
GIT_TAG = git_tag or "service/v0.7.2"
REPO_URL = "https://github.com/opentdf/platform.git"

temp_repo_dir = proto_gen_dir / "temp_platform_repo"
@@ -57,7 +60,7 @@ def copy_opentdf_proto_files(proto_gen_dir: Path) -> bool:
try:
# Remove existing temp directory if it exists
if temp_repo_dir.exists():
subprocess.run(["rm", "-rf", str(temp_repo_dir)], check=True)
shutil.rmtree(temp_repo_dir)

print(f"Cloning OpenTDF platform repository (tag: {GIT_TAG})...")

@@ -120,17 +123,15 @@ def copy_opentdf_proto_files(proto_gen_dir: Path) -> bool:
finally:
# Clean up temp directory
if temp_repo_dir.exists():
subprocess.run(["rm", "-rf", str(temp_repo_dir)], check=False)
shutil.rmtree(temp_repo_dir)

return False


def download_proto_files(proto_gen_dir: Path) -> bool:
def download_proto_files(proto_gen_dir: Path, git_tag: str | None = None) -> bool:
"""Download proto files from OpenTDF platform."""
print("Copying proto files from OpenTDF platform...")

try:
return copy_opentdf_proto_files(proto_gen_dir)
return copy_opentdf_proto_files(proto_gen_dir, git_tag=git_tag)
except Exception as e:
print(f"Error getting proto files: {e}")
return False
@@ -152,14 +153,15 @@ def run_buf_generate(proto_gen_dir: Path) -> bool:
connect_plugin_path = result.stdout.strip()
print(f"Using Connect plugin at: {connect_plugin_path}")

# Update buf.gen.yaml with the correct path
# Update buf.gen.yaml with the correct absolute path for the local plugin
buf_gen_path = proto_gen_dir / "buf.gen.yaml"
with buf_gen_path.open() as f:
content = f.read()

# Replace the local plugin path
updated_content = content.replace(
"- local: protoc-gen-connect-python", f"- local: {connect_plugin_path}"
updated_content = re.sub(
r"- local:\s+\S*protoc-gen-connect[_-]python\S*",
lambda _: f"- local: {connect_plugin_path}",
content,
)

with buf_gen_path.open("w") as f:
@@ -189,13 +191,9 @@

def create_init_files(generated_dir: Path) -> None:
"""Create __init__.py files in generated directories."""
# Create __init__.py in main generated directory
(generated_dir / "__init__.py").touch()

# Create __init__.py files in any subdirectories
for subdir in generated_dir.iterdir():
if subdir.is_dir():
(subdir / "__init__.py").touch()
for dirpath in [generated_dir, *generated_dir.rglob("*")]:
if dirpath.is_dir():
(dirpath / "__init__.py").touch()


def _fix_ignore_if_default_value(proto_files_dir):
@@ -245,7 +243,7 @@ def main():
# Get the proto-gen directory (parent of scripts)
proto_gen_dir = Path(__file__).parent.parent
proto_files_dir = proto_gen_dir / "proto-files"
generated_dir = proto_gen_dir / "generated"
generated_dir = proto_gen_dir / "src" / "otdf_python_proto"

# Check dependencies
if not check_dependencies():
@@ -255,10 +253,19 @@
proto_files_dir.mkdir(exist_ok=True)
generated_dir.mkdir(exist_ok=True)

# Parse arguments
parser = argparse.ArgumentParser(description="OpenTDF Connect RPC Client Generator")
parser.add_argument("--tag", help="Git tag to use for OpenTDF platform")
parser.add_argument(
"--download", action="store_true", help="Force download of proto files"
)
args = parser.parse_args()
git_tag = args.tag

# Download proto files (optional - can use existing files)
if (
"--download" in sys.argv or not any(proto_files_dir.glob("**/*.proto"))
) and not download_proto_files(proto_gen_dir):
args.download or not any(proto_files_dir.glob("**/*.proto"))
) and not download_proto_files(proto_gen_dir, git_tag=git_tag):
return 1

# Check if we have any proto files
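The switch from `str.replace` to `re.sub` in `run_buf_generate` makes the plugin-path rewrite tolerant of whatever path is already recorded in `buf.gen.yaml` (a bare plugin name or a previously written absolute path). Using a lambda as the replacement also keeps backslashes in the resolved path from being misread as regex group references. A self-contained sketch of that rewrite (the sample YAML text and helper name are illustrative):

```python
import re


def point_buf_at_plugin(content: str, plugin_path: str) -> str:
    """Rewrite any local protoc-gen-connect-python entry to an absolute path."""
    return re.sub(
        r"- local:\s+\S*protoc-gen-connect[_-]python\S*",
        # A callable replacement is used verbatim, so backslashes or "\1"
        # sequences in plugin_path are never treated as backreferences.
        lambda _: f"- local: {plugin_path}",
        content,
    )


if __name__ == "__main__":
    yaml_text = "version: v2\nplugins:\n  - local: protoc-gen-connect-python\n"
    print(point_buf_at_plugin(yaml_text, "/opt/venv/bin/protoc-gen-connect_python"))
```

Because the pattern also matches an already-rewritten absolute path, running the generation script repeatedly leaves the file in the same state.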
26 changes: 26 additions & 0 deletions otdf-python-proto/scripts/setup_connect_rpc.py
@@ -0,0 +1,26 @@
#!/usr/bin/env python3
"""Setup script for Connect RPC dependencies.

Run this script once to install the required tools before generating proto files:

uv run python scripts/setup_connect_rpc.py
"""

import subprocess
import sys


def main():
"""Install Connect RPC compiler dependencies."""
print("Installing Connect RPC dependencies...")
subprocess.run(
["uv", "sync"],
check=True,
)
print("✓ Connect RPC dependencies installed.")
print(" Run 'uv run python scripts/generate_connect_proto.py' to generate files.")
return 0


if __name__ == "__main__":
sys.exit(main())
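The recursive `create_init_files` introduced in `generate_connect_proto.py` above replaces the old one-level loop, so deeply nested packages under `src/otdf_python_proto` become importable too. A runnable sketch of that behavior (the directory names are illustrative):

```python
import tempfile
from pathlib import Path


def create_init_files(generated_dir: Path) -> None:
    """Create __init__.py in the generated root and every nested directory."""
    # rglob is consumed into the list before touching files, so the
    # __init__.py files created below do not affect the iteration.
    for dirpath in [generated_dir, *generated_dir.rglob("*")]:
        if dirpath.is_dir():
            (dirpath / "__init__.py").touch()


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        root = Path(tmp) / "otdf_python_proto"
        (root / "policy" / "attributes").mkdir(parents=True)
        create_init_files(root)
        inits = sorted(
            p.relative_to(root).as_posix() for p in root.rglob("__init__.py")
        )
        print(inits)
        # ['__init__.py', 'policy/__init__.py', 'policy/attributes/__init__.py']
```

The previous implementation only touched the root and its immediate children, which left packages like `policy/attributes` without an `__init__.py`.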