Merged
2 changes: 1 addition & 1 deletion people-and-planet-ai/conftest.py
@@ -84,7 +84,7 @@ def bucket_name(test_name: str, location: str, unique_id: str) -> Iterable[str]:
    # Try to remove all files before deleting the bucket.
    # Deleting a bucket with too many files results in an error.
    try:
-       run_cmd("gsutil", "-m", "rm", "-rf", f"gs://{bucket_name}/*")
+       run_cmd("gcloud", "storage", "rm", "--recursive", f"gs://{bucket_name}/**")
Contributor

medium

For clarity and to follow documented best practices, you can remove the `/**` wildcard. The `gcloud storage rm --recursive` command, when applied to a bucket URL, is sufficient to delete all objects within it.

The official documentation and migration guides recommend `gcloud storage rm gs://my-bucket --recursive` as the idiomatic way to empty a bucket. This makes the command cleaner.

Suggested change
-       run_cmd("gcloud", "storage", "rm", "--recursive", f"gs://{bucket_name}/**")
+       run_cmd("gcloud", "storage", "rm", "--recursive", f"gs://{bucket_name}")

    except RuntimeError:
        # If no files were found and it fails, ignore the error.
        pass
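The delete-then-ignore pattern in this hunk can be sketched standalone. `run_cmd` below is a hypothetical stand-in for the repo's actual helper in conftest.py, assumed to raise `RuntimeError` when the command exits with a nonzero status:

```python
import subprocess


def run_cmd(*cmd: str) -> None:
    # Hypothetical stand-in for the conftest.py helper: run a command and
    # raise RuntimeError if it exits with a nonzero status.
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"{' '.join(cmd)} failed: {result.stderr.strip()}")


def empty_bucket(bucket_name: str) -> None:
    # Try to remove all objects before deleting the bucket; deleting a
    # bucket that still contains objects results in an error.
    try:
        run_cmd("gcloud", "storage", "rm", "--recursive", f"gs://{bucket_name}/**")
    except RuntimeError:
        # If no objects matched and the command fails, ignore the error.
        pass
```

The `try`/`except` swallows the failure `gcloud storage rm` reports when the wildcard matches nothing, which is why the code in the diff tolerates an already-empty bucket.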