102 changes: 102 additions & 0 deletions .github/skills/fix-nightly-warnings/SKILL.md
---
name: fix-nightly-warnings
description: 'Fix ITK nightly build errors or compilation warnings reported on CDash. Use when: addressing CDash nightly failures. Creates a branch, fixes warnings, and opens a PR upstream.'
argument-hint: What warnings should this skill fix?
---

# Fix ITK Nightly Build Errors and Warnings

Creates a focused branch containing fixes for errors or warnings reported on the ITK CDash nightly dashboard, then opens a PR upstream.

## When to Use

- CDash nightly build reports new errors, warnings, or Doxygen warnings
- User says "fix nightly errors", "address CDash warnings", or "there are new Doxygen warnings"

## Available Scripts

- **`scripts/list_nightly_warnings.py`** — Lists CDash builds that have warnings or errors. Defaults to `Nightly` builds from the last 24 hours.
- **`scripts/get_build_warnings.py`** — Fetches and summarizes warnings (or errors) for a specific CDash build ID, grouped by source file and warning flag.

Run `python3 scripts/<script>.py --help` for full usage.

## Procedure

### 1. Identify the Warnings

Use the provided scripts to fetch the current nightly builds and their warnings from CDash.

**Step 1a — List nightly builds with warnings:**

```bash
python3 scripts/list_nightly_warnings.py --type Nightly --limit 25 --json | jq '.[] | select(.warnings > 0)'
```

Note: `list_nightly_warnings.py` sorts builds by error count, then by warning count, in descending order.


**Step 1b — Inspect warnings for a specific build:**

```bash
python3 scripts/get_build_warnings.py --limit 200 --json BUILD_ID | jq 'group_by(.flag) | .[] | {flag: .[0].flag, count: length}'
```

---

For each build with errors or warnings, fetch the details and summarize them by type and source file.
IGNORE ALL errors and warnings originating from `Modules/ThirdParty/` paths.

If there are build errors, fix only those. Otherwise, prioritize fixing the most common warning flag that affects the most files.
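The prioritization above can be sketched in a few lines of Python over the `--json` output of `get_build_warnings.py` (the entries below are illustrative sample data, not real CDash output):

```python
import json
from collections import Counter

# Sample shaped like the get_build_warnings.py --json output (illustrative data).
report = json.loads("""
{
  "entries": [
    {"sourceFile": "Modules/Core/Common/include/itkMacro.h", "flag": "-Wthread-safety-negative"},
    {"sourceFile": "Modules/Core/Common/src/itkObject.cxx", "flag": "-Wthread-safety-negative"},
    {"sourceFile": "Modules/ThirdParty/VNL/src/vnl_matrix.cxx", "flag": "-Wdeprecated-declarations"},
    {"sourceFile": "Modules/Filtering/Path/include/itkPath.h", "flag": "-Wunused-variable"}
  ]
}
""")

# Drop third-party warnings, then count occurrences of each warning flag.
entries = [e for e in report["entries"]
           if not (e["sourceFile"] or "").startswith("Modules/ThirdParty/")]
flag_counts = Counter(e["flag"] for e in entries)

# The most common remaining flag is the one to fix first.
print(flag_counts.most_common(1))  # [('-Wthread-safety-negative', 2)]
```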


### 2. Analyze the Root Cause

For each warning or error type identified in step 1, determine the root cause before editing files:
- Look up the compiler flag (e.g. `-Wthread-safety-negative`) in the compiler documentation.
- Read the affected source files to understand how they are structured.
- Identify the minimal fix: a missing annotation, a suppression pragma, a corrected API usage, etc.
- Confirm that warnings from `Modules/ThirdParty/` are skipped entirely.
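The last bullet can be enforced programmatically; a minimal sketch over the `entries` list produced by `get_build_warnings.py --json` (the file paths here are illustrative):

```python
# Illustrative entries shaped like the get_build_warnings.py --json output.
entries = [
    {"sourceFile": "Modules/Core/Common/src/itkObject.cxx", "flag": "-Wunused-variable"},
    {"sourceFile": "Modules/ThirdParty/VNL/src/vnl_matrix.cxx", "flag": "-Wdeprecated-declarations"},
    {"sourceFile": None, "flag": "?"},
]

def is_actionable(entry):
    """Keep only warnings in first-party ITK sources; third-party code is off limits."""
    path = entry["sourceFile"] or ""
    return not path.startswith("Modules/ThirdParty/")

actionable = [e for e in entries if is_actionable(e)]
print(len(actionable))  # 2: the ThirdParty entry is dropped
```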

### 3. Create a New Branch

```bash
git fetch upstream
git checkout -b fix-<warning-type>-warnings upstream/main
```

Example: `fix-doxygen-group-warnings`
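A branch name can also be derived mechanically from the dominant warning flag; a minimal sketch (the flag value is illustrative):

```python
import re

def branch_name_for_flag(flag: str) -> str:
    """Turn a compiler warning flag into a fix-branch name."""
    # Strip the leading "-W" prefix and lowercase the remainder.
    stem = re.sub(r"^-W", "", flag).lower()
    return f"fix-{stem}-warnings"

print(branch_name_for_flag("-Wdeprecated-declarations"))
# fix-deprecated-declarations-warnings
```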

### 4. Fix the Source Files

Apply the necessary fixes to the affected files. Make the minimal changes needed to resolve the warnings; avoid touching unrelated documentation, code, or formatting.

### 5. Verify No New Warnings Introduced

Build and test to confirm that the fixes removed the targeted errors and warnings and did not introduce new ones.
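One lightweight way to compare rebuild logs, assuming diagnostics carry the usual `[-W...]` suffix that `get_build_warnings.py` also parses (the log snippets below are illustrative):

```python
import re

FLAG_RE = re.compile(r"\[(-W[^\]]+)\]")

def flags_in_log(log_text: str) -> set:
    """Return the set of warning flags mentioned in a build log."""
    return set(FLAG_RE.findall(log_text))

# Illustrative logs: one warning before the fix, a clean log after.
before = "itkObject.cxx:10: warning: unused variable 'x' [-Wunused-variable]\n"
after = ""

fixed = flags_in_log(before) - flags_in_log(after)
introduced = flags_in_log(after) - flags_in_log(before)
print(sorted(fixed), sorted(introduced))  # ['-Wunused-variable'] []
```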

### 6. Commit the Changes

Follow the ITK commit message standards. Include a clear description of the fix and the error or warning message being addressed.

### 7. Draft a Pull Request

Do the following:
- Draft a pull request description that summarizes the changes, lists the warnings or errors fixed, and references the CDash build if applicable.
- Ask the user to review and approve the description before submitting the PR.
- Push the branch to the user's remote.
- Create a DRAFT pull request against the current `upstream/main` branch.

## Quality Checks

Before declaring done:
- [ ] All targeted warnings are fixed
- [ ] No new warnings or errors introduced
- [ ] Changes are limited to the files affected by the warnings
- [ ] Commit message clearly describes the fix and references the CDash issue if applicable

## Key Files for Reference

| File | Purpose |
|------|---------|
| `Documentation/docs/contributing/index.md` | Contributing guidelines |
212 changes: 212 additions & 0 deletions .github/skills/fix-nightly-warnings/scripts/get_build_warnings.py
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.8"
# ///
"""
get_build_warnings.py — Fetch and summarize warnings or errors for a CDash build.

Queries the CDash GraphQL API for all warning (or error) entries associated with
a specific build ID, then groups them by source file and warning flag.

Exit codes:
  0  Success
  2  Argument, network, or API error
"""

from __future__ import annotations

import argparse
import json
import re
import sys
import urllib.error
import urllib.request

CDASH_GRAPHQL = "https://open.cdash.org/graphql"

QUERY_TEMPLATE = """
{
build(id: "%s") {
name stamp startTime buildWarningsCount buildErrorsCount
site { name }
buildErrors(filters: { eq: { type: %s } }, first: %d%s) {
pageInfo { hasNextPage endCursor }
edges {
node { sourceFile sourceLine stdError stdOutput }
}
}
}
}
"""

WARNING_FLAG_RE = re.compile(r'\[(-W[^\]]+)\]')


def graphql(query: str) -> dict:
payload = json.dumps({"query": query}).encode()
req = urllib.request.Request(
CDASH_GRAPHQL,
data=payload,
headers={"Content-Type": "application/json"},
)
try:
with urllib.request.urlopen(req, timeout=30) as r:
return json.loads(r.read())
except urllib.error.URLError as e:
print(f"Error: network request failed: {e}", file=sys.stderr)
sys.exit(2)


def extract_flag(text: str) -> str:
m = WARNING_FLAG_RE.search(text)
return m.group(1) if m else "?"


def fetch_entries(build_id: str, error_type: str, limit: int) -> tuple[dict, list]:
"""Fetch all entries up to limit, following pagination if needed."""
all_entries = []
cursor = None
build_meta = None

while True:
after_clause = f', after: "{cursor}"' if cursor else ""
query = QUERY_TEMPLATE % (build_id, error_type, min(limit, 200), after_clause)
data = graphql(query)

if "errors" in data:
print(f"Error: GraphQL returned errors: {data['errors']}", file=sys.stderr)
sys.exit(2)

build = data["data"]["build"]
if build_meta is None:
build_meta = {
"name": build["name"],
"stamp": build["stamp"],
"site": build["site"]["name"],
"buildWarningsCount": build["buildWarningsCount"],
"buildErrorsCount": build["buildErrorsCount"],
}

page = build["buildErrors"]
all_entries.extend(e["node"] for e in page["edges"])

if not page["pageInfo"]["hasNextPage"] or len(all_entries) >= limit:
break
cursor = page["pageInfo"]["endCursor"]

return build_meta, all_entries[:limit]


def main() -> None:
parser = argparse.ArgumentParser(
prog="get_build_warnings.py",
description="Fetch and summarize warnings or errors for a CDash build.",
epilog=(
"Examples:\n"
" python3 scripts/get_build_warnings.py 11107692\n"
" python3 scripts/get_build_warnings.py 11107692 --raw\n"
" python3 scripts/get_build_warnings.py 11107692 --errors\n"
" python3 scripts/get_build_warnings.py 11107692 --json | jq '.entries[] | select(.flag == \"-Wthread-safety-negative\")'\n"
" python3 scripts/get_build_warnings.py 11107692 --limit 500"
),
formatter_class=argparse.RawDescriptionHelpFormatter,
)
parser.add_argument(
"build_id",
metavar="BUILD_ID",
help="CDash build ID (integer from list_nightly_warnings.py output or CDash URL)",
)
parser.add_argument(
"--errors",
action="store_true",
help="Fetch build errors instead of warnings (default: warnings)",
)
parser.add_argument(
"--raw",
action="store_true",
help="Print one entry per line with file:line, flag, and message snippet instead of grouping",
)
parser.add_argument(
"--json",
action="store_true",
dest="json_output",
help="Output as JSON: {build: {...}, entries: [...]} for programmatic use",
)
parser.add_argument(
"--limit",
type=int,
default=200,
metavar="N",
help="Maximum number of entries to retrieve (default: 200)",
)
args = parser.parse_args()

error_type = "ERROR" if args.errors else "WARNING"
label = "error" if args.errors else "warning"

build_meta, entries = fetch_entries(args.build_id, error_type, args.limit)
total = build_meta["buildErrorsCount" if args.errors else "buildWarningsCount"] or 0

if args.json_output:
output = {
"build": build_meta,
"type": error_type,
"totalCount": total,
"fetchedCount": len(entries),
"entries": [
{
"sourceFile": n["sourceFile"],
"sourceLine": n["sourceLine"],
"flag": extract_flag(n["stdError"] or n["stdOutput"] or ""),
"stdError": n["stdError"],
"stdOutput": n["stdOutput"],
}
for n in entries
],
}
print(json.dumps(output, indent=2))
return

# Diagnostics to stderr so stdout stays parseable
print(f"Build: {build_meta['name']}", file=sys.stderr)
print(f"Site: {build_meta['site']}", file=sys.stderr)
print(f"Stamp: {build_meta['stamp']}", file=sys.stderr)
print(f"Total {label}s: {total} (fetched: {len(entries)})", file=sys.stderr)
if total > len(entries):
print(
f"Note: {total - len(entries)} entries not fetched; use --limit to increase.",
file=sys.stderr,
)
print(file=sys.stderr)

if args.raw:
print(f"{'FILE:LINE':<60} {'FLAG':<35} SNIPPET")
print("-" * 120)
for n in entries:
loc = f"{n['sourceFile']}:{n['sourceLine']}"
text = n["stdError"] or n["stdOutput"] or ""
flag = extract_flag(text)
# Extract a short message snippet after the flag
idx = text.find(flag)
if idx >= 0:
snippet = text[idx + len(flag):].strip().replace("\n", " ")[:80]
else:
snippet = text.replace("\n", " ")[:80]
print(f"{loc:<60} {flag:<35} {snippet}")
else:
# Group by (sourceFile, flag)
groups: dict = {}
for n in entries:
src = n["sourceFile"] or "<unknown>"
text = n["stdError"] or n["stdOutput"] or ""
flag = extract_flag(text)
key = (src, flag)
groups[key] = groups.get(key, 0) + 1

print(f"{'COUNT':>6} {'FLAG':<35} SOURCE FILE")
print("-" * 100)
for (src, flag), count in sorted(groups.items(), key=lambda x: (-x[1], x[0][0])):
print(f"{count:>6} {flag:<35} {src}")


if __name__ == "__main__":
main()