Trying out `weaver live-check` to validate gen-ai instrumentations in Python in https://github.com/open-telemetry/opentelemetry-python-contrib/compare/main...lmolkova:opentelemetry-python-contrib:add-weaver-live-check-for-gen-ai?expand=1 (mostly https://github.com/lmolkova/opentelemetry-python-contrib/blob/5a53eb3833a3560dab4c5b41ebe3d9e7f849e101/util/opentelemetry-util-genai/tests/weaver_container.py).
It would be much easier (no volume mapping, no permissions or IO handling, no reading container logs) if the results or errors were returned directly in the HTTP response when /stop is invoked.
I.e. /stop would return:
- the full JSON report when validation succeeds
- a list of errors (in JSON) if validation has failed (e.g. because of jq or rego errors)
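For illustration, a minimal client-side sketch of how a test harness could consume such a /stop response. The response contract here (200 with the full report vs. an error status with a JSON error list) is the proposal above, not current weaver behavior, and the function name is hypothetical:

```python
import json


def handle_stop_response(status_code: int, body: str) -> dict:
    """Parse a hypothetical /stop HTTP response.

    Assumed contract (the proposal, not the current weaver API):
    - 200: body is the full live-check JSON report
    - non-200: body is a JSON list of validation/tooling errors
    """
    payload = json.loads(body)
    if status_code == 200:
        return {"ok": True, "report": payload}
    return {"ok": False, "errors": payload}


# With this contract the test only needs an HTTP call -- no volume
# mapping, file permissions, or container-log scraping.
result = handle_stop_response(200, '{"statistics": {"total_advisories": 0}}')
failure = handle_stop_response(422, '[{"error": "rego parse error"}]')
```

A real test would obtain `status_code` and `body` from an HTTP client (e.g. `requests.post(f"{base_url}/stop")`) and fail the test run when `ok` is false.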
Also got this feedback at KubeCon from @pellared