impl(v3): update dockerfiles used in CI #15903
Merged
Google Cloud Build / prepare-for-v3-0-0-m32-pr (cloud-cpp-testing-resources)
succeeded
Jan 22, 2026 in 10m 51s
Summary
Build Information
| Field | Value |
|---|---|
| Trigger | prepare-for-v3-0-0-m32-pr |
| Build | 7d9b1a18-8dd0-4670-a2ae-15fe451fab56 |
| Start | 2026-01-22T14:53:02-08:00 |
| Duration | 9m26.878s |
| Status | SUCCESS |
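The fields in the table can be queried directly with `gcloud`; the step-#4 trace later in this log uses the same `builds describe` command to read `create_time`. A sketch (requires an authenticated `gcloud`, so shown for reference rather than execution):

```shell
# Query status and creation time for this build, mirroring the
# command the cancel step itself runs later in this log.
gcloud builds describe --region us-east1 \
  --format 'value(status,create_time)' \
  7d9b1a18-8dd0-4670-a2ae-15fe451fab56
```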
Steps
| Step | Status | Duration |
|---|---|---|
| kaniko-build | SUCCESS | 2m29.702s |
| download-runner-image | SUCCESS | 41.681s |
| build.sh | SUCCESS | 6m1.123s |
| remove-image | SUCCESS | 2.958s |
| cancel-in-progress-builds-for-PR | SUCCESS | 50.852s |
Details
starting build "7d9b1a18-8dd0-4670-a2ae-15fe451fab56"
FETCHSOURCE
From https://github.com/googleapis/google-cloud-cpp
* branch 36ed552ca38ed6de3d08502b1b8b341fe3c98f6b -> FETCH_HEAD
Updating files: 100% (22002/22002), done.
HEAD is now at 36ed552c valgrind2
GitCommit:
36ed552ca38ed6de3d08502b1b8b341fe3c98f6b
BUILD
Starting Step #4 - "cancel-in-progress-builds-for-PR"
Starting Step #0 - "kaniko-build"
Step #0 - "kaniko-build": Pulling image: gcr.io/kaniko-project/executor:v1.24.0-debug
Step #4 - "cancel-in-progress-builds-for-PR": Pulling image: gcr.io/google.com/cloudsdktool/cloud-sdk
Step #4 - "cancel-in-progress-builds-for-PR": Using default tag: latest
Step #4 - "cancel-in-progress-builds-for-PR": latest: Pulling from google.com/cloudsdktool/cloud-sdk
Step #4 - "cancel-in-progress-builds-for-PR": c1be109a62df: Already exists
Step #0 - "kaniko-build": v1.24.0-debug: Pulling from kaniko-project/executor
Step #0 - "kaniko-build": Digest: sha256:2562c4fe551399514277ffff7dcca9a3b1628c4ea38cb017d7286dc6ea52f4cd
Step #0 - "kaniko-build": Status: Downloaded newer image for gcr.io/kaniko-project/executor:v1.24.0-debug
Step #0 - "kaniko-build": gcr.io/kaniko-project/executor:v1.24.0-debug
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Using dockerignore file: /workspace/ci/.dockerignore"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Retrieving image manifest fedora:40"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Retrieving image fedora:40 from registry index.docker.io"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Retrieving image manifest fedora:40"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Returning cached image manifest"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Built cross stage deps: map[]"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Retrieving image manifest fedora:40"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Returning cached image manifest"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Retrieving image manifest fedora:40"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Returning cached image manifest"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Executing 0 build triggers"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Building stage 'fedora:40' [idx: '0', base-idx: '-1']"
Step #0 - "kaniko-build": time="2026-01-22T22:53:18Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:afc0f5f5bbcaaaf7d1a92a51ce849027e05067aa3f7f585a771b9b92c066e42c..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y cmake curl diffutils findutils gcc-c++ git make ninja-build patch tar unzip wget which zip"
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:2bb3f636ffa4335715c5829fa45084e6538bb330a4a74ac18c90741560db4517..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y protobuf-compiler grpc-cpp"
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:59809c2bf009e5cab3bcd1d4d94153065a4d624f96b542c7c922dc0c78229678..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y c-ares-devel.i686 glibc-devel.i686 gmock-devel.i686 google-benchmark-devel.i686 grpc-devel.i686 gtest-devel.i686 libcurl-devel.i686 libstdc++-devel.i686 openssl-devel.i686 protobuf-devel.i686 re2-devel.i686 zlib-ng-compat-devel.i686"
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:6cf80c4d3157a33193f638728f84ee29f54ecf3bdb4e47cc7fec26c345e2a220..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y python3-devel"
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:75e7494dd10906caac9c0a2482c4f58bbe57eae856504bb2f8a7640b01bcc839..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Using caching version of cmd: RUN pip3 install --upgrade pip"
Step #0 - "kaniko-build": time="2026-01-22T22:53:19Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:3586f05e5e8a4bf2d44cfc2e3884532ccae60e8450cc85ea1cc70160e8ad28fc..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Using caching version of cmd: RUN pip3 install setuptools wheel"
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:58cf47c0eac17f1f89d5554a0aa72c5138bf387f966204f2cb15a1fb6611499a..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y java-latest-openjdk-devel"
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:41ad1604974c6781938026bb85a384796743ef13c740d786c5092bb958529e22..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Using caching version of cmd: RUN echo 'root:cloudcxx' | chpasswd"
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:ab14624c65d112dc5734be7526ea246228cf8a2acf4b53f5974ed32122715bbb..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Using caching version of cmd: RUN curl -fsSL https://distfiles.ariadne.space/pkgconf/pkgconf-2.2.0.tar.gz | tar -xzf - --strip-components=1 && ./configure --prefix=/usr --with-system-libdir=/lib64:/usr/lib64 --with-system-includedir=/usr/include && make -j ${NCPU:-4} && make install && ldconfig && cd /var/tmp && rm -fr build"
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:7e23d26ed6e59ac644818046e9f3f7c8d1e44cecedbf3383905ff014af2fcc9f..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Using caching version of cmd: RUN curl -fsSL https://github.com/nlohmann/json/archive/v3.11.3.tar.gz | tar -xzf - --strip-components=1 && cmake -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=yes -DBUILD_TESTING=OFF -DJSON_BuildTests=OFF -S . -B cmake-out && cmake --build cmake-out --target install -- -j ${NCPU:-4} && ldconfig"
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:f1823f8312906cdfd90a077211017a134badaabaab23c429729bd7b20755011e..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Using caching version of cmd: RUN curl -fsSL https://github.com/open-telemetry/opentelemetry-cpp/archive/v1.24.0.tar.gz | tar -xzf - --strip-components=1 && cmake -DCMAKE_CXX_STANDARD=17 -DCMAKE_CXX_COMPILER=g++ -DCMAKE_CXX_FLAGS=-m32 -DCMAKE_FIND_ROOT_PATH=/usr/ -DCMAKE_FIND_ROOT_PATH_MODE_PROGRAM=NEVER -DCMAKE_FIND_ROOT_PATH_MODE_LIBRARY=ONLY -DCMAKE_FIND_ROOT_PATH_MODE_INCLUDE=ONLY -DCMAKE_BUILD_TYPE=Release -DCMAKE_POSITION_INDEPENDENT_CODE=TRUE -DBUILD_SHARED_LIBS=ON -DWITH_EXAMPLES=OFF -DWITH_STL=CXX17 -DBUILD_TESTING=OFF -DOPENTELEMETRY_INSTALL=ON -DOPENTELEMETRY_ABI_VERSION_NO=2 -S . -B cmake-out && cmake --build cmake-out --target install -- -j ${NCPU:-4} && ldconfig"
Step #0 - "kaniko-build": time="2026-01-22T22:53:20Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:2660b9977453aee214daa36b856872d12ab7a2d60361c380bd52261cca8c27f5..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:21Z" level=info msg="Using caching version of cmd: RUN curl -fsSL https://github.com/mozilla/sccache/releases/download/v0.10.0/sccache-v0.10.0-x86_64-unknown-linux-musl.tar.gz | tar -zxf - --strip-components=1 && mkdir -p /usr/local/bin && mv sccache /usr/local/bin/sccache && chmod +x /usr/local/bin/sccache"
Step #0 - "kaniko-build": time="2026-01-22T22:53:21Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:79ee650d218f19110f90cb9de88fd5e4ee74f2a7a91181931a62110d0df85cef..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:21Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y python3.10"
Step #0 - "kaniko-build": time="2026-01-22T22:53:21Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:0c7e2486758324fb2e8f2d1e95549f0cd9fb29b9c6cffb1a93f5fd98392e1c0b..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:21Z" level=info msg="Using caching version of cmd: RUN /var/tmp/ci/install-cloud-sdk.sh"
Step #0 - "kaniko-build": time="2026-01-22T22:53:21Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:165d50d7b7d965b88cdb67065f72257581b2af3d1aa3a0965e36dc791b0fe235..."
Step #0 - "kaniko-build": time="2026-01-22T22:53:21Z" level=info msg="Using caching version of cmd: RUN ldconfig /usr/local/lib*"
Step #0 - "kaniko-build": time="2026-01-22T22:53:21Z" level=info msg="Unpacking rootfs as cmd COPY . /var/tmp/ci requires it."
Step #0 - "kaniko-build": time="2026-01-22T22:53:24Z" level=info msg="ARG NCPU=4"
Step #0 - "kaniko-build": time="2026-01-22T22:53:24Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-01-22T22:53:24Z" level=info msg="RUN dnf makecache && dnf install -y cmake curl diffutils findutils gcc-c++ git make ninja-build patch tar unzip wget which zip"
Step #0 - "kaniko-build": time="2026-01-22T22:53:24Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:53:44Z" level=info msg="RUN dnf makecache && dnf install -y protobuf-compiler grpc-cpp"
Step #0 - "kaniko-build": time="2026-01-22T22:53:44Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:53:47Z" level=info msg="RUN dnf makecache && dnf install -y c-ares-devel.i686 glibc-devel.i686 gmock-devel.i686 google-benchmark-devel.i686 grpc-devel.i686 gtest-devel.i686 libcurl-devel.i686 libstdc++-devel.i686 openssl-devel.i686 protobuf-devel.i686 re2-devel.i686 zlib-ng-compat-devel.i686"
Step #0 - "kaniko-build": time="2026-01-22T22:53:47Z" level=info msg="Found cached layer, extracting to filesystem"
Step #4 - "cancel-in-progress-builds-for-PR": Digest: sha256:cac11b42359417e8f63d72ade43f8c92cf607aab65bfae2fac27b02b31c1fd02
Step #4 - "cancel-in-progress-builds-for-PR": Status: Downloaded newer image for gcr.io/google.com/cloudsdktool/cloud-sdk:latest
Step #4 - "cancel-in-progress-builds-for-PR": gcr.io/google.com/cloudsdktool/cloud-sdk:latest
Step #0 - "kaniko-build": time="2026-01-22T22:53:54Z" level=info msg="RUN dnf makecache && dnf install -y python3-devel"
Step #0 - "kaniko-build": time="2026-01-22T22:53:54Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:53:58Z" level=info msg="RUN pip3 install --upgrade pip"
Step #0 - "kaniko-build": time="2026-01-22T22:53:58Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:53:59Z" level=info msg="RUN pip3 install setuptools wheel"
Step #0 - "kaniko-build": time="2026-01-22T22:53:59Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:54:00Z" level=info msg="RUN dnf makecache && dnf install -y java-latest-openjdk-devel"
Step #0 - "kaniko-build": time="2026-01-22T22:54:00Z" level=info msg="Found cached layer, extracting to filesystem"
Step #4 - "cancel-in-progress-builds-for-PR": + test -z 15903
Step #4 - "cancel-in-progress-builds-for-PR": ++ gcloud builds describe --region us-east1 --format 'value(create_time)' 7d9b1a18-8dd0-4670-a2ae-15fe451fab56
Step #4 - "cancel-in-progress-builds-for-PR": + ctime=2026-01-22T22:51:48.634083Z
Step #4 - "cancel-in-progress-builds-for-PR": + query=tags=pr
Step #4 - "cancel-in-progress-builds-for-PR": + query+=' AND tags=15903'
Step #4 - "cancel-in-progress-builds-for-PR": + query+=' AND substitutions.COMMIT_SHA != 36ed552ca38ed6de3d08502b1b8b341fe3c98f6b'
Step #4 - "cancel-in-progress-builds-for-PR": + query+=' AND create_time < 2026-01-22T22:51:48.634083Z'
Step #4 - "cancel-in-progress-builds-for-PR": + gcloud builds list --region us-east1 --ongoing '--format=value(id)' --filter 'tags=pr AND tags=15903 AND substitutions.COMMIT_SHA != 36ed552ca38ed6de3d08502b1b8b341fe3c98f6b AND create_time < 2026-01-22T22:51:48.634083Z'
Step #4 - "cancel-in-progress-builds-for-PR": + xargs -r -t gcloud builds cancel --region us-east1
Step #4 - "cancel-in-progress-builds-for-PR": WARNING: The following filter keys were not present in any resource : create_time, substitutions.COMMIT_SHA, tags
Finished Step #4 - "cancel-in-progress-builds-for-PR"
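The `set -x` trace from step #4 shows the script assembling a `gcloud builds list` filter string piece by piece, then piping any matching build IDs to `gcloud builds cancel`. A minimal POSIX-shell sketch of that string assembly, using the PR number, commit SHA, and create time from this run:

```shell
# Rebuild the --filter expression exactly as the step's trace shows.
pr=15903
sha=36ed552ca38ed6de3d08502b1b8b341fe3c98f6b
ctime=2026-01-22T22:51:48.634083Z

query="tags=pr"
query="${query} AND tags=${pr}"
query="${query} AND substitutions.COMMIT_SHA != ${sha}"
query="${query} AND create_time < ${ctime}"
echo "${query}"

# The step then feeds the filter to gcloud (requires gcloud; shown
# as a comment for context only):
#   gcloud builds list --region us-east1 --ongoing --format='value(id)' \
#     --filter "${query}" | xargs -r -t gcloud builds cancel --region us-east1
```

The `substitutions.COMMIT_SHA !=` clause is what keeps the step from cancelling builds of the current commit; the warning about absent filter keys simply means no ongoing builds matched.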
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="RUN echo 'root:cloudcxx' | chpasswd"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="WORKDIR /var/tmp/build/pkgconf"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Changed working directory to /var/tmp/build/pkgconf"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Creating directory /var/tmp/build/pkgconf with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="RUN curl -fsSL https://distfiles.ariadne.space/pkgconf/pkgconf-2.2.0.tar.gz | tar -xzf - --strip-components=1 && ./configure --prefix=/usr --with-system-libdir=/lib64:/usr/lib64 --with-system-includedir=/usr/include && make -j ${NCPU:-4} && make install && ldconfig && cd /var/tmp && rm -fr build"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="ENV PKG_CONFIG_PATH=/usr/local/share/pkgconfig:/usr/lib/pkgconfig"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="WORKDIR /var/tmp/build/json"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Changed working directory to /var/tmp/build/json"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Creating directory /var/tmp/build/json with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="RUN curl -fsSL https://github.com/nlohmann/json/archive/v3.11.3.tar.gz | tar -xzf - --strip-components=1 && cmake -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=yes -DBUILD_TESTING=OFF -DJSON_BuildTests=OFF -S . -B cmake-out && cmake --build cmake-out --target install -- -j ${NCPU:-4} && ldconfig"
Step #0 - "kaniko-build": time="2026-01-22T22:54:32Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:54:33Z" level=info msg="WORKDIR /var/tmp/build/opentelemetry"
Step #0 - "kaniko-build": time="2026-01-22T22:54:33Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-01-22T22:54:33Z" level=info msg="Changed working directory to /var/tmp/build/opentelemetry"
Step #0 - "kaniko-build": time="2026-01-22T22:54:33Z" level=info msg="Creating directory /var/tmp/build/opentelemetry with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-01-22T22:54:33Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-01-22T22:54:33Z" level=info msg="RUN curl -fsSL https://github.com/open-telemetry/opentelemetry-cpp/archive/v1.24.0.tar.gz | tar -xzf - --strip-components=1 && cmake -DCMAKE_CXX_STANDARD=17 -DCMAKE_CXX_COMPILER=g++ -DCMAKE_CXX_FLAGS=-m32 -DCMAKE_FIND_ROOT_PATH=/usr/ -DCMAKE_FIND_ROOT_PATH_MODE_PROGRAM=NEVER -DCMAKE_FIND_ROOT_PATH_MODE_LIBRARY=ONLY -DCMAKE_FIND_ROOT_PATH_MODE_INCLUDE=ONLY -DCMAKE_BUILD_TYPE=Release -DCMAKE_POSITION_INDEPENDENT_CODE=TRUE -DBUILD_SHARED_LIBS=ON -DWITH_EXAMPLES=OFF -DWITH_STL=CXX17 -DBUILD_TESTING=OFF -DOPENTELEMETRY_INSTALL=ON -DOPENTELEMETRY_ABI_VERSION_NO=2 -S . -B cmake-out && cmake --build cmake-out --target install -- -j ${NCPU:-4} && ldconfig"
Step #0 - "kaniko-build": time="2026-01-22T22:54:33Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:54:34Z" level=info msg="WORKDIR /var/tmp/sccache"
Step #0 - "kaniko-build": time="2026-01-22T22:54:34Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-01-22T22:54:34Z" level=info msg="Changed working directory to /var/tmp/sccache"
Step #0 - "kaniko-build": time="2026-01-22T22:54:34Z" level=info msg="Creating directory /var/tmp/sccache with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-01-22T22:54:34Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-01-22T22:54:34Z" level=info msg="RUN curl -fsSL https://github.com/mozilla/sccache/releases/download/v0.10.0/sccache-v0.10.0-x86_64-unknown-linux-musl.tar.gz | tar -zxf - --strip-components=1 && mkdir -p /usr/local/bin && mv sccache /usr/local/bin/sccache && chmod +x /usr/local/bin/sccache"
Step #0 - "kaniko-build": time="2026-01-22T22:54:34Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:54:36Z" level=info msg="COPY . /var/tmp/ci"
Step #0 - "kaniko-build": time="2026-01-22T22:54:36Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-01-22T22:54:36Z" level=info msg="WORKDIR /var/tmp/downloads"
Step #0 - "kaniko-build": time="2026-01-22T22:54:36Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-01-22T22:54:36Z" level=info msg="Changed working directory to /var/tmp/downloads"
Step #0 - "kaniko-build": time="2026-01-22T22:54:36Z" level=info msg="Creating directory /var/tmp/downloads with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-01-22T22:54:36Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-01-22T22:54:36Z" level=info msg="RUN dnf makecache && dnf install -y python3.10"
Step #0 - "kaniko-build": time="2026-01-22T22:54:36Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:54:39Z" level=info msg="ENV CLOUDSDK_PYTHON=python3.10"
Step #0 - "kaniko-build": time="2026-01-22T22:54:39Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-01-22T22:54:39Z" level=info msg="RUN /var/tmp/ci/install-cloud-sdk.sh"
Step #0 - "kaniko-build": time="2026-01-22T22:54:39Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:55:38Z" level=info msg="ENV CLOUD_SDK_LOCATION=/usr/local/google-cloud-sdk"
Step #0 - "kaniko-build": time="2026-01-22T22:55:38Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-01-22T22:55:38Z" level=info msg="ENV PATH=${CLOUD_SDK_LOCATION}/bin:${PATH}"
Step #0 - "kaniko-build": time="2026-01-22T22:55:38Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-01-22T22:55:38Z" level=info msg="RUN ldconfig /usr/local/lib*"
Step #0 - "kaniko-build": time="2026-01-22T22:55:38Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-01-22T22:55:38Z" level=info msg="Pushing image to us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32:7d9b1a18-8dd0-4670-a2ae-15fe451fab56"
Step #0 - "kaniko-build": time="2026-01-22T22:55:38Z" level=info msg="Pushed us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32@sha256:088f5e15093c312e531700460828d5ef0c65182f6148e3c0c2a3f2f350381f4c"
Finished Step #0 - "kaniko-build"
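The kaniko step above resolves each `RUN` against a layer cache in Artifact Registry before pushing the final image. A sketch of the corresponding executor invocation, assembled from the repos visible in this log — the `--dockerfile` path is a hypothetical placeholder, and `--cache`/`--cache-repo`/`--destination` are standard kaniko flags (not executable here, since it requires the kaniko image and registry credentials):

```shell
# Sketch of the "kaniko-build" step's executor invocation.
# The Dockerfile path is a placeholder; the cache and destination
# repos match the cache checks and the push at the end of the step.
/kaniko/executor \
  --context=dir:///workspace/ci \
  --dockerfile=Dockerfile \
  --destination=us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32:"${BUILD_ID}" \
  --cache=true \
  --cache-repo=us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache
```

With `--cache=true`, each `RUN` whose cache key is found in the cache repo is extracted instead of re-executed, which is why every `dnf install` and source build in this log reports "Found cached layer, extracting to filesystem".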
Starting Step #1 - "download-runner-image"
Step #1 - "download-runner-image": Pulling image: us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32:7d9b1a18-8dd0-4670-a2ae-15fe451fab56
Step #1 - "download-runner-image": 7d9b1a18-8dd0-4670-a2ae-15fe451fab56: Pulling from cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32
...
[Logs truncated due to log size limitations. For full logs, see https://storage.cloud.google.com/cloud-cpp-community-publiclogs/logs/google-cloud-cpp/15903/36ed552ca38ed6de3d08502b1b8b341fe3c98f6b/fedora-m32-m32-__default__/log-7d9b1a18-8dd0-4670-a2ae-15fe451fab56.txt.]
...
ey: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c2"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data2"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.892785499Z [DEBUG] <4125971328> Read(59)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.892819919Z [DEBUG] <4125971328> Read(59)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-2"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data3"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.892838529Z [DEBUG] <4125971328> Read(59)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.892862949Z [DEBUG] <4125971328> Read(59)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-01-22T23:02:15.893290189Z [DEBUG] <4125971328> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": rows_limit: 2
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.894363899Z [DEBUG] <4125971328> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-01-22T23:02:15.894379679Z [DEBUG] <4125971328> Read(60)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.894959628Z [DEBUG] <4125971328> Read(60)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data1"
Step #2 - "build.sh": }
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c2"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data2"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.895000958Z [DEBUG] <4125971328> Read(60)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.895031798Z [DEBUG] <4125971328> Read(60)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-2"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data3"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.895050388Z [DEBUG] <4125971328> Read(60)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.895073408Z [DEBUG] <4125971328> Read(60)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-01-22T23:02:15.895581728Z [DEBUG] <4125971328> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": start_key_closed: "row-key-1"
Step #2 - "build.sh": end_key_closed: "row-key-2"
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.896883968Z [DEBUG] <4125971328> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-01-22T23:02:15.896898178Z [DEBUG] <4125971328> Read(61)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.897258268Z [DEBUG] <4125971328> Read(61)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data1"
Step #2 - "build.sh": }
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c2"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data2"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.897298598Z [DEBUG] <4125971328> Read(61)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.897336578Z [DEBUG] <4125971328> Read(61)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-2"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data3"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.897356038Z [DEBUG] <4125971328> Read(61)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.897412838Z [DEBUG] <4125971328> Read(61)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-01-22T23:02:15.902692987Z [INFO] <4125971328> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/internal/bigtable_stub_factory.cc:104)
Step #2 - "build.sh": 2026-01-22T23:02:15.903467606Z [INFO] <4125971328> Enabled logging for gRPC calls (/workspace/google/cloud/monitoring/v3/internal/metric_stub_factory.cc:57)
Step #2 - "build.sh": 2026-01-22T23:02:15.903653067Z [INFO] <4077906752> Cloud Monitoring Export skipped. No data. (/workspace/google/cloud/opentelemetry/internal/monitoring_exporter.cc:132)
Step #2 - "build.sh": 2026-01-22T23:02:15.903857536Z [DEBUG] <4125971328> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.905203826Z [DEBUG] <4125971328> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-01-22T23:02:15.905220036Z [DEBUG] <4125971328> Read(62)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.905923216Z [DEBUG] <4125971328> Read(62)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data1"
Step #2 - "build.sh": }
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c2"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data2"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.905968826Z [DEBUG] <4125971328> Read(62)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.906004906Z [DEBUG] <4125971328> Read(62)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-2"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data3"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.906020416Z [DEBUG] <4125971328> Read(62)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.906040936Z [DEBUG] <4125971328> Read(62)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-3"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data4"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.906053966Z [DEBUG] <4125971328> Read(62)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.906088536Z [DEBUG] <4125971328> Read(62)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-01-22T23:02:15.906772726Z [INFO] <4125971328> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/admin/internal/bigtable_table_admin_stub_factory.cc:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.906843366Z [DEBUG] <4125971328> DropRowRange() << google.bigtable.admin.v2.DropRowRangeRequest {
Step #2 - "build.sh": name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": delete_all_data_from_table: true
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.908573656Z [DEBUG] <4125971328> DropRowRange() >> status=OK (/workspace/google/cloud/internal/log_wrapper.cc:29)
Step #2 - "build.sh": 2026-01-22T23:02:15.909527315Z [DEBUG] <4125971328> MutateRows() << google.bigtable.v2.MutateRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": entries {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": mutations {
Step #2 - "build.sh": set_cell {
Step #2 - "build.sh": family_name: "family4"
Step #2 - "build.sh": column_qualifier: "c1"
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "v1000"
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.910966065Z [DEBUG] <4125971328> MutateRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-01-22T23:02:15.910976855Z [DEBUG] <4125971328> Read(63)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.911323635Z [DEBUG] <4125971328> Read(63)() >> google.bigtable.v2.MutateRowsResponse {
Step #2 - "build.sh": entries {
Step #2 - "build.sh": status {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.911338045Z [DEBUG] <4125971328> Read(63)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.911386615Z [DEBUG] <4125971328> Read(63)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-01-22T23:02:15.911889805Z [DEBUG] <4125971328> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_keys: "row-key-2"
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": rows_limit: 1
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.912903275Z [DEBUG] <4125971328> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-01-22T23:02:15.912914485Z [DEBUG] <4125971328> Read(64)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.913203995Z [DEBUG] <4125971328> Read(64)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-01-22T23:02:15.917571154Z [INFO] <4125971328> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/internal/bigtable_stub_factory.cc:104)
Step #2 - "build.sh": 2026-01-22T23:02:15.918021404Z [INFO] <4125971328> Enabled logging for gRPC calls (/workspace/google/cloud/monitoring/v3/internal/metric_stub_factory.cc:57)
Step #2 - "build.sh": 2026-01-22T23:02:15.918146614Z [INFO] <4077906752> Cloud Monitoring Export skipped. No data. (/workspace/google/cloud/opentelemetry/internal/monitoring_exporter.cc:132)
Step #2 - "build.sh": 2026-01-22T23:02:15.918306844Z [DEBUG] <4125971328> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.919305444Z [DEBUG] <4125971328> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-01-22T23:02:15.919319004Z [DEBUG] <4125971328> Read(65)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.919743003Z [DEBUG] <4125971328> Read(65)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "v1000"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-01-22T23:02:15.919777983Z [DEBUG] <4125971328> Read(65)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.919814554Z [DEBUG] <4125971328> Read(65)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-01-22T23:02:15.920497363Z [INFO] <4125971328> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/admin/internal/bigtable_table_admin_stub_factory.cc:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.920570003Z [DEBUG] <4125971328> DropRowRange() << google.bigtable.admin.v2.DropRowRangeRequest {
Step #2 - "build.sh": name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": delete_all_data_from_table: true
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.922035433Z [DEBUG] <4125971328> DropRowRange() >> status=OK (/workspace/google/cloud/internal/log_wrapper.cc:29)
Step #2 - "build.sh": 2026-01-22T23:02:15.926389402Z [INFO] <4125971328> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/internal/bigtable_stub_factory.cc:104)
Step #2 - "build.sh": 2026-01-22T23:02:15.926815212Z [INFO] <4125971328> Enabled logging for gRPC calls (/workspace/google/cloud/monitoring/v3/internal/metric_stub_factory.cc:57)
Step #2 - "build.sh": 2026-01-22T23:02:15.926991611Z [INFO] <4077906752> Cloud Monitoring Export skipped. No data. (/workspace/google/cloud/opentelemetry/internal/monitoring_exporter.cc:132)
Step #2 - "build.sh": 2026-01-22T23:02:15.927127742Z [DEBUG] <4125971328> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.928196041Z [DEBUG] <4125971328> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-01-22T23:02:15.928212712Z [DEBUG] <4125971328> Read(66)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.928539031Z [DEBUG] <4125971328> Read(66)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-01-22T23:02:15.929125741Z [INFO] <4125971328> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/admin/internal/bigtable_table_admin_stub_factory.cc:59)
Step #2 - "build.sh": 2026-01-22T23:02:15.929204831Z [DEBUG] <4125971328> DropRowRange() << google.bigtable.admin.v2.DropRowRangeRequest {
Step #2 - "build.sh": name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-01-22-0asgygwsqss70h12bskc7vpwe2cfutghq7"
Step #2 - "build.sh": delete_all_data_from_table: true
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-01-22T23:02:15.929983041Z [DEBUG] <4125971328> DropRowRange() >> status=UNAVAILABLE: failed to connect to all addresses; last error: UNKNOWN: Invalid argument (/workspace/google/cloud/internal/log_wrapper.cc:29)
Step #2 - "build.sh": [ FAILED ] DataIntegrationTest.TableReadModifyWriteRowMultipleTest (16 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.SingleColumnQuery
Step #2 - "build.sh": /workspace/google/cloud/bigtable/tests/data_integration_test.cc:813: Skipped
Step #2 - "build.sh":
Step #2 - "build.sh":
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.SingleColumnQuery (11 ms)
Step #2 - "build.sh": [----------] 25 tests from DataIntegrationTest (2192 ms total)
Step #2 - "build.sh":
Step #2 - "build.sh": [----------] Global test environment tear-down
Step #2 - "build.sh": [==========] 25 tests from 1 test suite ran. (2202 ms total)
Step #2 - "build.sh": [ PASSED ] 17 tests.
Step #2 - "build.sh": [ SKIPPED ] 7 tests, listed below:
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.TableReadRowsReverseScan
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.ClientQueryColumnFamily
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.ClientQueryColumnFamilyWithHistory
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.QueryWithNulls
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.SingleColumnQueryWithHistory
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.MultiColumnQuery
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.SingleColumnQuery
Step #2 - "build.sh": [ FAILED ] 1 test, listed below:
Step #2 - "build.sh": [ FAILED ] DataIntegrationTest.TableReadModifyWriteRowMultipleTest
Step #2 - "build.sh":
Step #2 - "build.sh": 1 FAILED TEST
Step #2 - "build.sh":
Step #2 - "build.sh": Start 688: bigtable_data_integration_test
Step #2 - "build.sh": 10/13 Test #689: bigtable_filters_integration_test ................ Passed 2.73 sec
Step #2 - "build.sh": Test #688: bigtable_data_integration_test ................... Passed 1.73 sec
Step #2 - "build.sh": 12/13 Test #683: bigtable_scan_throughput_benchmark ............... Passed 7.45 sec
Step #2 - "build.sh": 13/13 Test #682: bigtable_scan_async_throughput_benchmark ......... Passed 8.76 sec
Step #2 - "build.sh":
Step #2 - "build.sh": 100% tests passed, 0 tests failed out of 13
Step #2 - "build.sh":
Step #2 - "build.sh": Label Time Summary:
Step #2 - "build.sh": integration-test = 26.70 sec*proc (13 tests)
Step #2 - "build.sh": integration-test-emulator = 26.70 sec*proc (13 tests)
Step #2 - "build.sh":
Step #2 - "build.sh": Total Test time (real) = 8.79 sec
Step #2 - "build.sh": 2026-01-22T23:02:22Z (+357s): Killing Bigtable Emulators...
Step #2 - "build.sh": EMULATOR_PID=238640 .+.
Step #2 - "build.sh": INSTANCE_ADMIN_EMULATOR_PID=238641 .+.
Step #2 - "build.sh": ================ emulator.log ================
Step #2 - "build.sh": cat: /h/emulator.log: No such file or directory
Step #2 - "build.sh": 1 Cloud Bigtable emulator running on 127.0.0.1:8480
Step #2 - "build.sh": ================ emulator.log ================
Step #2 - "build.sh": cat: /h/emulator.log: No such file or directory
Step #2 - "build.sh": 1 Cloud Bigtable emulator running on 127.0.0.1:8480
Step #2 - "build.sh": ================ emulator.log ================
Step #2 - "build.sh": ================ instance-admin-emulator.log ================
Step #2 - "build.sh": cat: /h/instance-admin-emulator.log: No such file or directory
Step #2 - "build.sh": 1 env -C /workspace /workspace/cmake-out/google/cloud/bigtable/tests/instance_admin_emulator 8490
Step #2 - "build.sh": 2 Cloud Bigtable emulator running on localhost:8490
Step #2 - "build.sh": 3 ListInstances() request=parent: "projects/emulated"
Step #2 - "build.sh": 4
Step #2 - "build.sh": 5 CreateInstance() request=parent: "projects/cloud-cpp-testing-resources"
Step #2 - "build.sh": 6 instance_id: "it-2026-01-22-fnhkpywnmmlw"
Step #2 - "build.sh": 7 instance {
Step #2 - "build.sh": 8 display_name: "IT it-2026-01-22-fnhkpywnmmlw"
Step #2 - "build.sh": 9 type: PRODUCTION
Step #2 - "build.sh": 10 }
Step #2 - "build.sh": ================ instance-admin-emulator.log ================
Step #2 - "build.sh": cat: /h/instance-admin-emulator.log: No such file or directory
Step #2 - "build.sh": 231 GetCluster() request=name: "projects/cloud-cpp-testing-resources/instances/it-2026-01-22-zdhflhc68mqk/clusters/it-2026-01-22-zdhflhc68mqk-cl2"
Step #2 - "build.sh": 232
Step #2 - "build.sh": 233 UpdateCluster() request=name: "projects/cloud-cpp-testing-resources/instances/it-2026-01-22-zdhflhc68mqk/clusters/it-2026-01-22-zdhflhc68mqk-cl2"
Step #2 - "build.sh": 234 location: "projects/cloud-cpp-testing-resources/locations/us-central1-c"
Step #2 - "build.sh": 235 serve_nodes: 4
Step #2 - "build.sh": 236 default_storage_type: HDD
Step #2 - "build.sh": 237
Step #2 - "build.sh": 238 GetCluster() request=name: "projects/cloud-cpp-testing-resources/instances/it-2026-01-22-zdhflhc68mqk/clusters/it-2026-01-22-zdhflhc68mqk-cl2"
Step #2 - "build.sh": 239
Step #2 - "build.sh": 240 DeleteCluster() request=name: "projects/cloud-cpp-testing-resources/instances/it-2026-01-22-zdhflhc68mqk/clusters/it-2================ instance-admin-emulator.log ================
Step #2 - "build.sh":
Step #2 - "build.sh": 2026-01-22T23:02:22Z (+357s)
Step #2 - "build.sh": ------------------------------------------------------
Step #2 - "build.sh": | Running REST integration tests (with emulator) |
Step #2 - "build.sh": ------------------------------------------------------
Step #2 - "build.sh": 2026-01-22T23:02:22Z (+357s): Launching Cloud Storage emulator on port 0
Step #2 - "build.sh": Successfully connected to emulator [239433]
Step #2 - "build.sh": 2026-01-22T23:02:23Z (+358s): Successfully connected to gRPC server at port 41055
Step #2 - "build.sh": Test project /workspace/cmake-out
Step #2 - "build.sh": Start 173: common_internal_curl_rest_client_integration_test
Step #2 - "build.sh": 1/1 Test #173: common_internal_curl_rest_client_integration_test ... Passed 0.45 sec
Step #2 - "build.sh":
Step #2 - "build.sh": 100% tests passed, 0 tests failed out of 1
Step #2 - "build.sh":
Step #2 - "build.sh": Label Time Summary:
Step #2 - "build.sh": integration-test = 0.45 sec*proc (1 test)
Step #2 - "build.sh": integration-test-emulator = 0.45 sec*proc (1 test)
Step #2 - "build.sh":
Step #2 - "build.sh": Total Test time (real) = 0.48 sec
Step #2 - "build.sh": Killing emulator server [239433] ... done.
Step #2 - "build.sh": 2026-01-22T23:02:25Z (+360s): ===> sccache stats
Step #2 - "build.sh": Compile requests 5787
Step #2 - "build.sh": Compile requests executed 5787
Step #2 - "build.sh": Cache hits 5787
Step #2 - "build.sh": Cache hits (C/C++) 5787
Step #2 - "build.sh": Cache misses 0
Step #2 - "build.sh": Cache hits rate 100.00 %
Step #2 - "build.sh": Cache hits rate (C/C++) 100.00 %
Step #2 - "build.sh": Cache timeouts 0
Step #2 - "build.sh": Cache read errors 0
Step #2 - "build.sh": Forced recaches 0
Step #2 - "build.sh": Cache write errors 0
Step #2 - "build.sh": Cache errors 0
Step #2 - "build.sh": Compilations 0
Step #2 - "build.sh": Compilation failures 0
Step #2 - "build.sh": Non-cacheable compilations 0
Step #2 - "build.sh": Non-cacheable calls 0
Step #2 - "build.sh": Non-compilation calls 0
Step #2 - "build.sh": Unsupported compiler calls 0
Step #2 - "build.sh": Average cache write 0.000 s
Step #2 - "build.sh": Average compiler 0.000 s
Step #2 - "build.sh": Average cache read hit 0.067 s
Step #2 - "build.sh": Failed distributed compilations 0
Step #2 - "build.sh": Cache location gcs, name: cloud-cpp-testing-resources_cloudbuild, prefix: /sccache/fedora-m32-m32/
Step #2 - "build.sh": Version (client) 0.10.0
Step #2 - "build.sh": ==> 🕑 m32 completed in 359.467 seconds
Finished Step #2 - "build.sh"
Starting Step #3 - "remove-image"
Step #3 - "remove-image": Already have image (with digest): gcr.io/google.com/cloudsdktool/cloud-sdk
Step #3 - "remove-image": Digests:
Step #3 - "remove-image": - us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32@sha256:088f5e15093c312e531700460828d5ef0c65182f6148e3c0c2a3f2f350381f4c
Step #3 - "remove-image":
Step #3 - "remove-image": Tags:
Step #3 - "remove-image": - us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32:7d9b1a18-8dd0-4670-a2ae-15fe451fab56
Step #3 - "remove-image": Delete request issued.
Step #3 - "remove-image": Waiting for operation [projects/cloud-cpp-testing-resources/locations/us-east1/operations/b4888263-b21f-492d-b799-9bc16c4a4487] to complete...
Step #3 - "remove-image": .....done.
Finished Step #3 - "remove-image"
PUSH
DONE