diff --git a/serverless-fleets/README.md b/serverless-fleets/README.md
index 7f3fbc68..72dcf305 100644
--- a/serverless-fleets/README.md
+++ b/serverless-fleets/README.md
@@ -46,7 +46,7 @@ A fleet consists of a collection of worker nodes that automatically scale up or
Like applications, jobs, and functions, fleets run within a Code Engine project. A project is a grouping of Code Engine resources within a specific IBM Cloud region. Projects are used to organize resources and manage access to entities such as configmaps, secrets, and persistent data stores.
-## Architecture
+## Architecture
The architecture used in this tutorial looks as follows.
@@ -75,7 +75,7 @@ The tutorial has been tested on a MacOS and Ubuntu24 client machine with the fol
Clone this repository
```
-git clone https://github.com/IBM/CodeEngine.git
+git clone https://github.com/IBM/CodeEngine.git
```
Switch to the `serverless-fleets` directory, which will be the root directory for all steps of this tutorial
@@ -298,7 +298,7 @@ Run a serverless fleet to process 100 tasks where each tasks gets 1 CPU and 2 GB
-In the fleet details you will see 5 workers being provisined. The number of workers is determined by the profile, cpu/memory and number of parallel tasks.
+In the fleet details you will see 5 workers being provisioned. The number of workers is determined by the profile, the cpu/memory per task, and the number of parallel tasks.
```
ibmcloud ce fleet get --id
@@ -416,7 +416,7 @@ The 6 tasks are submitted using the `tasks-from-local-file` option using the [wo

-The example mounts the [Persistant Data Stores](https://cloud.ibm.com/docs/codeengine?topic=codeengine-persistent-data-store) (PDS) to the container using the `--mount-data-store MOUNT_DIRECTORY=STORAGE_NAME:[SUBPATH]`, where
+The example mounts a [Persistent Data Store](https://cloud.ibm.com/docs/codeengine?topic=codeengine-persistent-data-store) (PDS) into the container using `--mount-data-store MOUNT_DIRECTORY=STORAGE_NAME:[SUBPATH]`, where
- `MOUNT_DIRECTORY` - is the directory within the container
- `STORAGE_NAME` - is the name of the PDS
- `SUBPATH` - is the prefix within the COS bucket to mount.
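
For illustration, a mount of a store's `novels` prefix at `/input` inside the container could look as follows (the fleet name, the store name `fleet-input-store`, and the subpath are hypothetical; the actual invocation in this tutorial is wrapped by the run scripts):

```
ibmcloud ce fleet run --name wordcount \
  --tasks-from-local-file wordcount_commands.jsonl \
  --mount-data-store /input=fleet-input-store:novels
```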
@@ -437,10 +437,10 @@ Upload the .txt files from the local data directory to Cloud Object Storage
#### Step 2 - Run the fleet
-Launch the fleet to perform `wc` on each of the novels which defines the tasks from [wordcount_commands.jsonl](./wordcount_commands.jsonl) and mounts the input and output data stores.
+Launch the fleet to perform `wc` on each of the novels. The tasks are defined in [wordcount_commands.jsonl](./wordcount_commands.jsonl), and the input and output data stores are mounted.
```
./run_wordcount
-```
+```
Confirm that you uploaded the files with `#? 1`
@@ -509,7 +509,7 @@ Download the results from the output COS bucket to `./data/output`
```
./download
-````
+```
🚀 The example was successful, if you can tell the number of words of the "Alice in Wonderland" novel 🚀
@@ -558,6 +558,16 @@ An IBM Cloud Logs instance is being setup and enabled by default during the auto

+
+
+#### Using helper scripts
+
+To iterate quickly, you can use the helper script `fleet_logs` to display fleet logs directly in your terminal. The script uses the `LOGS_URL` environment variable to target your ICL endpoint; if it is unset, the script tries to discover the endpoint of your ICL instance. Additionally, you can specify `--fleet-id`, `--since`, `--tier` and `--output` to parameterize the query. The available values are taken directly from `ibmcloud logs query --help`.
+
+```
+./fleet_logs --fleet-id <fleet-id>
+```
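+
+For example, to show the last hour of raw logs for a single fleet (replace the fleet ID placeholder; the remaining flag values shown here are the script's defaults):
+
+```
+./fleet_logs --fleet-id <fleet-id> --since 1h0m0s --tier frequent_search --output logs-raw
+```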
+
### How to customize fleet workers
> **Note:** This is an experimental feature to unlock specific use cases and might change or will be deprecated.
diff --git a/serverless-fleets/fleet_logs b/serverless-fleets/fleet_logs
new file mode 100755
index 00000000..3bc6ad9d
--- /dev/null
+++ b/serverless-fleets/fleet_logs
@@ -0,0 +1,63 @@
+#!/bin/bash
+set -eo pipefail
+
+ICL_ENDPOINT="${LOGS_URL}"
+REGION="${REGION:-eu-de}"
+NAME_PREFIX="${NAME_PREFIX:-ce-fleet-sandbox}"
+RESOURCE_GROUP="${NAME_PREFIX}--rg"
+ICL_NAME="${NAME_PREFIX}--icl"
+FLEET_ID=
+TIER="frequent_search"
+OUTPUT="logs-raw"
+SINCE="6h0m0s"
+
+SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
+source "${SCRIPT_DIR}/common.sh"
+
+target_region "$REGION" > /dev/null
+target_resource_group "$RESOURCE_GROUP" > /dev/null
+
+# Install the IBM Cloud Logs plugin if it is missing. With `set -e` a failing
+# command would abort the script before a separate `$?` check could run, so
+# the check must be part of the `if` condition itself.
+if ! ibmcloud plugin show logs > /dev/null 2>&1; then
+  ibmcloud plugin install logs
+fi
+
+if [[ -z "${ICL_ENDPOINT}" ]]; then
+ ICL_ENDPOINT="$(ibmcloud resource service-instance ${ICL_NAME} -o json | jq -r '.[].extensions.external_api')"
+fi
+
+while [[ $# -gt 0 ]]; do
+ case "$1" in
+ --fleet-id)
+ shift
+ FLEET_ID="$1"
+ ;;
+
+ --tier)
+ shift
+ TIER="$1"
+ ;;
+
+ --output)
+ shift
+ OUTPUT="$1"
+ ;;
+
+ --since)
+ shift
+ SINCE="$1"
+ ;;
+
+ *)
+ echo "Unknown flag $1"
+ exit 1
+ ;;
+ esac
+ shift
+done
+if [ -z "$FLEET_ID" ]; then
+  ibmcloud logs query --since "${SINCE}" --tier "${TIER}" --query "source logs | sort by \$m.timestamp" --service-url "${ICL_ENDPOINT}" --output "${OUTPUT}"
+else
+  ibmcloud logs query --since "${SINCE}" --tier "${TIER}" --query "filter \$d.codeengine.fleetId ~ '${FLEET_ID}' | sort by \$m.timestamp" --service-url "${ICL_ENDPOINT}" --output "${OUTPUT}"
+fi