From da4a8302d3b3c7cf490f4872a1f3bd251272a0e9 Mon Sep 17 00:00:00 2001
From: Glenn
Date: Mon, 11 May 2026 08:46:18 -0400
Subject: [PATCH 1/3] Update af-agent-networks.adoc

---
 modules/ROOT/pages/af-agent-networks.adoc | 9 +++++++--
 1 file changed, 7 insertions(+), 2 deletions(-)

diff --git a/modules/ROOT/pages/af-agent-networks.adoc b/modules/ROOT/pages/af-agent-networks.adoc
index 46b6d4e50..d0e5f380c 100644
--- a/modules/ROOT/pages/af-agent-networks.adoc
+++ b/modules/ROOT/pages/af-agent-networks.adoc
@@ -128,7 +128,7 @@ image::agent-fabric-architecture.png[Agent Fabric showing agents and MCP servers
 [[llm-support]]
 == Large Language Models
 
-Agent network brokers support the latest Gemini and OpenAI models.
+Agent network brokers support the latest Gemini and OpenAI models and LLM Proxies.
 
 * Azure OpenAI and OpenAI API:
 +
@@ -138,6 +138,7 @@
 ** GPT-5 nano
 ** GPT-5
 +
+For more information about these models, see the https://developers.openai.com/api/docs/models[OpenAI] documentation.
 * Gemini API:
 +
 ** Gemini 2.5 Pro
@@ -145,8 +146,12 @@ Agent network brokers support the latest Gemini and OpenAI models.
 ** Gemini 2.5 Flash
 ** Gemini 3 Flash
 ** Gemini 3 Pro
++
+For more information about these models, see the https://ai.google.dev/gemini-api/docs/models[Gemini] documentation.
 
-For more information about these models, see the https://developers.openai.com/api/docs/models[OpenAI] and https://ai.google.dev/gemini-api/docs/models[Gemini] documentation.
+* LLM Proxy:
++
+LLM Proxy provide a unified endpoint for multiple LLM providers including Gemini, OpenAI, Bedrock (Anthropic Claude models), and NVIDIA Nemotron. To find the supported models, see xref:flex-gateway-llm-proxy.adoc#supported-llm-providers[LLM Proxy Supported LLM Providers].
 
 This table details the requirements and recommended models.
From 9383739a5d1fb7776bedde65912b9c2c24662580 Mon Sep 17 00:00:00 2001
From: Glenn
Date: Tue, 12 May 2026 21:23:13 -0400
Subject: [PATCH 2/3] Edits

---
 modules/ROOT/pages/af-agent-networks.adoc | 2 +-
 modules/ROOT/pages/af-project-files.adoc  | 4 ++--
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/modules/ROOT/pages/af-agent-networks.adoc b/modules/ROOT/pages/af-agent-networks.adoc
index d0e5f380c..3d2a9f4ac 100644
--- a/modules/ROOT/pages/af-agent-networks.adoc
+++ b/modules/ROOT/pages/af-agent-networks.adoc
@@ -151,7 +151,7 @@ For more information about these models, see the https://ai.google.dev/gemini-ap
 
 * LLM Proxy:
 +
-LLM Proxy provide a unified endpoint for multiple LLM providers including Gemini, OpenAI, Bedrock (Anthropic Claude models), and NVIDIA Nemotron. To find the supported models, see xref:flex-gateway-llm-proxy.adoc#supported-llm-providers[LLM Proxy Supported LLM Providers].
+LLM Proxy provides a unified endpoint for multiple LLM providers including Gemini, OpenAI, Bedrock (Anthropic Claude models), and NVIDIA Nemotron. To find the supported models, see xref:flex-gateway-llm-proxy.adoc#supported-llm-providers[LLM Proxy Supported LLM Providers].
 
 This table details the requirements and recommended models.
 
diff --git a/modules/ROOT/pages/af-project-files.adoc b/modules/ROOT/pages/af-project-files.adoc
index bd8b3736d..727ab9c73 100644
--- a/modules/ROOT/pages/af-project-files.adoc
+++ b/modules/ROOT/pages/af-project-files.adoc
@@ -230,7 +230,7 @@ The `spec` element has these properties.
 
 `llm`
 
-The value of this section is a reference to one of the LLMs defined in Anypoint Exchange or in the `llmProviders` section of `agent-network.yaml`. Because it's a reference, you can choose to share the same LLM across all the brokers in your agent network. Or, you can have different brokers use different LLMs to better suit their tasks.
+The value of this section is a reference to one of the LLMs defined in Anypoint Exchange or in the `llmProviders` section of `agent-network.yaml`. Because it's a reference, you can choose to share the same LLM across all the brokers in your agent network. Or, you can have different brokers use different LLMs to better suit their tasks. If using an LLM Proxy, configure the LLM Proxy as either an OpenAI or Gemini LLM depending on the proxy's format.
 
 For more information about supported LLMs, see xref:af-agent-networks.adoc#llm-support[Large Language Models].
 
@@ -745,7 +745,7 @@ connections:
     spec:
       url: https://api.openai.com/v1/
       configuration:
-        apiKey: ${openai.apiKey}
+        apiKey: ${openai.apiKey} # Define an LLM Proxy API key as <>:<>
   talent-pool-mcp:
     kind: mcp

From b4e847156ac3d8df9358df268b81a356bf838044 Mon Sep 17 00:00:00 2001
From: Glenn
Date: Tue, 12 May 2026 21:24:08 -0400
Subject: [PATCH 3/3] Update af-project-files.adoc

---
 modules/ROOT/pages/af-project-files.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/modules/ROOT/pages/af-project-files.adoc b/modules/ROOT/pages/af-project-files.adoc
index 727ab9c73..5323fff21 100644
--- a/modules/ROOT/pages/af-project-files.adoc
+++ b/modules/ROOT/pages/af-project-files.adoc
@@ -745,7 +745,7 @@ connections:
     spec:
       url: https://api.openai.com/v1/
       configuration:
-        apiKey: ${openai.apiKey} # Define an LLM Proxy API key as <>:<>
+        apiKey: ${openai.apiKey} # Define the API key of an LLM Proxy as <>:<>
   talent-pool-mcp:
     kind: mcp