diff --git a/modules/ROOT/pages/af-agent-networks.adoc b/modules/ROOT/pages/af-agent-networks.adoc
index 46b6d4e50..3d2a9f4ac 100644
--- a/modules/ROOT/pages/af-agent-networks.adoc
+++ b/modules/ROOT/pages/af-agent-networks.adoc
@@ -128,7 +128,7 @@ image::agent-fabric-architecture.png[Agent Fabric showing agents and MCP servers
 
 [[llm-support]]
 == Large Language Models
-Agent network brokers support the latest Gemini and OpenAI models.
+Agent network brokers support the latest Gemini and OpenAI models, as well as LLM Proxies.
 * Azure OpenAI and OpenAI API:
 +
@@ -138,6 +138,7 @@ Agent network brokers support the latest Gemini and OpenAI models.
 ** GPT-5 nano
 ** GPT-5
 +
+For more information about these models, see the https://developers.openai.com/api/docs/models[OpenAI] documentation.
 * Gemini API:
 +
 ** Gemini 2.5 Pro
@@ -145,8 +146,12 @@ Agent network brokers support the latest Gemini and OpenAI models.
 ** Gemini 2.5 Flash
 ** Gemini 3 Flash
 ** Gemini 3 Pro
++
+For more information about these models, see the https://ai.google.dev/gemini-api/docs/models[Gemini] documentation.
-For more information about these models, see the https://developers.openai.com/api/docs/models[OpenAI] and https://ai.google.dev/gemini-api/docs/models[Gemini] documentation.
+* LLM Proxy:
++
+LLM Proxy provides a unified endpoint for multiple LLM providers, including Gemini, OpenAI, Bedrock (Anthropic Claude models), and NVIDIA Nemotron. To find the supported models, see xref:flex-gateway-llm-proxy.adoc#supported-llm-providers[LLM Proxy Supported LLM Providers]. That table details the requirements and the recommended models.
diff --git a/modules/ROOT/pages/af-project-files.adoc b/modules/ROOT/pages/af-project-files.adoc
index bd8b3736d..5323fff21 100644
--- a/modules/ROOT/pages/af-project-files.adoc
+++ b/modules/ROOT/pages/af-project-files.adoc
@@ -230,7 +230,7 @@ The `spec` element has these properties.
 `llm`
-The value of this section is a reference to one of the LLMs defined in Anypoint Exchange or in the `llmProviders` section of `agent-network.yaml`. Because it's a reference, you can choose to share the same LLM across all the brokers in your agent network. Or, you can have different brokers use different LLMs to better suit their tasks.
+The value of this section is a reference to one of the LLMs defined in Anypoint Exchange or in the `llmProviders` section of `agent-network.yaml`. Because it's a reference, you can choose to share the same LLM across all the brokers in your agent network. Or, you can have different brokers use different LLMs to better suit their tasks. If you use an LLM Proxy, configure it as either an OpenAI or a Gemini LLM, depending on which API format the proxy exposes.
 
 For more information about supported LLMs, see xref:af-agent-networks.adoc#llm-support[Large Language Models].
@@ -745,7 +745,7 @@ connections:
     spec:
       url: https://api.openai.com/v1/
       configuration:
-        apiKey: ${openai.apiKey}
+        apiKey: ${openai.apiKey} # Define the API key of an LLM Proxy as <>:<>
   talent-pool-mcp:
     kind: mcp
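
The changed text above says an LLM Proxy is configured as either an OpenAI or a Gemini LLM. A minimal sketch of what that might look like in the `connections` section of `agent-network.yaml` follows; the connection name, `kind` value, proxy URL, and property placeholder are illustrative assumptions, not values from this diff:

```yaml
# Hypothetical connection entry routing broker LLM calls through an LLM Proxy
# that exposes an OpenAI-compatible API. All names and the URL are assumed.
connections:
  llm-proxy:                  # assumed connection name
    kind: openai              # assumed: the proxy is declared as an OpenAI LLM
    spec:
      url: https://flex-gateway.example.com/llm-proxy/v1/  # assumed proxy endpoint
      configuration:
        apiKey: ${llmProxy.apiKey}  # assumed property; see the comment in the diff
```

Under these assumptions, a broker's `llm` section would reference `llm-proxy` the same way it references a direct OpenAI connection, so swapping a direct provider for a proxy requires no change to the brokers themselves.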