
Using Ollama Models with ADK (Native Integration)

Model Choice

If your agent uses tools, choose an Ollama model that supports function calling.
Tool support can be verified with:

ollama show llama3.1

  Model
    architecture        llama
    parameters          8.0B
    context length      131072
    embedding length    5120
    quantization        Q4_K_M

  Capabilities
    completion
    vision
    tools

The model must list tools under Capabilities; models without tool support will not execute ADK function tools correctly.
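If you want to run the same check programmatically, for example in a startup script, the Ollama REST API exposes the model card via POST /api/show. A minimal sketch, assuming a recent Ollama server whose /api/show response includes a capabilities field (older servers may omit it):

import requests

# Query the model card from a local Ollama server.
resp = requests.post(
    "http://localhost:11434/api/show",
    json={"model": "llama3.1"},
    timeout=10,
)
resp.raise_for_status()

# Recent Ollama servers report a "capabilities" list; older ones may not.
capabilities = resp.json().get("capabilities", [])
if "tools" not in capabilities:
    raise SystemExit("llama3.1 does not advertise tool support")
print("Capabilities:", capabilities)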

To inspect or customize a model template:

ollama show --modelfile llama3.1 > model_file_to_modify

Then create a modified model:

ollama create llama3.1-modified -f model_file_to_modify
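The same export-edit-create loop can also be scripted. A small sketch that mirrors the two CLI commands above via subprocess (the file and model names match the shell example):

import subprocess

# Export the current modelfile (template, parameters, etc.) to disk.
modelfile = subprocess.run(
    ["ollama", "show", "--modelfile", "llama3.1"],
    capture_output=True, text=True, check=True,
).stdout
with open("model_file_to_modify", "w") as f:
    f.write(modelfile)

# Edit model_file_to_modify by hand or in code, then rebuild the model.
subprocess.run(
    ["ollama", "create", "llama3.1-modified", "-f", "model_file_to_modify"],
    check=True,
)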

Native Ollama Provider in ADK

ADK includes a native Ollama model class that communicates directly with the Ollama server at:

http://localhost:11434/api/chat

No LiteLLM provider, API keys, or OpenAI proxy endpoints are needed.
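For reference, the request shape the native provider sends to that endpoint looks like the sketch below. You never need to do this yourself; it only illustrates what ADK speaks on the wire:

import requests

# A single non-streaming chat turn against the Ollama chat endpoint.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "Roll a 6-sided die."}],
        "stream": False,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])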

Example agent

import random
from google.adk.agents.llm_agent import Agent
from google.adk.models.ollama_llm import Ollama


def roll_die(sides: int) -> int:
    """Roll a die and return the result.

    Args:
        sides: The number of sides on the die.
    """
    return random.randint(1, sides)


def check_prime(numbers: list) -> str:
    """Check if a given list of values contains prime numbers.

    The input may include non-integer values produced by the LLM.
    """
    primes = set()

    for number in numbers:
        try:
            number = int(number)
        except (ValueError, TypeError):
            continue

        if number <= 1:
            continue

        for i in range(2, int(number**0.5) + 1):
            if number % i == 0:
                break
        else:
            primes.add(number)

    return (
        "No prime numbers found."
        if not primes
        else f"{', '.join(str(n) for n in sorted(primes))} are prime numbers."
    )


root_agent = Agent(
    model=Ollama(model="llama3.1"),
    name="dice_agent",
    description="Agent that rolls dice and checks primes using native Ollama.",
    instruction="Always use the provided tools.",
    tools=[roll_die, check_prime],
)
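To exercise the agent outside the web UI, you can drive it with an ADK runner. A minimal sketch, assuming the InMemoryRunner API from google.adk.runners (names such as dice_app and user1 are illustrative; check the runner signatures against your ADK version):

import asyncio
from google.adk.runners import InMemoryRunner
from google.genai import types

runner = InMemoryRunner(agent=root_agent, app_name="dice_app")

# Session creation is async in ADK's session service.
session = asyncio.run(
    runner.session_service.create_session(app_name="dice_app", user_id="user1")
)

# run() yields events as the agent calls tools and produces text.
message = types.Content(role="user", parts=[types.Part(text="Roll a 20-sided die.")])
for event in runner.run(user_id="user1", session_id=session.id, new_message=message):
    if event.content and event.content.parts and event.content.parts[0].text:
        print(event.content.parts[0].text)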

Connecting to a remote Ollama server

Default Ollama endpoint:

http://localhost:11434

Override using an environment variable:

export OLLAMA_API_BASE="http://192.168.1.20:11434"

Or pass explicitly in code:

Ollama(model="llama3.1", host="http://192.168.1.20:11434")
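The two mechanisms combine naturally: read the environment variable if it is set and fall back to the default endpoint otherwise. A small sketch reusing the host keyword from the example above:

import os
from google.adk.models.ollama_llm import Ollama

# Prefer OLLAMA_API_BASE when set; otherwise use the local default.
host = os.environ.get("OLLAMA_API_BASE", "http://localhost:11434")
model = Ollama(model="llama3.1", host=host)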

Running the Example with ADK Web

Start the ADK Web UI:

adk web hello_ollama_native

The web UI is served in your browser (by default at http://localhost:8000), where you can test the agent's tool calls interactively.