If your agent uses tools, choose an Ollama model that supports function calling.
Tool support can be verified with `ollama show`:

```shell
ollama show llama3.1
  Model
    architecture        llama
    parameters          8.0B
    context length      131072
    embedding length    5120
    quantization        Q4_K_M

  Capabilities
    completion
    vision
    tools
```
Models must list `tools` under Capabilities. Models without tool support will not execute ADK functions correctly.
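If you want to gate agent startup on this check programmatically, the `ollama show` output can be parsed for the `tools` capability. This is a sketch, not part of ADK; `has_tools_capability` and `supports_tools` are hypothetical helpers:

```python
import subprocess


def has_tools_capability(show_output: str) -> bool:
    """Report whether 'tools' appears under the Capabilities section of `ollama show` output."""
    in_caps = False
    for line in show_output.splitlines():
        stripped = line.strip()
        if stripped.lower() == "capabilities":
            in_caps = True
            continue
        if in_caps:
            # A non-indented line means a new top-level section has started.
            if line and not line[0].isspace():
                break
            if stripped == "tools":
                return True
    return False


def supports_tools(model: str) -> bool:
    """Run `ollama show <model>` and check its Capabilities section (requires a local Ollama install)."""
    out = subprocess.run(
        ["ollama", "show", model], capture_output=True, text=True, check=True
    )
    return has_tools_capability(out.stdout)
```
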
To inspect or customize a model template:

```shell
ollama show --modelfile llama3.1 > model_file_to_modify
```

Then create a modified model:

```shell
ollama create llama3.1-modified -f model_file_to_modify
```
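The exported file is a plain-text Modelfile whose template and parameters can be edited before rebuilding. A minimal illustration of the format (the actual exported contents depend on the model; the values below are placeholders):

```
FROM llama3.1
PARAMETER temperature 0.7
SYSTEM """You are a tool-using assistant."""
```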
ADK includes a native Ollama model class that communicates directly with the Ollama server at `http://localhost:11434/api/chat`. No LiteLLM provider, API keys, or OpenAI proxy endpoints are needed.
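To see what travels over the wire, a raw request to that endpoint can be sketched with the standard library alone. This is an illustration of the Ollama `/api/chat` payload shape, not ADK's internal client; `build_chat_request` and `chat` are hypothetical helpers, and the call at the bottom assumes a local Ollama server is running:

```python
import json
import urllib.request


def build_chat_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Build a JSON payload for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }


def chat(host: str, payload: dict) -> dict:
    """POST the payload to <host>/api/chat and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Requires a running server:
# reply = chat("http://localhost:11434", build_chat_request("llama3.1", "Roll a d20."))
```
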
Define the agent (for example in `hello_ollama_native/agent.py`):

```python
import random

from google.adk.agents.llm_agent import Agent
from google.adk.models.ollama_llm import Ollama


def roll_die(sides: int) -> int:
    """Roll a die with the given number of sides."""
    return random.randint(1, sides)


def check_prime(numbers: list) -> str:
    """Check if a given list of values contains prime numbers.

    The input may include non-integer values produced by the LLM.
    """
    primes = set()
    for number in numbers:
        try:
            number = int(number)
        except (ValueError, TypeError):
            continue
        if number <= 1:
            continue
        for i in range(2, int(number**0.5) + 1):
            if number % i == 0:
                break
        else:
            primes.add(number)
    return (
        "No prime numbers found."
        if not primes
        else f"{', '.join(str(n) for n in sorted(primes))} are prime numbers."
    )


root_agent = Agent(
    model=Ollama(model="llama3.1"),
    name="dice_agent",
    description="Agent that rolls dice and checks primes using native Ollama.",
    instruction="Always use the provided tools.",
    tools=[roll_die, check_prime],
)
```

The default Ollama endpoint is `http://localhost:11434`.
Override it using an environment variable:

```shell
export OLLAMA_API_BASE="http://192.168.1.20:11434"
```

Or pass the host explicitly in code:

```python
Ollama(model="llama3.1", host="http://192.168.1.20:11434")
```

Start the ADK Web UI:

```shell
adk web hello_ollama_native
```
The interface will be available in your browser, allowing interactive testing of tool calls.