Ollama is not currently supported. Need someone to set up the scaffolding for Ollama as an LLM provider. It must support structured outputs!
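A minimal sketch of what the scaffolding could look like, assuming the provider talks to Ollama's local `/api/chat` endpoint and uses its `format` field (available in Ollama v0.5+) to request schema-constrained output. The function names (`build_chat_request`, `parse_structured_reply`) and the provider shape are hypothetical, not an existing interface in this project:

```python
import json

# Default local Ollama endpoint (assumption: standard install, port 11434).
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"


def build_chat_request(model: str, prompt: str, schema: dict) -> dict:
    """Build an Ollama /api/chat payload requesting structured output.

    Ollama accepts a JSON schema in the "format" field and constrains
    the model's reply to match it.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "format": schema,  # JSON schema constraining the model's output
        "stream": False,
    }


def parse_structured_reply(response_body: dict) -> dict:
    """Extract the JSON object from a non-streaming chat response.

    The structured reply arrives as a JSON string in message.content.
    """
    return json.loads(response_body["message"]["content"])


# Example: request a reply shaped like {"capital": "..."}.
schema = {
    "type": "object",
    "properties": {"capital": {"type": "string"}},
    "required": ["capital"],
}
payload = build_chat_request(
    "llama3.2", "What is the capital of France?", schema
)
```

Sending `payload` to `OLLAMA_CHAT_URL` via HTTP POST and running the response through `parse_structured_reply` would yield a plain dict matching the schema; the HTTP call itself is left to whatever client the rest of the codebase uses.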