LiteLLM support #6

@dhruv-anand-aintech

Description

Hi, thanks a lot for this demo repo.
It would be great if we didn't need an OpenAI key to run it.
Could you consider taking a model name as input and passing it to a litellm.completion() call in place of the openai call, using the same message format?

That would let us plug in our own LLMs (over 100 options!): https://litellm.vercel.app/docs
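A minimal sketch of what this could look like, assuming the repo currently calls the OpenAI chat completions API directly; LiteLLM's `litellm.completion()` accepts the same messages format and routes the request to whichever provider the model name selects. The wrapper function and the example model strings below are illustrative, not taken from the repo.

```python
import litellm


def chat(model: str, messages: list[dict]) -> str:
    """Route a chat request through LiteLLM instead of calling OpenAI directly.

    `model` can be any provider/model string LiteLLM supports, e.g.
    "gpt-4o-mini", "anthropic/claude-3-haiku-20240307", or "ollama/llama3"
    (illustrative examples).
    """
    # litellm.completion() mirrors the OpenAI chat-completions signature,
    # so existing message dicts can be passed through unchanged.
    response = litellm.completion(model=model, messages=messages)
    return response.choices[0].message.content


# Example usage (model name and prompt are illustrative):
print(chat("gpt-4o-mini", [{"role": "user", "content": "Hello!"}]))
```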
