Bring Your Own LLM
Savvy works with any LLM that exposes an OpenAI-compatible API.
Let’s modify Savvy’s config file to use codellama:13b running locally via Ollama.
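First, make sure the model is available locally. Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1` by default:

```bash
# Pull the model, then start the Ollama server if it isn't already running
ollama pull codellama:13b
ollama serve
```

Then point Savvy at that endpoint. The config keys below are illustrative assumptions for the sketch, not Savvy’s confirmed schema — check the config file itself for the exact field names and location:

```json
{
  "llm_base_url": "http://localhost:11434/v1",
  "llm_model": "codellama:13b"
}
```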
Save your changes, and Savvy’s CLI will immediately start routing all LLM calls to the configured model and URL.
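To confirm the switch, run any LLM-backed Savvy command, such as `savvy ask` (the prompt below is just an example); the response will now be generated by codellama:13b:

```bash
# This request is served by the local Ollama model, not a hosted provider
savvy ask "find all markdown files modified in the last week"
```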