Ollama
Configure Danswer to use Ollama
Refer to Model Configs for how to set the environment variables for your particular deployment.
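As a rough sketch, assuming the `GEN_AI_*` variable names described on the Model Configs page, a Docker-based deployment pointing at a local Ollama server might set something along these lines (the exact values are illustrative):

```
# Illustrative values; check the Model Configs page for the
# authoritative variable names for your deployment.
GEN_AI_MODEL_PROVIDER=ollama
GEN_AI_MODEL_VERSION=llama2
# From inside a Docker container, host.docker.internal reaches
# the Ollama server running on the host machine.
GEN_AI_API_ENDPOINT=http://host.docker.internal:11434
```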
Note: While we support self-hosted LLMs, you will generally get significantly better responses from a more powerful model such as GPT-4.
What is Ollama
Ollama provides an easy way to host LLMs locally and to provide a REST API for the model. Refer to the following resources to get started:
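- Ollama GitHub repository: https://github.com/ollama/ollama
- Ollama website: https://ollama.com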
Once you start a model with a command like `ollama run llama2`, you can verify that the API works with a curl request:
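For example, a minimal check against Ollama's default port (11434); the prompt text is arbitrary:

```
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

With `"stream": false`, the server returns a single JSON object whose `response` field contains the model's completion; if the request succeeds, the model is up and reachable.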