What is Ollama?
Ollama provides an easy way to host LLMs locally and exposes a REST API for interacting with the model. Once you start a model with a command like `ollama run llama2`, you can verify that the API works with a curl request:
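A minimal sketch of such a request, assuming Ollama is running on its default port (11434) and the `llama2` model has been pulled; the prompt text is illustrative:

```shell
# Send a single, non-streaming generation request to the local Ollama server.
# "stream": false returns one complete JSON response instead of chunked output.
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

The response is a JSON object whose `response` field contains the generated text, along with metadata such as token counts and timing.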