1. Download and install Ollama from https://ollama.com
2. Install and run llama3 using Ollama (see the full list of models at https://ollama.com/library):

   ```
   ollama run llama3
   ```

3. Set the following configuration so the tool uses the local Ollama model:

   ```
   ai-git-commit config --model=ollama/...
   ai-git-commit config --ollama_url=http://localhost:11434/api/chat
   ```

   - make sure you prefix the model with `ollama/`
   - the default value for `ollama_url` is `http://localhost:11434/api/chat`
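Before pointing the tool at Ollama, you can confirm the local endpoint actually responds. A quick sanity check, assuming Ollama is running on its default port and the llama3 model has been pulled (the request shape follows Ollama's chat API):

```shell
# Send a minimal chat request to the same URL the tool will use.
# "stream": false returns a single JSON response instead of a token stream.
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [{ "role": "user", "content": "hello" }],
  "stream": false
}'
```

If this returns a JSON reply rather than a connection error, the `ollama_url` value above is correct for your setup.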