At setup you can select any model you already have installed with ollama.
But later on, if you have selected an unsupported model, a message will be displayed:
Check if g:ollama_host=http://localhost:11434 is correct.
After checking the plugin doc, you can find a log file that contains:
OllamaLogger - ERROR - Config file /home/mathieu/config/.vim/bundle/vim-ollama/python/configs/mistral.json not found
Filtering the selection list down to the models that are actually supported would help reduce this tedious first setup :)
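Since the error shows the plugin looks up a `<model>.json` config file under `python/configs/`, the filtering could be as simple as intersecting the installed model names with the available config basenames. A minimal sketch (the function name and the `model:tag` → `model.json` mapping are assumptions based on the log line above, not the plugin's actual code):

```python
def supported_models(installed, config_files):
    """Keep only installed models that have a matching plugin config.

    installed:    model names as reported by ollama, e.g. "mistral:latest"
    config_files: filenames in the plugin's configs dir, e.g. "codellama.json"
    """
    # "mistral:latest" is assumed to map to "mistral.json" (base name before ':')
    configs = {f.rsplit(".", 1)[0] for f in config_files if f.endswith(".json")}
    return [m for m in installed if m.split(":", 1)[0] in configs]
```

The installed-model list could come from `ollama list` (or the `/api/tags` endpoint), and the config filenames from an `os.listdir()` on the configs directory.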