A silly but configurable bot for StormChat that talks to LLMs hosted on Ollama.
Setup should be fairly easy:
Set the parameters in config.txt. The defaults should work, with the exception of username, password, and possibly chatBackend (StormChat recently changed its domain name).
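For reference, a config.txt might look something like this. Only username, password, and chatBackend are parameter names mentioned above; the values are placeholders, so check the shipped config.txt for the real defaults:

```
# Hypothetical sketch of config.txt; values are placeholders.
username=my-bot-account
password=changeme
chatBackend=wss://chat.example.com/socket
```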
Compile from source.
Once that's done, start the Ollama server with `ollama serve`. With Ollama running, you can launch the executable you compiled.
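For example (the binary name here is a guess; use whatever your build actually produced):

```
# Start the Ollama server (it listens on localhost:11434 by default)
ollama serve

# In another terminal, run the compiled bot
./stormyllama
```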
This project was tested on Fedora, where it works decently well.
Once connected to the chat WebSocket and the Ollama server, StormyLlama replies to every message sent in the chatroom.
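To illustrate the general pattern (this is a minimal sketch, not StormyLlama's actual source: the chat URL, message format, and model name are assumptions, while the `/api/generate` call is Ollama's documented REST API):

```python
# Sketch of a reply-to-everything loop: read a chat message over a
# WebSocket, ask the local Ollama server for a completion, post it back.
import json
import requests
import websocket  # pip install websocket-client

CHAT_URL = "wss://chat.example.com/socket"          # hypothetical chatBackend value
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default address

def llm_reply(prompt: str) -> str:
    """Request a non-streaming completion from the local Ollama server."""
    resp = requests.post(OLLAMA_URL, json={
        "model": "llama3",   # assumed model name
        "prompt": prompt,
        "stream": False,
    })
    resp.raise_for_status()
    return resp.json()["response"]

ws = websocket.create_connection(CHAT_URL)
while True:
    incoming = json.loads(ws.recv())      # assumed JSON message format
    reply = llm_reply(incoming["text"])
    ws.send(json.dumps({"text": reply}))  # reply to every single message
```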
Unfortunately, StormyLlama doesn't keep any message context (each reply sees only the most recent message), and it probably never will.