snoutiscratch/StormyyyLlama

StormyyyLlama

A stupid, configurable bot for StormChat! that connects to LLMs hosted on Ollama.


Setup

Setup should be quick: fill in the configuration, compile, and run.

Parameters

Set parameters in config.txt. The defaults should work, with the exception of username, password, and possibly chatBackend (StormChat recently underwent a domain name change).
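Since username, password, and chatBackend are the only parameters named here, a config.txt might look something like the following. The key=value syntax and all values shown are illustrative assumptions, not the bot's actual format:

```
username=StormyyyLlama
password=changeme
chatBackend=wss://chat.example.org/ws
```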

Running the bot

Compile from source.

Once that's done, run ollama serve to start the Ollama server.

With Ollama running, launch the executable you compiled.

This project was tested on Fedora, where it works decently well.

Usage

Once connected to the websocket and the Ollama server, StormyyyLlama will reply to every message sent in the chatroom.

Unfortunately, StormyyyLlama doesn't keep message context, and probably never will.
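The per-message flow described above could be sketched roughly as follows. This is a hypothetical Python sketch, not the bot's actual implementation: the on_chat_message hook and the model name are assumptions, and only Ollama's standard /api/generate HTTP endpoint is taken from Ollama itself. The websocket wiring to StormChat is omitted.

```python
# Hypothetical sketch of StormyyyLlama's per-message flow.
# Assumptions: Ollama on its default port (11434); model name and
# the on_chat_message hook are illustrative, not the real bot.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def make_payload(model: str, prompt: str) -> dict:
    # Non-streaming request body for Ollama's generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    # POST the prompt to Ollama and return the generated text.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(make_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

def on_chat_message(text: str) -> str:
    # Called once per incoming chat message. No history is passed,
    # so each reply is generated without context (as noted above).
    return ask_ollama("llama3", text)
```

Because no history is threaded into the prompt, every reply is independent, which matches the bot's stated lack of message context.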
