A series of experiments with local LLMs
- Run Mistral 7B locally
- Run Llama 2 locally
- Run custom prompts with Llama 2
- Run API calls against Llama/Mistral — done, with caveats
- Create a chat UI similar to this
- Custom prompts + temperatures
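A minimal sketch of the API-call experiments above, assuming a local Ollama server on its default port (11434) and its `/api/generate` endpoint; the model names and prompt are placeholders. The payload builder is separated out so the request shape (including the `temperature` option) is easy to inspect without a running server.

```python
import json
import urllib.request

# Assumes a local Ollama server; adjust host/port if yours differs.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str, temperature: float = 0.8) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,          # e.g. "mistral" or "llama2"
        "prompt": prompt,
        "stream": False,         # return one complete response instead of chunks
        "options": {"temperature": temperature},
    }

def generate(model: str, prompt: str, temperature: float = 0.8) -> str:
    """Send a single non-streaming generation request and return the response text."""
    body = json.dumps(build_generate_request(model, prompt, temperature)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (requires a running server with the model pulled):
#   generate("mistral", "Explain sampling temperature in one sentence.", temperature=0.2)
```

Lower temperatures (e.g. 0.2) make output more deterministic; higher values make it more varied, which is what the custom prompts + temperatures experiments compare.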