diff --git a/README.md b/README.md
index 6fee7bb..5f361ef 100644
--- a/README.md
+++ b/README.md
@@ -5,22 +5,21 @@
 > I really appreciate the interest in my project, but its going to take some time for me to go through everything.
 > Hope you understand :) Please open Issues (not dms / emails), i promise i will get to them asap
 >
-> Docker / Networking things are a bit bumpy at the moment, please just open an issue if you think, that something should work but isn't working. There are a lot of different host system configurations out there, and i can't test them by myself. 
+> Docker / Networking things are a bit bumpy at the moment, please just open an issue if you think, that something should work but isn't working. There are a lot of different host system configurations out there, and i can't test them by myself.
 
 ## What it is
 
 LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.
 
-Now with follow-up questions: 
-
-https://github.com/nilsherzig/LLocalSearch/assets/72463901/8323be25-de2a-4ddf-853b-4a01557b5599
+Now with follow-up questions:
+
 ![image](https://github.com/nilsherzig/LLocalSearch/assets/72463901/9f6497aa-8047-4d11-9a12-66aff65d3faa)
 
-## Features 
+## Features
 
-- 🕵️ Completely local (no need for API keys) 
+- 🕵️ Completely local (no need for API keys)
 - 💸 Runs on "low end" LLM Hardware (demo video uses a 7b model)
 - 🤓 Progress logs, allowing for a better understanding of the search process
 - 🤔 Follow-up questions
@@ -29,11 +28,11 @@ https://github.com/nilsherzig/LLocalSearch/assets/72463901/8323be25-de2a-4ddf-85
 - 🌐 Web interface, allowing for easy access from any device
 - 💮 Handcrafted UI with light and dark mode
 
-## Status 
+## Status
 
-This project is still in its very early days. Expect some bugs. 
+This project is still in its very early days. Expect some bugs.
 
-## How it works 
+## How it works
 
 Please read [infra](https://github.com/nilsherzig/LLocalSearch/issues/17) to get the most up-to-date idea.
 
@@ -42,7 +41,7 @@ Please read [infra](https://github.com/nilsherzig/LLocalSearch/issues/17) to get
 ### Requirements
 
 - A running [Ollama](https://ollama.com/) server, reachable from the container
-  - GPU is not needed, but recommended 
+  - GPU is not needed, but recommended
 - Docker Compose
 
 ### Run the latest release
@@ -56,9 +55,9 @@ cd ./LLocalSearch
 docker-compose up
 ```
 
-🎉 You should now be able to open the web interface on http://localhost:3000. Nothing else is exposed by default.
+🎉 You should now be able to open the web interface on <http://localhost:3000>. Nothing else is exposed by default.
 
-### Run the current git version 
+### Run the current git version
 
 Newer features, but potentially less stable.
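
Two notes from review, placed below the patch so the hunks above stay byte-for-byte applyable.

First, on the Requirements hunk: the README asks for an Ollama server "reachable from the container", which is exactly the "Docker / Networking" pain point the maintainer's note mentions, but no hunk shows the wiring or the compose file's env var names. A minimal sketch of one common setup, assuming Ollama's default port 11434 and Docker's `host-gateway` mapping (both are assumptions, adjust to your host):

```bash
# Sketch only: expose a host-side Ollama server to Docker containers.
# Assumes Ollama's default port 11434 and a Linux host; adjust as needed.

# Bind Ollama to all interfaces instead of only 127.0.0.1:
OLLAMA_HOST=0.0.0.0 ollama serve

# From a throwaway container, check that the host's Ollama API answers
# via the host-gateway alias (prints the local model list on success):
docker run --rm --add-host=host.docker.internal:host-gateway \
  curlimages/curl -s http://host.docker.internal:11434/api/tags
```

If the check prints JSON, containers can reach the host's Ollama at `http://host.docker.internal:11434` (or your host's LAN IP); which variable carries that address isn't visible in this diff, so check the compose file.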
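
Second, on the release hunk: only the tail of the quick-start block (`docker-compose up` and the closing fence) falls inside the diff context, so for readers of this patch, the full flow is roughly as follows, with the clone URL assumed from the repository links that appear elsewhere in the README:

```bash
# Quick-start flow implied by the patched section (clone URL assumed
# from the project links in the README; the clone step is outside this hunk).
git clone https://github.com/nilsherzig/LLocalSearch.git
cd ./LLocalSearch
docker-compose up
# The web UI should then answer on http://localhost:3000;
# per the README, nothing else is exposed by default.
```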