> I really appreciate the interest in my project, but it's going to take some time for me to go through everything.
> Hope you understand :) Please open Issues (not DMs / emails); I promise I will get to them ASAP.
>
> Docker / networking things are a bit bumpy at the moment, so please open an issue if you think something should work but isn't. There are a lot of different host system configurations out there, and I can't test them all myself.

## What it is

LLocalSearch is a search aggregator that runs completely locally, using LLM agents. You ask a question, and a chain of LLMs works out the answer while you watch the agents' progress, followed by the final answer. No OpenAI or Google API keys are needed.

Now with follow-up questions:

<https://github.com/nilsherzig/LLocalSearch/assets/72463901/8323be25-de2a-4ddf-853b-4a01557b5599>

![image](https://github.com/nilsherzig/LLocalSearch/assets/72463901/9f6497aa-8047-4d11-9a12-66aff65d3faa)

## Features

- 🕵️ Completely local (no need for API keys)
- 💸 Runs on "low end" LLM Hardware (demo video uses a 7b model)
- 🤓 Progress logs, allowing for a better understanding of the search process
- 🤔 Follow-up questions
- 🌐 Web interface, allowing for easy access from any device
- 💮 Handcrafted UI with light and dark mode

## Status

This project is still in its very early days. Expect some bugs.

## How it works

Please read [infra](https://github.com/nilsherzig/LLocalSearch/issues/17) to get the most up-to-date idea.

### Requirements

- A running [Ollama](https://ollama.com/) server, reachable from the container
- A GPU is not needed, but recommended
- Docker Compose
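Before bringing the stack up, it can help to confirm that the Ollama server actually answers on its API port. A minimal check, assuming Ollama's default address `http://localhost:11434` (adjust the URL if your server runs elsewhere, or run the same command from inside a container to test container-to-host reachability):

```shell
# Probe the Ollama HTTP API; /api/tags lists the locally available models.
# An unreachable server prints the second message instead of aborting.
OLLAMA_URL="http://localhost:11434"
curl --silent --fail "$OLLAMA_URL/api/tags" > /dev/null \
  && echo "Ollama reachable at $OLLAMA_URL" \
  || echo "Ollama NOT reachable at $OLLAMA_URL"
```

If the check succeeds on the host but fails inside a container, the container most likely cannot resolve or route to the host address — that is exactly the kind of networking issue worth opening an Issue about.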

### Run the latest release
```bash
git clone https://github.com/nilsherzig/LLocalSearch.git
cd ./LLocalSearch
docker-compose up
```

🎉 You should now be able to open the web interface on <http://localhost:3000>. Nothing else is exposed by default.
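If port 3000 is already taken on your machine, Docker Compose's standard override mechanism can remap it without editing the project files. A sketch, assuming the web UI service is named `frontend` — check the project's `docker-compose.yaml` for the actual service name:

```yaml
# docker-compose.override.yml — picked up automatically by docker-compose
services:
  frontend:            # hypothetical service name, match the real compose file
    ports:
      - "8080:3000"    # host port 8080 -> container port 3000
```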

### Run the current git version

Newer features, but potentially less stable.
