> [!IMPORTANT]
> I've pushed some important performance upgrades and hotfixes to the container registry. Please run `docker-compose pull` to update your containers.
>
> 05.04.2024: the embedding model has been switched; please update your containers. I will add an update notification to the web UI in the future :).
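For example, updating an existing deployment (assuming the default compose setup) looks like this:

```bash
# pull the latest images and recreate the running containers
docker-compose pull
docker-compose up -d
```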
LLocalSearch is a completely local search aggregator powered by LLM agents. The user asks a question and the system uses a chain of LLMs to find the answer; the progress of the agents and the final answer are streamed to the user. No OpenAI or Google API keys are needed.
Now with follow-up questions:
*(demo video: demo.mp4)*
- 🕵️ Completely local (no need for API keys)
- 💸 Runs on "low end" LLM Hardware (demo video uses a 7b model)
- 🤓 Progress logs, allowing for a better understanding of the search process
- 🤔 Follow-up questions
- 📱 Mobile friendly interface
- 🚀 Fast and easy to deploy with Docker Compose
- 🌐 Web interface, allowing for easy access from any device
- 💮 Handcrafted UI with light and dark mode
This project is still in its very early days. Expect some bugs.
Please see infra for the most up-to-date requirements.
- A running Ollama server, reachable from the container
- GPU is not needed, but recommended
- 🔴 Make sure that Ollama is not listening only on localhost but on all interfaces (or at least on the Docker network); a sketch of how to do this follows this list. You don't have to change anything if you're running Ollama inside Docker.
- Docker Compose
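A minimal sketch of exposing Ollama on all interfaces when it runs natively on the host, using Ollama's `OLLAMA_HOST` environment variable:

```bash
# Make Ollama listen on all interfaces instead of only localhost.
# For a systemd install, set OLLAMA_HOST in the service environment
# (e.g. via `systemctl edit ollama.service`) and restart the service instead.
OLLAMA_HOST=0.0.0.0 ollama serve
```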
Recommended if you don't intend to develop on this project.
```bash
git clone https://github.com/nilsherzig/LLocalSearch.git
cd ./LLocalSearch
# 🔴 check the env vars inside the compose file (and the `env-example` file) and change them if needed
docker-compose up
```
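As a hypothetical illustration (the actual variable names are defined in the compose file and `env-example`, so verify there), the setting you will most likely need to touch is the address the backend uses to reach your Ollama server:

```bash
# Hypothetical example; the real variable name may differ, so check the
# compose file and `env-example`. This points the backend at an Ollama
# server running on the Docker host.
OLLAMA_HOST=http://host.docker.internal:11434
```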
🎉 You should now be able to open the web interface on http://localhost:3000. Nothing else is exposed by default.
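A quick sanity check that the frontend is reachable (assuming the default port mapping):

```bash
# should print HTTP response headers if the frontend is up
curl -I http://localhost:3000
```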
Newer features, but potentially less stable.
```bash
git clone https://github.com/nilsherzig/LLocalSearch.git
cd ./LLocalSearch
# 1. make sure to check the env vars inside `docker-compose.dev.yaml`
# 2. make sure you've really checked the dev compose file, not the normal one
# 3. build the containers and start the services
make dev
# both frontend and backend will hot-reload on code changes
```
If you don't have `make` installed, you can run the commands inside the Makefile manually.
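A sketch of the likely equivalent, assuming `make dev` just wraps the dev compose file (check the Makefile for the exact commands):

```bash
# Assumed equivalent of `make dev`; verify against the Makefile.
docker-compose -f docker-compose.dev.yaml up --build
```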
Now you should be able to access the frontend on http://localhost:3000.