Update README.md

Author: Ettore Di Giacinto
Date: 2023-08-05 22:46:07 +02:00
Committed by: GitHub
Parent: 5b28a34e5b
Commit: 310639318f


@@ -19,6 +19,8 @@ The goal is:
Note: Be warned! This was hacked together over a weekend, and it's just an experiment to see what can be done with local LLMs.
![Screenshot from 2023-08-05 22-40-40](https://github.com/mudler/LocalAGI/assets/2420543/fc9d3c5d-d522-467b-9a84-fea18a78e75f)
## Demo
Search the internet (interactive mode)
@@ -178,4 +180,4 @@ docker-compose run -v main.py:/app/main.py -i --rm localagi
- With superhot models it loses its magic, but it may still be suitable for search
- Context size is your enemy. `--postprocess` sometimes helps, but not always (see the sketch after this list)
- It can be silly!
- It is slow on CPU: don't expect `7b` models to perform well, and while `13b` models perform better, they are still quite slow on CPU.
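
A minimal sketch of how `--postprocess` could be combined with the docker-compose invocation from the hunk above. Whether the container's entrypoint forwards extra arguments to `main.py` is an assumption here; check `main.py --help` for the flags your version actually accepts.

```bash
# Sketch only: assumes extra arguments after the service name are passed
# through to main.py inside the container.
docker-compose run -v main.py:/app/main.py -i --rm localagi --postprocess
```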