Update README.md
@@ -19,6 +19,8 @@ The goal is:
Note: Be warned! This was hacked together in a weekend, and it's just an experiment to see what can be done with local LLMs.

## Demo
Search on the internet (interactive mode)
@@ -178,4 +180,4 @@ docker-compose run -v main.py:/app/main.py -i --rm localagi
- With superhot models it loses its magic, but it may still be suitable for search
- Context size is your enemy. `--postprocess` sometimes helps, but not always (see the sketch after this list)
- It can be silly!
- It is slow on CPU: don't expect `7b` models to perform well, and while `13b` models perform better, they are still quite slow on CPU.
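
A minimal sketch of how one might combine the docker-compose invocation shown above with the `--postprocess` flag mentioned in the notes. This assumes the container entrypoint forwards trailing arguments to `main.py`, which is not verified here:

```bash
# Hypothetical invocation: the docker-compose command comes from the README,
# and --postprocess is appended on the assumption that extra arguments are
# passed through to main.py inside the container.
docker-compose run -v main.py:/app/main.py -i --rm localagi --postprocess
```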