README: Specify no key needed for llama backend
* README.org: Specify that no key is needed when defining a Llama.cpp backend. Fix #170.
parent: febeada960
commit: e67ed41e31

1 changed file with 1 addition and 0 deletions
README.org:

@@ -237,6 +237,7 @@ Register a backend with
   :stream t ;Stream responses
   :protocol "http"
   :host "localhost:8000" ;Llama.cpp server location, typically localhost:8080 for Llamafile
+  :key nil ;No key needed
   :models '("test")) ;Any names, doesn't matter for Llama
 #+end_src
 These are the required parameters, refer to the documentation of =gptel-make-openai= for more.
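For context, the full registration form that this hunk amends would look roughly as follows. This is a sketch built around gptel's =gptel-make-openai= constructor referenced above; the opening line and the backend name "llama-cpp" are not part of the diff and are illustrative.

#+begin_src emacs-lisp
;; Sketch of the complete Llama.cpp backend registration.
;; Only the :key line is what this commit adds; "llama-cpp" is an arbitrary
;; backend name, and :host should point at your running Llama.cpp/Llamafile server.
(gptel-make-openai "llama-cpp"          ;Any name
  :stream t                             ;Stream responses
  :protocol "http"
  :host "localhost:8000"                ;Llama.cpp server location, typically localhost:8080 for Llamafile
  :key nil                              ;No key needed
  :models '("test"))                    ;Any names, doesn't matter for Llama
#+end_src

Once evaluated (for example from your init file), the backend becomes selectable from gptel's menu.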