README: Specify no key needed for Llama backend

* README.org: Specify that no key is needed when defining a
Llama.cpp backend.  Fix #170.
Author: Karthik Chikmagalur
Date: 2024-01-04 17:25:35 -08:00
parent febeada960
commit e67ed41e31


@@ -237,6 +237,7 @@ Register a backend with
   :stream t ;Stream responses
   :protocol "http"
   :host "localhost:8000" ;Llama.cpp server location, typically localhost:8080 for Llamafile
+  :key nil ;No key needed
   :models '("test")) ;Any names, doesn't matter for Llama
 #+end_src
 These are the required parameters; refer to the documentation of =gptel-make-openai= for more.
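For context, the parameters shown in the hunk can be assembled into a complete backend definition along these lines (a sketch based on the diff above; the backend name ="Llama.cpp"= is illustrative, and the host/port should match your running server):

#+begin_src emacs-lisp
;; Sketch: register a local Llama.cpp server as an OpenAI-compatible
;; backend. The name "Llama.cpp" is any label of your choosing.
(gptel-make-openai "Llama.cpp"
  :stream t                ;Stream responses
  :protocol "http"
  :host "localhost:8000"   ;Llama.cpp server location
  :key nil                 ;No key needed
  :models '("test"))       ;Any names, doesn't matter for Llama
#+end_src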