From e67ed41e31086d384f211671bb6061be3f8325e0 Mon Sep 17 00:00:00 2001
From: Karthik Chikmagalur
Date: Thu, 4 Jan 2024 17:25:35 -0800
Subject: [PATCH] README: Specify: no key needed for llama backend

* README.org: Specify that no key is needed when defining a
  Llama.cpp backend.  Fix #170.

---
 README.org | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.org b/README.org
index 636063d..2d34371 100644
--- a/README.org
+++ b/README.org
@@ -237,6 +237,7 @@ Register a backend with
   :stream t                              ;Stream responses
   :protocol "http"
   :host "localhost:8000"                 ;Llama.cpp server location, typically localhost:8080 for Llamafile
+  :key nil                               ;No key needed
   :models '("test"))                     ;Any names, doesn't matter for Llama
 #+end_src
 These are the required parameters, refer to the documentation of =gptel-make-openai= for more.
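
For context (not part of the patch): after this change is applied, the full Llama.cpp backend registration in README.org would read roughly as the sketch below. The backend name "llama-cpp" is an assumption for illustration and does not appear in the hunk; the call to =gptel-make-openai= is inferred from the hunk's closing line, which refers readers to its documentation.

#+begin_src emacs-lisp
;; Sketch of the complete backend definition after this patch.
;; "llama-cpp" is an illustrative name; any string works for Llama.cpp.
(gptel-make-openai "llama-cpp"           ;Assumed backend name, not shown in the hunk
  :stream t                              ;Stream responses
  :protocol "http"
  :host "localhost:8000"                 ;Llama.cpp server location, typically localhost:8080 for Llamafile
  :key nil                               ;No key needed
  :models '("test"))                     ;Any names, doesn't matter for Llama
#+end_src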