README: Add support for Groq (#257)

* README.org: Mention details for configuring Groq.
Karthik Chikmagalur 2024-03-16 20:27:20 -07:00
parent 94b13e78ec
commit e3b3591d73

@@ -19,6 +19,7 @@ GPTel is a simple Large Language Model chat client for Emacs, with support for m
| Anyscale | ✓ | [[https://docs.endpoints.anyscale.com/][API key]] |
| Perplexity | ✓ | [[https://docs.perplexity.ai/docs/getting-started][API key]] |
| Anthropic (Claude) | ✓ | [[https://www.anthropic.com/api][API key]] |
| Groq | ✓ | [[https://console.groq.com/keys][API key]] |
*General usage*: ([[https://www.youtube.com/watch?v=bsRnh_brggM][YouTube Demo]])
@@ -59,6 +60,7 @@ GPTel uses Curl if available, but falls back to url-retrieve to work without ext
- [[#anyscale][Anyscale]]
- [[#perplexity][Perplexity]]
- [[#anthropic-claude][Anthropic (Claude)]]
- [[#groq][Groq]]
- [[#usage][Usage]]
- [[#in-any-buffer][In any buffer:]]
- [[#in-a-dedicated-chat-buffer][In a dedicated chat buffer:]]
@@ -465,6 +467,44 @@ The above code makes the backend available to select. If you want it to be the
:stream t :key "your-api-key"))
#+end_src
#+html: </details>
#+html: <details><summary>
**** Groq
#+html: </summary>
Register a backend with
#+begin_src emacs-lisp
;; Groq offers an OpenAI compatible API
(gptel-make-openai "Groq" ;Any name you want
:host "api.groq.com"
:endpoint "/openai/v1/chat/completions"
:stream t
:key "your-api-key" ;can be a function that returns the key
:models '("mixtral-8x7b-32768"
"gemma-7b-it"
"llama2-70b-4096"))
#+end_src
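As the comment above notes, =:key= can also be a function of no arguments that returns the key. A minimal sketch, assuming the key is stored in =~/.authinfo= (or =~/.authinfo.gpg=) under the host =api.groq.com=:
#+begin_src emacs-lisp
;; Sketch: fetch the Groq key from auth-source instead of hardcoding it.
;; Assumes an authinfo entry like:
;;   machine api.groq.com login apikey password <your-api-key>
(gptel-make-openai "Groq"
  :host "api.groq.com"
  :endpoint "/openai/v1/chat/completions"
  :stream t
  :key (lambda () (auth-source-pick-first-password :host "api.groq.com"))
  :models '("mixtral-8x7b-32768"
            "gemma-7b-it"
            "llama2-70b-4096"))
#+end_src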
You can pick this backend from the menu when using gptel (see [[#usage][Usage]]). Note that Groq is fast enough that you could easily set =:stream nil= and still get near-instant responses.
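For instance, a non-streaming registration differs only in the =:stream= flag (sketch):
#+begin_src emacs-lisp
;; Sketch: the same registration with streaming disabled;
;; each response arrives in a single chunk.
(gptel-make-openai "Groq"
  :host "api.groq.com"
  :endpoint "/openai/v1/chat/completions"
  :stream nil
  :key "your-api-key"
  :models '("mixtral-8x7b-32768"
            "gemma-7b-it"
            "llama2-70b-4096"))
#+end_src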
***** (Optional) Set as the default gptel backend
The above code makes the backend available to select. If you want it to be the default backend for gptel, you can set this as the default value of =gptel-backend=. Use this instead of the above.
#+begin_src emacs-lisp
;; OPTIONAL configuration
(setq-default
 gptel-model "mixtral-8x7b-32768"
 gptel-backend
 (gptel-make-openai "Groq"
   :host "api.groq.com"
   :endpoint "/openai/v1/chat/completions"
   :stream t
   :key "your-api-key"
   :models '("mixtral-8x7b-32768"
             "gemma-7b-it"
             "llama2-70b-4096")))
#+end_src
#+html: </details>
** Usage