README: Update with instructions for perplexity.ai (#204)
README.org: perplexity.ai provides an OpenAI-style API. Also remove some extra whitespace.
This commit is contained in:
parent af5444a2ea
commit d8c604b53b
1 changed file with 32 additions and 3 deletions
README.org | 29
@@ -17,6 +17,7 @@ GPTel is a simple Large Language Model chat client for Emacs, with support for m
 | Kagi Summarizer | ✓ | [[https://kagi.com/settings?p=api][API key]] |
 | together.ai     | ✓ | [[https://api.together.xyz/settings/api-keys][API key]] |
 | Anyscale        | ✓ | [[https://docs.endpoints.anyscale.com/][API key]] |
+| Perplexity      | ✓ | [[https://docs.perplexity.ai/docs/getting-started][API key]] |

 *General usage*: ([[https://www.youtube.com/watch?v=bsRnh_brggM][YouTube Demo]])

@@ -55,6 +56,7 @@ GPTel uses Curl if available, but falls back to url-retrieve to work without ext
 - [[#kagi-fastgpt--summarizer][Kagi (FastGPT & Summarizer)]]
 - [[#togetherai][together.ai]]
 - [[#anyscale][Anyscale]]
+- [[#perplexity][Perplexity]]
 - [[#usage][Usage]]
 - [[#in-any-buffer][In any buffer:]]
 - [[#in-a-dedicated-chat-buffer][In a dedicated chat buffer:]]
@@ -329,6 +331,33 @@ You can pick this backend from the menu when using gptel (see [[#usage][Usage]])
               gptel-model "mistralai/Mixtral-8x7B-Instruct-v0.1")
 #+end_src

+#+html: </details>
+#+html: <details><summary>
+**** Perplexity
+#+html: </summary>
+
+Register a backend with
+#+begin_src emacs-lisp
+;; Perplexity offers an OpenAI compatible API
+(gptel-make-openai "Perplexity"         ;Any name you want
+  :host "api.perplexity.ai"
+  :key "your-api-key"                   ;can be a function that returns the key
+  :endpoint "/chat/completions"
+  :stream t
+  :models '(;; has many more, check perplexity.ai
+            "pplx-7b-chat"
+            "pplx-70b-chat"
+            "pplx-7b-online"
+            "pplx-70b-online"))
+#+end_src
+
+You can pick this backend from the menu when using gptel (see [[#usage][Usage]]), or set this as the default value of =gptel-backend=:
+
+#+begin_src emacs-lisp
+;; OPTIONAL configuration
+(setq-default gptel-backend (gptel-make-openai "Perplexity" ...)
+              gptel-model "pplx-7b-chat")
+#+end_src
+
 #+html: </details>

 ** Usage
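The added snippet notes that =:key= "can be a function that returns the key". As one illustration of that, the hypothetical sketch below reads the key from =~/.authinfo= via Emacs's built-in auth-source library instead of hard-coding it; the machine name used for the lookup is an assumption, not something this commit specifies.

#+begin_src emacs-lisp
;; Hypothetical sketch: supply the API key lazily via a function.
;; Assumes ~/.authinfo contains a line like:
;;   machine api.perplexity.ai password <your-key>
(require 'auth-source)

(gptel-make-openai "Perplexity"
  :host "api.perplexity.ai"
  :endpoint "/chat/completions"
  :stream t
  ;; Called at request time, so the key never lives in your init file
  :key (lambda ()
         (auth-source-pick-first-password :host "api.perplexity.ai"))
  :models '("pplx-7b-chat" "pplx-70b-chat"))
#+end_src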