gptel-kagi: Add support for the Kagi summarizer

* gptel-kagi.el (gptel--request-data, gptel--parse-buffer,
gptel-make-kagi): Add support for the Kagi summarizer.  If there
is a URL at point (or at the end of the provided prompt), it is
used as the summarizer input.  Otherwise the behavior is
unchanged.

* README (Kagi): Mention summarizer support.

* gptel.el: Mention summarizer support.
Author: Karthik Chikmagalur
Date:   2024-01-15 16:09:41 -08:00
Commit: 1752f1d589 (parent c6a07043af)
3 changed files with 76 additions and 45 deletions

--- a/README
+++ b/README

@@ -5,7 +5,7 @@
 GPTel is a simple Large Language Model chat client for Emacs, with support for multiple models and backends.
 | LLM Backend | Supports | Requires |
-|--------------+----------+---------------------------|
+|-----------------+----------+---------------------------|
 | ChatGPT | ✓ | [[https://platform.openai.com/account/api-keys][API key]] |
 | Azure | ✓ | Deployment and API key |
 | Ollama | ✓ | [[https://ollama.ai/][Ollama running locally]] |
@@ -14,6 +14,7 @@ GPTel is a simple Large Language Model chat client for Emacs, with support for m
 | Llama.cpp | ✓ | [[https://github.com/ggerganov/llama.cpp/tree/master/examples/server#quick-start][Llama.cpp running locally]] |
 | Llamafile | ✓ | [[https://github.com/Mozilla-Ocho/llamafile#quickstart][Local Llamafile server]] |
 | Kagi FastGPT | ✓ | [[https://kagi.com/settings?p=api][API key]] |
+| Kagi Summarizer | ✓ | [[https://kagi.com/settings?p=api][API key]] |
 | PrivateGPT | Planned | - |
 *General usage*: ([[https://www.youtube.com/watch?v=bsRnh_brggM][YouTube Demo]])
@@ -49,7 +50,7 @@ GPTel uses Curl if available, but falls back to url-retrieve to work without ext
 - [[#ollama][Ollama]]
 - [[#gemini][Gemini]]
 - [[#llamacpp-or-llamafile][Llama.cpp or Llamafile]]
-- [[#kagi-fastgpt][Kagi FastGPT]]
+- [[#kagi-fastgpt--summarizer][Kagi (FastGPT & Summarizer)]]
 - [[#usage][Usage]]
 - [[#in-any-buffer][In any buffer:]]
 - [[#in-a-dedicated-chat-buffer][In a dedicated chat buffer:]]
@@ -252,28 +253,33 @@ You can pick this backend from the menu when using gptel (see [[#usage][Usage]])
 #+html: </details>
 #+html: <details><summary>
-**** Kagi FastGPT
+**** Kagi (FastGPT & Summarizer)
 #+html: </summary>
-*NOTE*: Kagi's FastGPT model does not support multi-turn conversations, interactions are "one-shot". It also does not support streaming responses.
+Kagi's FastGPT model and the Universal Summarizer are both supported. A couple of notes:
+1. Universal Summarizer: If there is a URL at point, the summarizer will summarize the contents of the URL. Otherwise the context sent to the model is the same as always: the buffer text up to point, or the contents of the region if the region is active.
+2. Kagi models do not support multi-turn conversations; interactions are "one-shot". They also do not support streaming responses.
 Register a backend with
 #+begin_src emacs-lisp
-;; :key can be a function that returns the API key
 (gptel-make-kagi
- "Kagi"                    ;Name of your choice
- :key "YOUR_KAGI_API_KEY")
+ "Kagi"                    ;any name
+ :key "YOUR_KAGI_API_KEY") ;:key can be a function
 #+end_src
 These are the required parameters, refer to the documentation of =gptel-make-kagi= for more.
-You can pick this backend from the transient menu when using gptel (see Usage), or set this as the default value of =gptel-backend=:
+You can pick this backend and the model (fastgpt/summarizer) from the transient menu when using gptel. Alternatively you can set this as the default value of =gptel-backend=:
 #+begin_src emacs-lisp
 ;; OPTIONAL configuration
-(setq-default gptel-model "fastgpt" ;only supported Kagi model
+(setq-default gptel-model "fastgpt"
               gptel-backend (gptel-make-kagi "Kagi" :key ...))
 #+end_src
+The alternatives to =fastgpt= include =summarize:cecil=, =summarize:agnes=, =summarize:daphne= and =summarize:muriel=. The difference between the summarizer engines is [[https://help.kagi.com/kagi/api/summarizer.html#summarization-engines][documented here]].
 #+html: </details>
 ** Usage
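Putting the README's configuration together, a minimal summarizer setup might look like the sketch below. This is illustrative only: the backend name "Kagi" is arbitrary, and the key placeholder must be replaced with a real API key (or a function returning one).

#+begin_src emacs-lisp
;; Minimal sketch: register Kagi once and default to a summarizer engine.
;; "Kagi" is an arbitrary backend name; "YOUR_KAGI_API_KEY" is a placeholder.
(setq-default gptel-backend (gptel-make-kagi "Kagi" :key "YOUR_KAGI_API_KEY")
              ;; any of "fastgpt", "summarize:cecil", "summarize:agnes",
              ;; "summarize:daphne" or "summarize:muriel"
              gptel-model "summarize:cecil")
#+end_src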

--- a/gptel-kagi.el
+++ b/gptel-kagi.el

@@ -69,42 +69,65 @@
 (cl-defmethod gptel--request-data ((_backend gptel-kagi) prompts)
   "JSON encode PROMPTS for sending to ChatGPT."
-  `(,@prompts :web_search t :cache t))
+  (pcase-exhaustive gptel-model
+    ("fastgpt"
+     `(,@prompts :web_search t :cache t))
+    ((and model (guard (string-prefix-p "summarize" model)))
+     `(,@prompts :engine ,(substring model 10)))))
 
 (cl-defmethod gptel--parse-buffer ((_backend gptel-kagi) &optional _max-entries)
-  (let ((prompts)
+  (let ((url (or (thing-at-point 'url)
+                 (get-text-property (point) 'shr-url)
+                 (get-text-property (point) 'image-url)))
+        ;; (filename (thing-at-point 'existing-filename)) ;no file upload support yet
         (prop (text-property-search-backward
                'gptel 'response
                (when (get-char-property (max (point-min) (1- (point)))
                                         'gptel)
                  t))))
+    (if (and url (string-prefix-p "summarize" gptel-model))
+        (list :url url)
       (if (and (prop-match-p prop)
                (prop-match-value prop))
           (user-error "No user prompt found!")
-        (setq prompts (list
-                       :query
-                       (if (prop-match-p prop)
-                           (concat
-                            ;; Fake a system message by including it in the prompt
-                            gptel--system-message "\n\n"
-                            (string-trim
-                             (buffer-substring-no-properties (prop-match-beginning prop)
-                                                             (prop-match-end prop))
-                             (format "[\t\r\n ]*\\(?:%s\\)?[\t\r\n ]*"
-                                     (regexp-quote (gptel-prompt-prefix-string)))
-                             (format "[\t\r\n ]*\\(?:%s\\)?[\t\r\n ]*"
-                                     (regexp-quote (gptel-response-prefix-string)))))
-                         "")))
-      prompts)))
+        (let ((prompts
+               (string-trim
+                (buffer-substring-no-properties (prop-match-beginning prop)
+                                                (prop-match-end prop))
+                (format "[\t\r\n ]*\\(?:%s\\)?[\t\r\n ]*"
+                        (regexp-quote (gptel-prompt-prefix-string)))
+                (format "[\t\r\n ]*\\(?:%s\\)?[\t\r\n ]*"
+                        (regexp-quote (gptel-response-prefix-string))))))
+          (pcase-exhaustive gptel-model
+            ("fastgpt"
+             (setq prompts (list
+                            :query
+                            (if (prop-match-p prop)
+                                (concat
+                                 ;; Fake a system message by including it in the prompt
+                                 gptel--system-message "\n\n" prompts)
+                              ""))))
+            ((and model (guard (string-prefix-p "summarize" model)))
+             ;; If the entire contents of the prompt looks like a url, send the url
+             ;; Else send the text of the region
+             (setq prompts
+                   (if-let (((prop-match-p prop))
+                            (engine (substring model 10)))
+                       ;; It's a region of text
+                       (list :text prompts)
+                     ""))))
+          prompts)))))
 
 ;;;###autoload
 (cl-defun gptel-make-kagi
     (name &key stream key
           (host "kagi.com")
           (header (lambda () `(("Authorization" . ,(concat "Bot " (gptel--get-api-key))))))
-          (models '("fastgpt"))
+          (models '("fastgpt"
+                    "summarize:cecil" "summarize:agnes"
+                    "summarize:daphne" "summarize:muriel"))
          (protocol "https")
-          (endpoint "/api/v0/fastgpt"))
+          (endpoint "/api/v0/"))
   "Register a Kagi FastGPT backend for gptel with NAME.
 
 Keyword arguments:
@@ -142,9 +165,11 @@ Example:
    :models models
    :protocol protocol
    :endpoint endpoint
-   :url (if protocol
-            (concat protocol "://" host endpoint)
-          (concat host endpoint)))))
+   :url
+   (lambda ()
+     (concat protocol "://" host endpoint
+             (if (equal gptel-model "fastgpt")
+                 "fastgpt" "summarize"))))))
 (prog1 backend
   (setf (alist-get name gptel--known-backends
                    nil nil #'equal)

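The net effect of the hunks above can be read off as the request each model family now produces. The sketch below is illustrative commentary, not part of the commit; the URLs assume the default =host= and =endpoint=, and the payload values are made up.

#+begin_src emacs-lisp
;; Illustrative request shapes after this change (values are placeholders):
;;
;; gptel-model "fastgpt" posts to https://kagi.com/api/v0/fastgpt with
;;   (:query "...prompt text..." :web_search t :cache t)
;;
;; gptel-model "summarize:cecil" posts to https://kagi.com/api/v0/summarize with
;;   (:url "https://example.com/post" :engine "cecil")      ;URL at point
;; or, when no URL is found,
;;   (:text "...buffer or region text..." :engine "cecil")
#+end_src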
--- a/gptel.el
+++ b/gptel.el

@@ -30,7 +30,7 @@
 ;; gptel is a simple Large Language Model chat client, with support for multiple models/backends.
 ;;
 ;; gptel supports
-;; - The services ChatGPT, Azure, Gemini, and Kagi (FastGPT)
+;; - The services ChatGPT, Azure, Gemini, and Kagi (FastGPT & Summarizer)
 ;; - Local models via Ollama, Llama.cpp, Llamafiles or GPT4All
 ;;
 ;; Additionally, any LLM service (local or remote) that provides an