gptel-anthropic: support Anthropic AI's Claude 3 (#229)
* gptel.el: Mention Anthropic in the package description. * gptel-anthropic.el (gptel-anthropic, gptel-make-anthropic, gptel--parse-response, gptel--request-data, gptel--parse-buffer, gptel-curl--parse-stream): Add support for Anthropic AI's Claude 3 models. * README.org: Add instructions for using Anthropic AI's Claude 3 models.
This commit is contained in:
parent 87c190076e
commit eb088f2f21

3 changed files with 203 additions and 16 deletions

55 README.org
@@ -4,20 +4,21 @@

GPTel is a simple Large Language Model chat client for Emacs, with support for multiple models and backends.

| LLM Backend      | Supports | Requires                  |
|------------------+----------+---------------------------|
| ChatGPT          | ✓        | [[https://platform.openai.com/account/api-keys][API key]] |
| Azure            | ✓        | Deployment and API key    |
| Ollama           | ✓        | [[https://ollama.ai/][Ollama running locally]] |
| GPT4All          | ✓        | [[https://gpt4all.io/index.html][GPT4All running locally]] |
| Gemini           | ✓        | [[https://makersuite.google.com/app/apikey][API key]] |
| Llama.cpp        | ✓        | [[https://github.com/ggerganov/llama.cpp/tree/master/examples/server#quick-start][Llama.cpp running locally]] |
| Llamafile        | ✓        | [[https://github.com/Mozilla-Ocho/llamafile#quickstart][Local Llamafile server]] |
| Kagi FastGPT     | ✓        | [[https://kagi.com/settings?p=api][API key]] |
| Kagi Summarizer  | ✓        | [[https://kagi.com/settings?p=api][API key]] |
| together.ai      | ✓        | [[https://api.together.xyz/settings/api-keys][API key]] |
| Anyscale         | ✓        | [[https://docs.endpoints.anyscale.com/][API key]] |
| Perplexity       | ✓        | [[https://docs.perplexity.ai/docs/getting-started][API key]] |

| LLM Backend        | Supports | Requires                  |
|--------------------+----------+---------------------------|
| ChatGPT            | ✓        | [[https://platform.openai.com/account/api-keys][API key]] |
| Azure              | ✓        | Deployment and API key    |
| Ollama             | ✓        | [[https://ollama.ai/][Ollama running locally]] |
| GPT4All            | ✓        | [[https://gpt4all.io/index.html][GPT4All running locally]] |
| Gemini             | ✓        | [[https://makersuite.google.com/app/apikey][API key]] |
| Llama.cpp          | ✓        | [[https://github.com/ggerganov/llama.cpp/tree/master/examples/server#quick-start][Llama.cpp running locally]] |
| Llamafile          | ✓        | [[https://github.com/Mozilla-Ocho/llamafile#quickstart][Local Llamafile server]] |
| Kagi FastGPT       | ✓        | [[https://kagi.com/settings?p=api][API key]] |
| Kagi Summarizer    | ✓        | [[https://kagi.com/settings?p=api][API key]] |
| together.ai        | ✓        | [[https://api.together.xyz/settings/api-keys][API key]] |
| Anyscale           | ✓        | [[https://docs.endpoints.anyscale.com/][API key]] |
| Perplexity         | ✓        | [[https://docs.perplexity.ai/docs/getting-started][API key]] |
| Anthropic (Claude) | ✓        | [[https://www.anthropic.com/api][API key]] |

*General usage*: ([[https://www.youtube.com/watch?v=bsRnh_brggM][YouTube Demo]])
@@ -57,6 +58,7 @@ GPTel uses Curl if available, but falls back to url-retrieve to work without ext

- [[#togetherai][together.ai]]
- [[#anyscale][Anyscale]]
- [[#perplexity][Perplexity]]
- [[#anthropic-claude][Anthropic (Claude)]]
- [[#usage][Usage]]
- [[#in-any-buffer][In any buffer:]]
- [[#in-a-dedicated-chat-buffer][In a dedicated chat buffer:]]
@@ -438,6 +440,31 @@ The above code makes the backend available to select. If you want it to be the

                       "pplx-70b-online")))
#+end_src

#+html: </details>
#+html: <details><summary>
**** Anthropic (Claude)
#+html: </summary>

Register a backend with
#+begin_src emacs-lisp
(gptel-make-anthropic "Claude"          ;Any name you want
  :stream t                             ;Streaming responses
  :key "your-api-key")
#+end_src
The =:key= can be a function that returns the key (more secure).
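For example, a sketch that looks the key up via Emacs's auth-source library instead of hard-coding it (this assumes your gptel version provides the =gptel-api-key-from-auth-source= helper and that =~/.authinfo= has a matching entry):
#+begin_src emacs-lisp
;; Assumes a line like the following in ~/.authinfo:
;;   machine api.anthropic.com login apikey password <your-api-key>
(gptel-make-anthropic "Claude"
  :stream t
  :key (lambda () (gptel-api-key-from-auth-source "api.anthropic.com")))
#+end_src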

You can pick this backend from the menu when using gptel (see [[#usage][Usage]]).

***** (Optional) Set as the default gptel backend

The above code makes the backend available to select. If you want it to be the default backend for gptel, you can set this as the default value of =gptel-backend=. Use this instead of the above.
#+begin_src emacs-lisp
;; OPTIONAL configuration
(setq-default
 gptel-model "claude-3-sonnet-20240229" ; "claude-3-opus-20240229" also available
 gptel-backend (gptel-make-anthropic "Claude"
                 :stream t :key "your-api-key"))
#+end_src

#+html: </details>

** Usage
156 gptel-anthropic.el Normal file

@@ -0,0 +1,156 @@
;;; gptel-anthropic.el --- Anthropic AI support for gptel  -*- lexical-binding: t; -*-

;; Copyright (C) 2023  Karthik Chikmagalur

;; Author: Karthik Chikmagalur <karthikchikmagalur@gmail.com>

;; This program is free software; you can redistribute it and/or modify
;; it under the terms of the GNU General Public License as published by
;; the Free Software Foundation, either version 3 of the License, or
;; (at your option) any later version.

;; This program is distributed in the hope that it will be useful,
;; but WITHOUT ANY WARRANTY; without even the implied warranty of
;; MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
;; GNU General Public License for more details.

;; You should have received a copy of the GNU General Public License
;; along with this program.  If not, see <https://www.gnu.org/licenses/>.

;;; Commentary:

;; This file adds support for Anthropic's Messages API to gptel.

;;; Code:
(require 'cl-generic)
(eval-when-compile
  (require 'cl-lib))
(require 'map)
(require 'gptel)

(defvar json-object-type)

(declare-function prop-match-value "text-property-search")
(declare-function text-property-search-backward "text-property-search")
(declare-function json-read "json")

;;; Anthropic (Messages API)
(cl-defstruct (gptel-anthropic (:constructor gptel--make-anthropic)
                               (:copier nil)
                               (:include gptel-backend)))

(cl-defmethod gptel-curl--parse-stream ((_backend gptel-anthropic) _info)
  (let* ((json-object-type 'plist)
         (content-strs))
    (condition-case nil
        (while (re-search-forward "^event: " nil t)
          (cond
           ((looking-at "content_block_\\(?:start\\|delta\\|stop\\)")
            (save-match-data
              (forward-line 1) (forward-char 5)
              (when-let* ((response (json-read))
                          (content (map-nested-elt
                                    response '(:delta :text))))
                (push content content-strs))))))
      (error
       (goto-char (match-beginning 0))))
    (apply #'concat (nreverse content-strs))))
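
;; For reference, the parser above consumes Anthropic's server-sent
;; events, which look roughly like this (abbreviated sketch, not
;; verbatim API output):
;;
;;   event: content_block_delta
;;   data: {"type":"content_block_delta","index":0,
;;          "delta":{"type":"text_delta","text":"Hello"}}
;;
;; Each matching "data:" payload is parsed with `json-read', and only
;; the (:delta :text) field is accumulated into the returned string.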

(cl-defmethod gptel--parse-response ((_backend gptel-anthropic) response _info)
  ;; Log the raw response only when a log buffer exists: `get-buffer'
  ;; returns nil if "*gptel-log*" has not been created, and `princ'
  ;; needs the buffer as an explicit output stream.
  (when-let ((log-buffer (get-buffer "*gptel-log*")))
    (princ response log-buffer))
  (map-nested-elt response '(:content 0 :text)))

(cl-defmethod gptel--request-data ((_backend gptel-anthropic) prompts)
  "JSON encode PROMPTS for sending to Anthropic."
  (let ((prompts-plist
         `(:model ,gptel-model
           :messages [,@prompts]
           :system ,gptel--system-message
           :stream ,(or (and gptel-stream gptel-use-curl
                             (gptel-backend-stream gptel-backend))
                        :json-false)
           :max_tokens ,(or gptel-max-tokens 1024))))
    (when gptel-temperature
      (plist-put prompts-plist :temperature gptel-temperature))
    prompts-plist))
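
;; For illustration: with streaming enabled and a single user prompt,
;; the method above yields a plist along these lines (the exact model
;; and system message depend on your configuration):
;;
;;   (:model "claude-3-sonnet-20240229"
;;    :messages [(:role "user" :content "Hello")]
;;    :system "You are a large language model..."  ; gptel--system-message
;;    :stream t
;;    :max_tokens 1024)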

(cl-defmethod gptel--parse-buffer ((_backend gptel-anthropic) &optional max-entries)
  (let ((prompts) (prop))
    (while (and
            (or (not max-entries) (>= max-entries 0))
            (setq prop (text-property-search-backward
                        'gptel 'response
                        (when (get-char-property (max (point-min) (1- (point)))
                                                 'gptel)
                          t))))
      (push (list :role (if (prop-match-value prop) "assistant" "user")
                  :content
                  (string-trim
                   (buffer-substring-no-properties (prop-match-beginning prop)
                                                   (prop-match-end prop))
                   (format "[\t\r\n ]*\\(?:%s\\)?[\t\r\n ]*"
                           (regexp-quote (gptel-prompt-prefix-string)))
                   (format "[\t\r\n ]*\\(?:%s\\)?[\t\r\n ]*"
                           (regexp-quote (gptel-response-prefix-string)))))
            prompts)
      (and max-entries (cl-decf max-entries)))
    prompts))

;;;###autoload
(cl-defun gptel-make-anthropic
    (name &key curl-args stream key
          (header
           (lambda () (when-let (key (gptel--get-api-key))
                        `(("x-api-key" . ,key)
                          ("anthropic-version" . "2023-06-01")))))
          (models '("claude-3-sonnet-20240229" "claude-3-opus-20240229"))
          (host "api.anthropic.com")
          (protocol "https")
          (endpoint "/v1/messages"))
  "Register an Anthropic API-compatible backend for gptel with NAME.

Keyword arguments:

CURL-ARGS (optional) is a list of additional Curl arguments.

HOST (optional) is the API host, \"api.anthropic.com\" by default.

MODELS is a list of available model names.

STREAM is a boolean to toggle streaming responses, defaults to
false.

PROTOCOL (optional) specifies the protocol, https by default.

ENDPOINT (optional) is the API endpoint for completions, defaults to
\"/v1/messages\".

HEADER (optional) is for additional headers to send with each
request.  It should be an alist or a function that returns an
alist, like:
\((\"Content-Type\" . \"application/json\"))

KEY is a variable whose value is the API key, or function that
returns the key."
  (declare (indent 1))
  (let ((backend (gptel--make-anthropic
                  :curl-args curl-args
                  :name name
                  :host host
                  :header header
                  :key key
                  :models models
                  :protocol protocol
                  :endpoint endpoint
                  :stream stream
                  :url (if protocol
                           (concat protocol "://" host endpoint)
                         (concat host endpoint)))))
    (prog1 backend
      (setf (alist-get name gptel--known-backends
                       nil nil #'equal)
            backend))))

(provide 'gptel-anthropic)
;;; gptel-anthropic.el ends here
8 gptel.el
@@ -30,7 +30,9 @@

;; gptel is a simple Large Language Model chat client, with support for multiple models/backends.
;;
;; gptel supports
;; - The services ChatGPT, Azure, Gemini, Anyscale, Together.ai and Kagi (FastGPT & Summarizer)
;;
;; - The services ChatGPT, Azure, Gemini, Anthropic AI, Anyscale, Together.ai,
;;   Perplexity, and Kagi (FastGPT & Summarizer)
;; - Local models via Ollama, Llama.cpp, Llamafiles or GPT4All
;;
;; Additionally, any LLM service (local or remote) that provides an

@@ -54,7 +56,9 @@

;;
;; - For Azure: define a gptel-backend with `gptel-make-azure', which see.
;; - For Gemini: define a gptel-backend with `gptel-make-gemini', which see.
;; - For Kagi: define a gptel-backend with `gptel-make-kagi', which see
;; - For Anthropic (Claude): define a gptel-backend with `gptel-make-anthropic',
;;   which see
;; - For Kagi: define a gptel-backend with `gptel-make-kagi', which see.
;;
;; For local models using Ollama, Llama.cpp or GPT4All:
;;