README: Mention llm and Ellama

* gptel.el (header): Add email

* README.org (Alternatives): Mention llm and Ellama
Karthik Chikmagalur 2024-05-03 08:41:10 -07:00
parent 69fb2f09f3
commit 8ccdc31b12
2 changed files with 4 additions and 1 deletion

README.org

@@ -796,6 +796,8 @@ Features being considered or in the pipeline:
Other Emacs clients for LLMs include
- [[https://github.com/ahyatt/llm][llm]]: llm provides a uniform API across language model providers for building LLM clients in Emacs, and is intended as a library for use by package authors. For similar scripting purposes, gptel provides the command =gptel-request= (see the sketch after this list).
- [[https://github.com/s-kostyaev/ellama][Ellama]]: A full-fledged LLM client built on llm that supports many LLM providers (Ollama, OpenAI, Vertex, GPT4All and more). Its usage differs from gptel in that it provides separate commands for dozens of common tasks, such as general chat, summarizing code/text, refactoring code, improving grammar, translation and so on.
- [[https://github.com/xenodium/chatgpt-shell][chatgpt-shell]]: Comint-based shell interaction with ChatGPT. Also supports DALL-E, executable code blocks in the responses, and more.
- [[https://github.com/rksm/org-ai][org-ai]]: Interaction through special =#+begin_ai ... #+end_ai= Org-mode blocks. Also supports DALL-E, querying ChatGPT with the contents of project files, and more.
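
A minimal sketch of scripting with =gptel-request=, assuming a gptel backend and API key are already configured; the prompt and the callback's handling of the result are illustrative:

#+begin_src emacs-lisp
;; Sketch: fire off a one-shot request and handle the reply asynchronously.
;; Assumes gptel is installed and a backend/API key is set up.
(gptel-request
 "Summarize the llm and Ellama packages in one sentence each."
 :callback (lambda (response info)
             (if response
                 (message "LLM response: %s" response)
               ;; On failure RESPONSE is nil; INFO carries status details.
               (message "gptel-request failed: %s"
                        (plist-get info :status)))))
#+end_src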

gptel.el

@@ -2,7 +2,7 @@
;; Copyright (C) 2023 Karthik Chikmagalur
;; Author: Karthik Chikmagalur
;; Author: Karthik Chikmagalur <karthik.chikmagalur@gmail.com>
;; Version: 0.8.6
;; Package-Requires: ((emacs "27.1") (transient "0.4.0") (compat "29.1.4.1"))
;; Keywords: convenience
@@ -1106,6 +1106,7 @@ the response is inserted into the current buffer after point."
(encode-coding-string
(gptel--json-encode (plist-get info :data))
'utf-8)))
;; why do these checks not occur inside of `gptel--log'?
(when gptel-log-level ;logging
(when (eq gptel-log-level 'debug)
(gptel--log (gptel--json-encode
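
The checks above key off the user option =gptel-log-level=. A minimal sketch of enabling verbose logging, assuming the documented values nil, =info= and =debug=:

#+begin_src emacs-lisp
;; Sketch: turn on gptel's most verbose logging.  With a non-nil
;; gptel-log-level requests are logged, and with the value `debug'
;; the full JSON payload is logged as well (see the checks above).
;; Per gptel's documentation, output goes to the *gptel-log* buffer.
(setq gptel-log-level 'debug)
#+end_src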