Replies: 3 comments 6 replies
-
Thanks for cc'ing me @karthink! I've only just caught up; I've been busy adding multi-LLM support to chatgpt-shell and did indeed re-invent the wheel :/
This sounds great.
Looks like it's moving fast already. Looking forward to seeing it used in practice. I may give it a little time to settle and see how the general user experience develops. For example, I've yet to add things like function calling... which I'm guessing gets obsoleted if MCP catches on? In the case of chatgpt-shell, it took a few iterations to get a comfortable user experience. I did add (and discard) plenty of things. I'm guessing this may have been the case for other folks building Emacs packages. Either way, I'll be following this closely :)
-
Thanks for starting this conversation! Is the idea to use Emacs as an MCP host, or an MCP server? An MCP server could be interesting, but Emacs has too many "capabilities", and it isn't clear to me how best to use it as a server, aside from maybe basic buffer-listing and buffer-content commands. As an MCP host, it's more interesting: Emacs could discover various sources of data and presumably pass them to the LLM. Right now this is similar to what you can do with function calling, but it should be easier with service discovery (instead of having to write your own). This is something I'm interested in, but I don't have any plans to work on it.

I have been thinking of pursuing a similar sort of standardization for context building. I do think that if this happens, and it needs to call out to LLMs, that should be done via the llm library.

At any rate, let's keep this thread alive so we can coordinate and either work together or at least not duplicate work!
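[Editor's note] The "service discovery instead of writing your own" point above maps to MCP's `tools/list` JSON-RPC request, whose result a client could mechanically convert into the function-calling specs most chat APIs already accept. A rough Python sketch of that conversion (method and field names follow the MCP spec; the sample response and the output spec shape are illustrative, not any particular provider's API):

```python
import json

# A JSON-RPC 2.0 request for MCP tool discovery
# (method name per the MCP spec; the id is arbitrary).
def tools_list_request(req_id=1):
    return {"jsonrpc": "2.0", "id": req_id, "method": "tools/list"}

# Convert one MCP tool entry into a generic function-calling spec
# (name/description/parameters is the common shape; exact field
# names vary by LLM provider).
def tool_to_function_spec(tool):
    return {
        "name": tool["name"],
        "description": tool.get("description", ""),
        "parameters": tool.get("inputSchema", {"type": "object"}),
    }

# Example response fragment, shaped like a tools/list result.
response = {
    "jsonrpc": "2.0", "id": 1,
    "result": {"tools": [{
        "name": "read_file",
        "description": "Read a file from disk",
        "inputSchema": {"type": "object",
                        "properties": {"path": {"type": "string"}},
                        "required": ["path"]},
    }]},
}

specs = [tool_to_function_spec(t) for t in response["result"]["tools"]]
print(json.dumps(specs, indent=2))
```

Since MCP tools already carry a JSON Schema in `inputSchema`, a host mostly just relays it; the hand-written per-tool glue that function calling currently requires is what discovery would eliminate.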
-
While I've added some helpers to facilitate manually building context (usually feeding into M-x chatgpt-shell-prompt-compose), this is still an area of friction. I've been tempted to prototype a smart context core of sorts (possibly run as a separate process/daemon), yet it feels like an entirely new/separate project, which I may not have the time for. Maybe if I wait long enough, it gets obsoleted/solved by MCP 😅
-
@ahyatt, I wanted to check if you're working on a Model Context Protocol (MCP) adapter for llm. It was described by the author as "LSP but for LLM assistants". If support for it catches on, it should make adding context to LLM queries easier in a semi-standardized way.
Last time around, we ended up reinventing the wheel with llm/gptel/chatgpt-shell etc. I'm hoping we can write a generic `mcp.el` library this time that has no (or few) dependencies, and use it as appropriate in different Emacs LLM client UIs. Of course, this is only relevant if the open protocol catches on, since it's only supported by Anthropic and a few editors like Zed right now.

I was planning to look into writing something early next year, assuming people continue to write MCP servers. I haven't read the full spec yet, just have a rough idea so far. But if you already have some plans or are writing something like `mcp.el`, I'd be happy to help.

CC'ing @xenodium and @s-kostyaev for thoughts and input.
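[Editor's note] For a sense of how small the core of such a library could be: an MCP client speaks JSON-RPC 2.0, opening with an `initialize` handshake, and the stdio transport frames each message as one line of JSON. A minimal Python sketch (method, `params` field names, and the `protocolVersion` string follow the 2024-11-05 spec revision; the `clientInfo` values are made up for illustration):

```python
import json

# Build the MCP "initialize" request that opens the handshake
# (field names per the MCP spec; clientInfo is hypothetical).
def initialize_request(req_id=0):
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "mcp.el", "version": "0.0.1"},
        },
    }

# The stdio transport sends one JSON-RPC message per newline-terminated
# line, so framing is just serialize-and-append.
def frame(message):
    return json.dumps(message) + "\n"

# What would be written to the server process's stdin.
wire = frame(initialize_request())
```

In Emacs terms, this amounts to a `make-process` subprocess with a filter that splits stdin/stdout on newlines and round-trips JSON, which is why a dependency-free `mcp.el` seems plausible.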