Allow chatting with MemGPT agents from Slack-like chat apps #1480
This should be possible with MemGPT now -- you just have to have the application send messages to a MemGPT agent running on a server, and either replace or modify the front-end. @cpacker recently made a demo of how to connect a MemGPT agent to chat via a tamagotchi front-end, so that might be a helpful reference once it's released (hopefully soon).
I think the question here is: could the messages endpoint implement the OpenAI chat parameters? That would allow all the available chat clients/packages out there to work out of the box.
Here (https://docs.letta.com/introduction#letta-api) it says: Letta API -> Stateful ChatCompletions API: an API designed to be a drop-in replacement for the standard ChatCompletions API. From this statement I understand that currently any Letta agent can be used as a "drop-in replacement" in any of the existing chat clients/frontends which support the ChatCompletions API (including web apps which support ChatCompletions; Slack, Matrix, or other chat bots which support the same; VS Code extensions/assistants which can connect to any OpenAI endpoint; or Emacs clients which support the standard ChatCompletions API format). Is my understanding correct?
@distributev exactly - that's the high-level idea. Basically, many frontends / client applications have been written exclusively to support the ChatCompletions API. That API spec is stateless (meaning there's no memory management, so the client has to pass in the full agent state at each step), unlike Letta's API which is stateful (the server maintains the agent state, and the client only has to pass in a single message). However, it's possible to connect a Letta server to services that expect a stateless API, and still get the power of a stateful API - you basically "pretend" like you're exposing a stateless ChatCompletions endpoint. We're currently working on this feature and hope for it to be released in the next few weeks (possibly sooner).
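To make the "pretend to be stateless" idea concrete, here is a minimal sketch of such a bridge (the class and field names here are hypothetical illustrations, not Letta's actual API): the adapter accepts a full ChatCompletions-style message list, forwards only the newest user message to a stateful agent that keeps its own history, and wraps the reply back into a ChatCompletions-shaped response.

```python
import time
import uuid

class StatefulAgent:
    """Hypothetical stand-in for a server-side stateful agent:
    it keeps its own history, so callers send one message at a time."""

    def __init__(self):
        self.history = []  # server-side memory; the client never resends it

    def step(self, user_message: str) -> str:
        self.history.append({"role": "user", "content": user_message})
        reply = f"echo: {user_message} (turn {len(self.history)})"
        self.history.append({"role": "assistant", "content": reply})
        return reply

def chat_completions_adapter(agent: StatefulAgent, request: dict) -> dict:
    """Accept a stateless ChatCompletions-style request, but forward only
    the latest user message -- the agent already holds prior context."""
    last_user = next(
        m["content"] for m in reversed(request["messages"]) if m["role"] == "user"
    )
    reply = agent.step(last_user)
    # Shape the reply like a ChatCompletions response so stateless
    # clients can consume it unchanged.
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:8]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": reply},
                "finish_reason": "stop",
            }
        ],
    }
```

A real bridge would expose something like this behind an HTTP route that matches the standard `/v1/chat/completions` path; the point is only that the client-supplied history can be ignored, because the stateful server is the source of truth for agent state.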
"hope for it to be released in the next few weeks (possibly sooner)" Are you referring to the LiveKit integration or the ChatCompletions API format standardization itself? |
This issue is stale because it has been open for 30 days with no activity. |
Allow the possibility to chat with MemGPT agents from an open source chat app (Slack-like chat apps).
This would open up a great deal of possibilities, including the option to chat with MemGPT agents on the
go from Android and iPhone mobile phones (these chat apps have clients for Android/iPhone),
or even the possibility to "speak" with the MemGPT agents using the mobile phone.
There are many great open source chat apps like Element (a Matrix client), Mattermost, or RocketChat:
https://github.com/mattermost/mattermost
https://github.com/RocketChat/Rocket.Chat
https://www.matrix.org
https://github.com/matrix-org
Matrix aims to be a common/reference API and implementation across the world of many different chat apps.
Here are some projects which can be used as examples:
https://github.com/matrixgpt/matrix-chatgpt-bot
Talk to ChatGPT via any Matrix client!
https://github.com/alphapapa/ement.el
A Matrix client for GNU Emacs
https://github.com/yGuy/chatgpt-mattermost-bot
A very simple implementation of a service for a mattermost bot that uses ChatGPT
in the backend
https://github.com/attzonko/mmpy_bot
A python-based chatbot for Mattermost (http://www.mattermost.org)
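As a rough illustration of what such a bot integration could look like (the bot handle, payload shape, and helper names below are hypothetical, not taken from any of the projects above): a chat-app bot typically just needs to detect a mention, strip it out, and forward the remaining text to the agent backend.

```python
import re
from typing import Optional

BOT_MENTION = "@memgpt"  # hypothetical bot handle in the chat app

def extract_prompt(post_text: str, mention: str = BOT_MENTION) -> Optional[str]:
    """Return the user's prompt if the post addresses the bot, else None."""
    if mention not in post_text:
        return None
    # Remove the mention wherever it appears and collapse whitespace.
    prompt = re.sub(re.escape(mention), "", post_text)
    return " ".join(prompt.split()) or None

def build_agent_request(prompt: str, channel_id: str) -> dict:
    """Build a hypothetical request body for a server-side MemGPT agent.
    Because the server is stateful, only the new message is needed,
    not the full conversation history."""
    return {
        "agent": channel_id,  # e.g. one agent per channel/room
        "message": {"role": "user", "content": prompt},
    }
```

A Mattermost or Matrix bot built on one of the libraries above would call `extract_prompt` on each incoming event, POST the result of `build_agent_request(...)` to the agent server, and post the agent's reply back into the channel.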