
Allow chatting with MemGPT agents from Slack-like chat apps #1480

distributev opened this issue Jun 26, 2024 · 7 comments

@distributev

Allow chatting with MemGPT agents from an open source chat app (Slack-like chat apps).
This would open up many possibilities, including chatting with MemGPT agents on the go
from Android and iPhone mobile phones (these chat apps have Android/iOS clients),
or even "speaking" with MemGPT agents from a mobile phone.

There are many great open source chat apps, such as Element (a Matrix client), Mattermost, or Rocket.Chat:

https://github.com/mattermost/mattermost
https://github.com/RocketChat/Rocket.Chat

https://www.matrix.org
https://github.com/matrix-org

Matrix aims to serve as a common/reference API and implementation across the many different chat apps out there.

Here are some projects that can be used as examples:

https://github.com/matrixgpt/matrix-chatgpt-bot
Talk to ChatGPT via any Matrix client!

https://github.com/alphapapa/ement.el
A Matrix client for GNU Emacs

https://github.com/yGuy/chatgpt-mattermost-bot
A very simple implementation of a service for a mattermost bot that uses ChatGPT
in the backend

https://github.com/attzonko/mmpy_bot
A python-based chatbot for Mattermost (http://www.mattermost.org)

@distributev
Author

For instance

https://etke.cc/help/bots/chatgpt/

@sarahwooders
Collaborator

This should be possible with MemGPT now -- you just need the application to send messages to a MemGPT agent running on a server, and either replace or modify MemGPT's send_message tool so that it instead sends the reply to the application where you want the message to appear.
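
For example, a minimal sketch of the tool-replacement idea, assuming Mattermost as the chat app and an incoming webhook as the delivery mechanism (the function name mirrors send_message, but the exact custom-tool signature MemGPT expects may differ):

```python
# Hypothetical replacement for MemGPT's send_message tool: instead of
# returning the reply to the local client, post it to a Mattermost channel
# through an incoming webhook. The signature is illustrative, not MemGPT's
# exact custom-tool contract.
import os
import requests

MATTERMOST_WEBHOOK_URL = os.environ["MATTERMOST_WEBHOOK_URL"]  # assumed env var

def send_message(message: str) -> str:
    """Deliver the agent's reply to the connected chat channel."""
    resp = requests.post(MATTERMOST_WEBHOOK_URL, json={"text": message}, timeout=10)
    resp.raise_for_status()
    return "Message delivered to the chat app."
```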

@cpacker recently made a demo for how to connect a MemGPT agent to chat via a tamagotchi front-end, so that might be a helpful reference once it's released (hopefully soon).

@scottmoney

I think the question here is: could the messages endpoint implement the OpenAI chat parameters? That would allow all the available chat clients/packages out there to work out of the box.

@distributev
Author

distributev commented Nov 10, 2024

Here

https://docs.letta.com/introduction#letta-api

it says

Letta API -> Stateful ChatCompletions API: an API designed to be a drop-in replacement for the standard ChatCompletions API

From this statement I understand that any Letta agent can currently be used as a "drop-in replacement" in any existing chat client/frontend that supports the ChatCompletions API (including web apps that support ChatCompletions; Slack, Matrix, or other chat bots that support the same; VS Code extensions/assistants that can connect to any OpenAI endpoint; or Emacs clients that support the standard ChatCompletions API format).

Is my understanding correct?
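
Concretely, what I have in mind is something like the following (just a sketch of my understanding, not a confirmed Letta API; the base URL, auth handling, and agent-name-as-model are my assumptions):

```python
# Illustrative only: assumes the Letta server exposes an OpenAI-compatible
# ChatCompletions endpoint, so existing clients only need a different base URL.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8283/v1",  # hypothetical Letta server address
    api_key="placeholder",                # auth depends on the server setup
)

reply = client.chat.completions.create(
    model="my-letta-agent",  # hypothetical: an agent selected in place of a model name
    messages=[{"role": "user", "content": "Hello from a standard chat client"}],
)
print(reply.choices[0].message.content)
```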

@cpacker
Collaborator

cpacker commented Nov 10, 2024

@distributev exactly - that's the high-level idea. Basically, many frontends / client applications have been written exclusively to support the /v1/chat/completions API (e.g. voice services like VAPI or LiveKit).

That API spec is stateless (meaning there's no memory management, so the client has to pass in the full agent state at each step), unlike Letta's API which is stateful (the server maintains the agent state, and the client only has to pass in a single message).

However, it's possible to connect a Letta server to services that expect a stateless API, and still get the power of a stateful API - you basically "pretend" like you're exposing a /v1/chat/completions API but under the hood make the API stateful. This allows you to connect something like LiveKit to a Letta server to communicate with a stateful Letta agent using voice.
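
Roughly, the shape of such a facade (just an illustrative sketch, not the actual Letta implementation; forward_to_agent is a hypothetical stand-in for a call to a stateful agent):

```python
# Sketch of a "pretend stateless" endpoint: it accepts requests shaped like
# /v1/chat/completions, but only forwards the newest user message to a
# stateful agent, which keeps the conversation history server-side.
import time
import uuid

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    model: str
    messages: list[dict]

def forward_to_agent(agent_id: str, user_message: str) -> str:
    """Hypothetical stand-in for sending one message to a stateful agent."""
    return f"(agent {agent_id} would reply to: {user_message})"

@app.post("/v1/chat/completions")
def chat_completions(req: ChatRequest):
    # The server owns the history, so only the latest user turn is needed.
    last_user = next(m["content"] for m in reversed(req.messages) if m["role"] == "user")
    answer = forward_to_agent(req.model, last_user)
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": req.model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": answer},
            "finish_reason": "stop",
        }],
    }
```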

We're currently working on this feature and hope for it to be released in the next few weeks (possibly sooner).

@distributev
Author

"hope for it to be released in the next few weeks (possibly sooner)"

Are you referring to the LiveKit integration or the ChatCompletions API format standardization itself?


This issue is stale because it has been open for 30 days with no activity.

@github-actions github-actions bot added the stale label Dec 31, 2024