Integrating MemGPT-like Functionality #2937
base: main
If you move this `.condense` function to the `LLM` class, you can make `messages` the input argument. We might need to make each `Message` a Pydantic class with an additional attribute (e.g., `condensable`). This may also benefit the ongoing effort to add vision capability in #2848 (comment) (cc @Kaushikdkrikhanu)
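A minimal sketch of what that might look like (assuming Pydantic is available; the `condensable` field is the example attribute floated in this comment, not an existing one):

```python
# Hypothetical sketch: Message as a Pydantic model with a `condensable`
# flag, so the condenser can skip messages that must never be summarized
# away (e.g., the system prompt).
from pydantic import BaseModel


class Message(BaseModel):
    role: str
    content: str
    condensable: bool = True


messages = [
    Message(role='system', content='You are an agent.', condensable=False),
    Message(role='user', content='Fix the failing test.'),
    Message(role='assistant', content='Looking into it.'),
]

# Only these would be handed to .condense for summarization:
to_condense = [m for m in messages if m.condensable]
```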
This can be done. But after moving `.condense` to the `LLM` class, it won't be able to add the summary to `state.history` directly; I might also need to pass `state` as an argument to `llm.completion` along with the messages.

If we are trying to make it cleaner (#2937 (comment)), I was instead thinking of moving `.condense` to the `MemoryCondenser` class, after making each `Message` a Pydantic class. That's where I think `.condense` should belong 😅. WDYT? Which approach seems better?
Huh, good point... maybe the caller of `LLM` should make sure the summary gets added to the `State`, like what we are doing now?
On second thought: maybe `State` is not the place to add these new summarizing actions; `EventStream` should be: https://github.com/OpenDevin/OpenDevin/blob/c555fb68400bb117011cda3e5d3d95beb5169000/opendevin/controller/agent_controller.py#L395

Maybe you could add an optional argument to the `LLM` class, like `condense_callback_fn`, which will be called with the condensed `Summarize` action whenever a condensation happens. That callback would then add the `Summarize` action to the event stream?
Can you elaborate on this? How does adding summarizing actions to `EventStream` help? MemGPT stores only the summary of events from the start up to some `event.id`. We only need to store one summary in the history: a summary event remains useful until we create a new summary, after which it becomes obsolete.

Btw, I've moved the `.condense` functions to the `LLM` class. The `MemoryCondenser` class only used methods of the `LLM` class, so I removed it entirely and moved one of its methods to the `LLM` class. This also achieves the target (#2937 (comment))
Basically, going forward, we hope to make `EventStream` the single source of truth for all our past messages / history (i.e., you get all the history messages from the event stream), so ideally all your condensation actions should also go into the event stream.

But based on my understanding, your current implementation already emits `AgentSummarizeAction`? I think we just need to make sure that action gets stored inside the event stream, and we should be good to go on this bullet.
@xingyaoww Oh interesting, I played with summaries in the event stream at some point. A couple of thoughts:

Ref: https://opendevin.slack.com/archives/C06QKSD9UBA/p1716737811950479?thread_ts=1716668475.888269&channel=C06QKSD9UBA&message_ts=1716737811.950479

On a side note, allowing only one summary per run simplifies the logic either way.