fix agent not saving chat history for regular msg (without tool-calling) when streaming #1281
base: main

Conversation
🦋 Changeset detected. Latest commit: 805a3f3. The changes in this PR will be included in the next version bump. This PR includes changesets to release 18 packages.
for await (const chunk of pipStream) {
  content += chunk.delta;
}
step.context.store.messages = [
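For context, here is a minimal sketch of the idea behind that hunk, assuming the LLM response is a web ReadableStream that can be tee()'d so the user branch keeps every delta while the agent branch is drained for history (responseStream, Chunk, and the tee step are assumptions; content and chunk.delta mirror the diff):

```ts
// Minimal sketch (assumed names, Node 18+ web streams): tee() the LLM
// response stream so the user branch keeps every delta while the agent
// branch is drained to build the chat-history entry.
type Chunk = { delta: string };

// Stand-in for the LLM response stream.
const responseStream = new ReadableStream<Chunk>({
  start(controller) {
    for (const delta of ["Hel", "lo", "!"]) controller.enqueue({ delta });
    controller.close();
  },
});

// Two independent branches over the same chunks.
const [userStream, agentStream] = responseStream.tee();

let content = "";
for await (const chunk of agentStream) {
  content += chunk.delta;
}
// content === "Hello!" — ready to append as the assistant message,
// while userStream can still be handed to the caller untouched.
```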
I was thinking messages is only for input; we won't update it.
Hi, when we set stream: false, we do store the output in the messages array. Do we plan to remove the output message from stepTools (non-streaming) for consistency?
See LlamaIndexTS/packages/core/src/agent/utils.ts, line 153 at 805a3f3:

step.context.store.messages = [

Do you have a recommended function/property to get the output?
Thanks!
Oh, I remember why we cannot store the message when stream: true: if we consume the stream here, the user will lose the deltas. We can only pass the stream to the user, and only once they have consumed it can we store it here. So I think the best practice is to handle this in the inner module, not here.
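To make the failure mode concrete, a small illustration (the stream shape here is an assumption, not the library's actual type): a ReadableStream is single-consumer, so draining it inside the agent leaves nothing for the user.

```ts
// Illustration only (assumed shapes): once the agent drains the
// stream, a later reader sees an already-finished stream.
const stream = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("delta-1");
    controller.enqueue("delta-2");
    controller.close();
  },
});

// The agent drains the stream to accumulate the message...
let content = "";
for await (const delta of stream) {
  content += delta;
}

// ...so a reader attached afterwards gets no deltas at all:
const reader = stream.getReader();
const { done } = await reader.read();
console.log(done); // true — the user never sees a single delta
```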
Thanks for the explanation.
Would you be able to do that in the inner module? I can close the current PR.
Or I can update it myself if you can point me to the filename/location. TY
I think you can handle the logic before controller.close() in this file.
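One hedged sketch of what that inner-module handling could look like — withHistorySaving is a hypothetical helper, and the real utils.ts plumbing may differ. The idea is to accumulate deltas inside a TransformStream and persist the message in flush(), which fires once the source stream finishes, just before the controller closes:

```ts
// Hypothetical helper (assumed names): forward every delta to the user
// untouched and persist the accumulated message in flush().
type ChatMessage = { role: string; content: string };

function withHistorySaving(store: { messages: ChatMessage[] }) {
  let content = "";
  return new TransformStream<{ delta: string }, { delta: string }>({
    transform(chunk, controller) {
      content += chunk.delta;
      controller.enqueue(chunk); // the user still receives every delta
    },
    flush() {
      // Runs once the source stream is done, right before close.
      store.messages = [...store.messages, { role: "assistant", content }];
    },
  });
}

// Assumed usage:
// const userStream = responseStream.pipeThrough(withHistorySaving(step.context.store));
```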
Looks like context is already assigned at

context = {

before controller.close() at
https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/agent/utils.ts#L56
Any context assignment after that point is not saved.
Do you have other suggestions?
Update: I've tested the current solution, and it seems the user is able to consume the stream.
Feel free to check out this branch for testing.