
fix agent not saving chat history for regular msg (without tool-calling) when streaming #1281

Open · irevived1 wants to merge 1 commit into run-llama:main from irevived1:fix/openai_streaming_msg_history

Conversation

irevived1 (Contributor):

fix agent not saving chat history for regular msg (without tool-calling) when streaming


changeset-bot bot commented Oct 1, 2024

🦋 Changeset detected

Latest commit: 805a3f3

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 18 packages
| Name | Type |
| --- | --- |
| @llamaindex/core | Minor |
| @llamaindex/cloud | Major |
| @llamaindex/community | Patch |
| llamaindex | Patch |
| @llamaindex/ollama | Patch |
| @llamaindex/openai | Patch |
| @llamaindex/llama-parse-browser-test | Patch |
| docs | Patch |
| @llamaindex/autotool | Patch |
| @llamaindex/experimental | Patch |
| @llamaindex/cloudflare-worker-agent-test | Patch |
| @llamaindex/next-agent-test | Patch |
| @llamaindex/nextjs-edge-runtime-test | Patch |
| @llamaindex/next-node-runtime-test | Patch |
| @llamaindex/waku-query-engine-test | Patch |
| @llamaindex/autotool-01-node-example | Patch |
| @llamaindex/autotool-02-next-example | Patch |
| @llamaindex/groq | Patch |

Not sure what this means? Click here to learn what changesets are.

[Click here if you're a maintainer who wants to add another changeset to this PR](https://github.com/irevived1/LlamaIndexTS/new/fix/openai_streaming_msg_history?filename=.changeset/warm-mangos-marry.md&value=---%0A%22%40llamaindex%2Fcore%22%3A%20patch%0A---%0A%0Afix%20agent%20not%20saving%20chat%20history%20for%20regular%20msg%20(without%20tool-calli%E2%80%A6%0A)


vercel bot commented Oct 1, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Updated (UTC) |
| --- | --- | --- |
| llama-index-ts-docs | ✅ Ready | Oct 1, 2024 0:54am |

1 Skipped Deployment

| Name | Status | Updated (UTC) |
| --- | --- | --- |
| llamaindexts-doc-experimental | ⬜️ Ignored | Oct 1, 2024 0:54am |

Review thread on the following lines of the diff:

```ts
for await (const chunk of pipStream) {
  content += chunk.delta;
}
step.context.store.messages = [
```
Member:

I was thinking `messages` is only for input; we won't update it.

Contributor Author:

Hi, when we set `stream: false`, we do store the output in the messages array.
Do we plan to remove the output message from `stepTools` (non-streaming) for consistency? See:

`step.context.store.messages = [`

Do you have a recommended function/property to get the output? Thanks!
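
For reference, here is a minimal sketch of the non-streaming behavior being discussed, using illustrative names and types rather than the actual LlamaIndexTS internals:

```ts
// Sketch only: assumed shapes, not the real LlamaIndexTS types.
type ChatMessage = { role: "user" | "assistant"; content: string };

interface LLM {
  chat(opts: { messages: ChatMessage[] }): Promise<{ message: ChatMessage }>;
}

// With stream: false the full response is available up front, so the
// assistant output can be appended to the chat history immediately.
async function chatNonStreaming(
  llm: LLM,
  store: { messages: ChatMessage[] },
  input: ChatMessage,
): Promise<ChatMessage> {
  store.messages = [...store.messages, input];
  const response = await llm.chat({ messages: store.messages });
  // This is the behavior referenced above: the output is stored too.
  store.messages = [...store.messages, response.message];
  return response.message;
}
```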

Member:

Oh, I remember why we cannot store the message when `stream: true`: if we consume the stream here, the user will lose the deltas. We can only pass the stream to the user, and only once they have consumed it can we store the message here. So I think this is best handled in an inner module, not here.
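
To illustrate the constraint, here is a minimal sketch (not the LlamaIndexTS implementation) of how an inner module could record the final message without stealing deltas from the caller, assuming a web `ReadableStream` of `{ delta: string }` chunks:

```ts
// Sketch only: Chunk is an assumed shape for streaming LLM output.
type Chunk = { delta: string };
type ChatMessage = { role: "assistant"; content: string };

function forwardAndRecord(
  source: ReadableStream<Chunk>,
  onComplete: (message: ChatMessage) => void,
): ReadableStream<Chunk> {
  // tee() duplicates the stream: one branch is returned to the user
  // untouched; the other is consumed here to accumulate the content.
  const [forUser, forHistory] = source.tee();
  (async () => {
    let content = "";
    // Async iteration over ReadableStream requires Node 18+ (or
    // another runtime that implements it).
    for await (const chunk of forHistory) {
      content += chunk.delta;
    }
    // Only runs once the source is fully drained, so no delta is lost.
    onComplete({ role: "assistant", content });
  })().catch(() => {
    // Swallow errors on this branch; the user-facing branch surfaces them.
  });
  return forUser;
}
```

One caveat of `tee()`: the history branch may read ahead of the user's branch, in which case chunks are buffered internally until the user consumes them.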

Contributor Author:

Thanks for the explanation.
Would you be able to do that in the inner module? I can close the current PR.
Or I can update it myself if you can point me to the filename/location. Thank you!

Member:

I think you can handle the logic before `controller.close()` in this file.
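
One way to read this suggestion, sketched with illustrative types (not the actual file's code): a `TransformStream` forwards every delta to the consumer and uses `flush()`, which runs just before the stream closes, to hand the accumulated content to a save callback.

```ts
// Sketch only: Chunk is an assumed shape for streaming LLM output.
type Chunk = { delta: string };

function withHistorySave(
  onClose: (content: string) => void,
): TransformStream<Chunk, Chunk> {
  let content = "";
  return new TransformStream<Chunk, Chunk>({
    transform(chunk, controller) {
      content += chunk.delta; // record the delta...
      controller.enqueue(chunk); // ...and still forward it to the user
    },
    flush() {
      // Runs after the last chunk, right before the stream closes: the
      // user has received every delta, so it is safe to store now.
      onClose(content);
    },
  });
}

// Usage (hypothetical): pipe the LLM stream through before returning
// it to the caller.
// llmStream.pipeThrough(withHistorySave(saveToChatHistory));
```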

Contributor Author (irevived1, Oct 1, 2024):

Looks like context is already assigned at

before `controller.close()` at
https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/agent/utils.ts#L56

Any context assignment after that point is not saved.
Do you have other suggestions?

Update: I've tested the current solution, and it seems the user is able to consume the stream.
Feel free to check out this branch for testing.
