
Weave gives Validation Error using Custom Chat Template #2444

Open
AIApprentice101 opened this issue Sep 20, 2024 · 5 comments
AIApprentice101 commented Sep 20, 2024

Thank you for the package. Love it.

I use Langchain's ChatOpenAI with `streaming=True`. When I add Weave to trace the chains by simply calling `weave.init()`, it throws the following error:

```
[llm/error] [chain:RunnableWithMessageHistory > chain:check_sync_or_async > chain:RunnableSequence > chain:ChatOpenAI > llm:ChatOpenAI] [2.11s] LLM run errored with error:
1 validation error for ChatCompletion
choices.0.message.role
  Field required [type=missing, input_value={'content': " The text ap...2.", 'tool_calls': None}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.9/v/missing

Traceback (most recent call last):
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 932, in _agenerate_with_cache
    result = await self._agenerate(
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/langchain_openai/chat_models/base.py", line 809, in _agenerate
    return await agenerate_from_stream(stream_iter)
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 118, in agenerate_from_stream
    chunks = [chunk async for chunk in stream]
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 118, in <listcomp>
    chunks = [chunk async for chunk in stream]
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/langchain_openai/chat_models/base.py", line 1989, in _astream
    async for chunk in super()._astream(*args, **kwargs):
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/langchain_openai/chat_models/base.py", line 779, in _astream
    async for chunk in response:
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/weave/legacy/monitoring/openai/openai.py", line 79, in __aiter__
    self._on_finish()
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/weave/legacy/monitoring/openai/openai.py", line 87, in _on_finish
    result = wrapped_stream.final_response()
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/weave/flow/chat_util.py", line 104, in final_response
    return ChatCompletion(
  File "/opt/conda/envs/testweave/lib/python3.10/site-packages/pydantic/main.py", line 209, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 1 validation error for ChatCompletion
choices.0.message.role
  Field required [type=missing, input_value={'content': " The text ap...2.", 'tool_calls': None}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.9/v/missing
```
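The traceback shows Weave's stream wrapper calling `final_response()` to merge the streamed chunks into one `ChatCompletion`, which then fails validation because the merged message has no `role`. Here is a minimal, hypothetical sketch of that accumulation pattern (not Weave's actual code) showing how a stream whose deltas never carry a `role` produces exactly this shape:

```python
# Sketch of a streaming-delta accumulator, hypothetical re-implementation
# of the pattern used in weave's chat_util, not the actual Weave code.

def accumulate(deltas):
    """Merge streamed delta dicts into one final message dict."""
    message = {}
    for delta in deltas:
        for key, value in delta.items():
            if value is None:
                continue
            if key == "content":
                # content arrives in pieces and is concatenated
                message["content"] = message.get("content", "") + value
            else:
                message[key] = value
    return message

# A normal OpenAI stream sends "role" in the first delta:
ok = accumulate([{"role": "assistant", "content": "Hel"},
                 {"content": "lo"}])
assert ok == {"role": "assistant", "content": "Hello"}

# A backend using a custom chat template may never send "role",
# so the merged message lacks the field that the strict
# ChatCompletion schema requires downstream:
bad = accumulate([{"content": "Hel"}, {"content": "lo"}])
assert "role" not in bad
```

Under this reading, the error is not in the user's chain but in the strict schema check applied to the reconstructed response.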

Package info:
langchain = "0.3.0"
langchain-openai = "0.2.0"
weave = "0.50.7"

@AIApprentice101 AIApprentice101 changed the title Langchain-Weave give Validation Error Langchain-Weave gives Validation Error Sep 20, 2024
@gtarpenning
Member

Hey @AIApprentice101, thanks for writing in! Your weave package looks quite a bit out of date; do you mind updating and trying again? Happy to help if the issue persists :)

@AIApprentice101
Author

Thank you. I just tried version 0.51.8, which I believe is the most recent release. Unfortunately, it still throws the same validation error.

@gtarpenning
Member

Ticketing for internal tracking, I will keep you posted!

@gtarpenning
Member

Do you mind providing a code sample so that I can reproduce the issue? @AIApprentice101

@AIApprentice101
Author

AIApprentice101 commented Sep 26, 2024

Thank you for your reply. I think I figured out where the issue comes from. I'm using the Mistral-7B model, which doesn't support the "system" role. I used a custom chat template that basically converts "system" messages to "user" messages. Without Weave, this works fine; with Weave enabled, it throws a Pydantic validation error. This also only happens with `streaming=True`.

```python
from langchain_openai import ChatOpenAI
import weave

weave.init('langchain-demo')

OPENAI_API_KEY = "EMPTY"
OPENAI_API_BASE = "http://localhost:8000/v1"
LLM_MODEL_NAME = "TheBloke/Mistral-7B-Instruct-v0.2-AWQ"

llm = ChatOpenAI(
    openai_api_key=OPENAI_API_KEY,
    openai_api_base=OPENAI_API_BASE,
    model_name=LLM_MODEL_NAME,
    streaming=True,
)

# validation error
response1 = llm.invoke("Hi, my name is Alice.")

# validation error
for chunk in llm.stream("Hi, my name is Alice."):
    print(chunk.content, end="", flush=True)
```
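For context, the system-to-user conversion described above might look like this sketch (a hypothetical helper illustrating the custom chat template's behavior, not the actual template code on the server):

```python
# Hypothetical sketch: fold "system" messages into "user" messages for
# models such as Mistral-7B-Instruct that have no system role.

def merge_system_into_user(messages):
    """Return a copy of messages with every system message re-tagged as user."""
    converted = []
    for msg in messages:
        if msg["role"] == "system":
            converted.append({"role": "user", "content": msg["content"]})
        else:
            converted.append(dict(msg))
    return converted

msgs = [
    {"role": "system", "content": "Be brief."},
    {"role": "user", "content": "Hi, my name is Alice."},
]
out = merge_system_into_user(msgs)
assert all(m["role"] != "system" for m in out)
```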

If you think this is not a general case, please feel free to close this issue. Any help would be very appreciated. Thank you.

@AIApprentice101 AIApprentice101 changed the title Langchain-Weave gives Validation Error Weave gives Validation Error using Custom Chat Template Sep 26, 2024