
Join OK, connected, LLM/TTS config OK, but no conversation #13

Open
themass opened this issue Aug 26, 2024 · 7 comments

Comments

@themass

themass commented Aug 26, 2024

I ran the 01-local demo for the server and rtvi-web-demo for the client.

2024-08-26 17:13:11.097 | INFO | pipecat.transports.services.daily:on_participant_joined:468 - Participant joined a3b53542-cc5e-4a6a-b800-e84091fdbb1e
2024-08-26 17:13:11.098 | INFO | main:on_first_participant_joined:61 - First participant joined
2024-08-26 17:13:11.192 | INFO | main:on_call_state_updated:70 - Call state joined
2024-08-26 17:13:12.951 | INFO | pipecat.transports.services.daily:join:272 - Joined xxxxxxx
2024-08-26 17:13:12.952 | INFO | pipecat.transports.services.daily:_start_transcription:288 - Enabling transcription with settings language='en' tier='nova' model='2-conversationalai' profanity_filter=True redact=False endpointing=True punctuate=True includeRawResponse=True extra={'interim_results': True}
2024-08-26 17:13:13.725 | DEBUG | pipecat.transports.services.daily:on_transcription_started:495 - Transcription started: {'startedBy': '48880c13-be10-4edb-90ed-e28fc87c1bdd', 'language': 'en', 'tier': 'nova', 'transcriptId': '84c5546f-d9a4-421f-8fa5-4178bfbe71c7', 'model': '2-conversationalai'}
2024-08-26 17:13:13.742 | DEBUG | pipecat.processors.frame_processor:link:133 - Linking PipelineSource#1 -> LLMUserResponseAggregator#0
2024-08-26 17:13:13.742 | DEBUG | pipecat.processors.frame_processor:link:133 - Linking LLMUserResponseAggregator#0 -> LLM
2024-08-26 17:13:13.743 | DEBUG | pipecat.processors.frame_processor:link:133 - Linking LLM -> FunctionCaller#0
2024-08-26 17:13:13.743 | DEBUG | pipecat.processors.frame_processor:link:133 - Linking FunctionCaller#0 -> TTS
2024-08-26 17:13:13.743 | DEBUG | pipecat.processors.frame_processor:link:133 - Linking TTS -> RTVITTSTextProcessor#0
2024-08-26 17:13:13.743 | DEBUG | pipecat.processors.frame_processor:link:133 - Linking RTVITTSTextProcessor#0 -> LLMAssistantResponseAggregator#0
2024-08-26 17:13:13.743 | DEBUG | pipecat.processors.frame_processor:link:133 - Linking LLMAssistantResponseAggregator#0 -> DailyOutputTransport#0
2024-08-26 17:13:13.743 | DEBUG | pipecat.processors.frame_processor:link:133 - Linking DailyOutputTransport#0 -> PipelineSink#1
2024-08-26 17:13:13.743 | DEBUG | pipecat.processors.frame_processor:link:133 - Linking Pipeline#0 -> Pipeline#1
2024-08-26 17:13:14.879 | DEBUG | pipecat.services.openai:_stream_chat_completions:103 - Generating chat: [{"content": "You are Chatbot, a friendly, helpful robot. Your output will be converted to audio so don't include special characters other than '!' or '?' in your answers. Respond to what the user said in a creative and helpful way, but keep your responses brief. Start by saying hello.", "role": "system", "name": "system"}, {"content": "Greet the user", "role": "assistant", "name": "assistant"}]
2024-08-26 17:13:16.276 | DEBUG | pipecat.processors.frame_processor:stop_ttfb_metrics:40 - LLM TTFB: 1.3974647521972656
2024-08-26 17:13:16.317 | DEBUG | pipecat.services.cartesia:run_tts:192 - Generating TTS: [Hello!]
2024-08-26 17:13:16.318 | DEBUG | pipecat.processors.frame_processor:stop_processing_metrics:56 - TTS processing time: 0.0007891654968261719
2024-08-26 17:13:16.374 | DEBUG | pipecat.services.cartesia:run_tts:192 - Generating TTS: [How can I assist you today?]
2024-08-26 17:13:16.374 | DEBUG | pipecat.processors.frame_processor:stop_processing_metrics:56 - TTS processing time: 0.00040912628173828125
2024-08-26 17:13:16.375 | DEBUG | pipecat.processors.frame_processor:stop_processing_metrics:56 - LLM processing time: 1.496906042098999
2024-08-26 17:13:16.688 | DEBUG | pipecat.processors.frame_processor:stop_ttfb_metrics:40 - TTS TTFB: 0.3707418441772461
{"timestamp":"2024-08-26T09:14:27.881769Z","level":"ERROR","fields":{"message":"send transport changed to failed"},"target":"daily_core::soup::sfu::mediasoup_manager"}
{"timestamp":"2024-08-26T09:14:28.151822Z","level":"ERROR","fields":{"message":"recv transport changed to failed"},"target":"daily_core::soup::sfu::mediasoup_manager"}

@themass

themass commented Aug 26, 2024

(screenshot attached)

@sadimoodi

> I ran the 01-local demo for the server and rtvi-web-demo for the client.
> [log output quoted above]

Where is the 01-local code that you used to run the backend?
I am trying to run the example here, which is similar, but I am not getting any transcription.

@ttamoud

ttamoud commented Sep 2, 2024

Check the bot-ready message forwarded by the RTVI framework; that's probably what's causing the issue. The frontend is probably not receiving the signal correctly.

@sadimoodi

> Check the bot-ready message forwarded by the RTVI framework; that's probably what's causing the issue. The frontend is probably not receiving the signal correctly.

Can you point to the backend code required to run this frontend? I was looking here, but the code doesn't seem to be correct.

@sadimoodi

(screenshot)

Can you help by explaining how you got the bot status to show as connected? It keeps showing as initializing for us; see below:
(screenshot: bot status stuck at initializing)

@themass

themass commented Sep 12, 2024

Just configure the required environment variables and run the demo server:
https://github.com/rtvi-ai/rtvi-infra-examples/tree/main/01-local
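
For anyone else hitting this: before starting the server, it helps to confirm the keys are actually being picked up. A minimal sketch, assuming the usual Daily/OpenAI/Cartesia variable names (the example's own .env.example is the source of truth):

```python
# Sanity-check the environment before launching the 01-local demo server.
# The variable names are assumptions based on the services in the logs above
# (Daily, OpenAI, Cartesia); 01-local's .env.example is the source of truth.
import os

from dotenv import load_dotenv

load_dotenv()

required = ["DAILY_API_KEY", "OPENAI_API_KEY", "CARTESIA_API_KEY"]
missing = [name for name in required if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("All required environment variables are set.")
```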

@ttamoud

ttamoud commented Sep 12, 2024

Those examples are outdated; the RTVI standard has been updated a few times since, so they no longer work.
The rtvi-infra-examples backend used to send a bot-ready message that the frontend received, which changed the bot status from initializing to connected. The RTVI standard has since been updated several times, but the frontend and the infra-examples backend have not kept up, which causes them to break.
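
For reference, the handshake described here amounts to the server pushing an app-message over the Daily transport once the pipeline is up, which the frontend listens for. A minimal sketch of that step, assuming the installed pipecat exposes DailyTransportMessageFrame and that the payload shape below (the label/type fields) matches what your rtvi-web-demo build actually expects; both of these changed across RTVI revisions:

```python
# Illustrative only: announce bot readiness to the connected client.
# Assumptions: the installed pipecat exposes DailyTransportMessageFrame, and
# the {"label": ..., "type": "bot-ready"} payload matches what this build of
# rtvi-web-demo listens for (the exact shape changed between RTVI revisions).
from pipecat.transports.services.daily import DailyTransportMessageFrame


async def announce_bot_ready(task):
    """Queue a bot-ready app-message onto the running PipelineTask."""
    await task.queue_frame(
        DailyTransportMessageFrame(message={"label": "rtvi", "type": "bot-ready"})
    )
```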
