Describe the bug
I'm trying to run Letta in Docker, connected to an Ollama service running on the same host. I'm using a .env file with the following vars (a sketch of the compose wiring follows the list):
LETTA_LLM_ENDPOINT=http://192.168.xx.xx:11434
LETTA_LLM_ENDPOINT_TYPE=ollama
LETTA_LLM_MODEL=llama3.2:3b-instruct-q8_0
LETTA_LLM_CONTEXT_WINDOW=8192
LETTA_EMBEDDING_ENDPOINT=http://192.168.xx.xx:11434
LETTA_EMBEDDING_ENDPOINT_TYPE=ollama
LETTA_EMBEDDING_MODEL=mxbai-embed-large
LETTA_EMBEDDING_DIM=512
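For context, here is a minimal sketch of how the .env file is wired into compose in my setup (the service name, image tag, and port below are from my configuration and may differ from the stock compose file):

```yaml
# docker-compose.yml (sketch; names/ports are from my setup, not necessarily the stock file)
services:
  letta:
    image: letta/letta:latest
    env_file:
      - .env          # the LETTA_* variables listed above
    ports:
      - "8283:8283"   # Letta server port in my setup
```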
The server loads fine, and I configure an Agent and Persona via the web interface. Then, when attempting to start a chat with the agent, I receive the following error on the server:
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NameResolutionError("<urllib3.connection.HTTPConnection object at 0x72f4fd942ba0>: Failed to resolve 'host.docker.internal' ([Errno -2] Name or service not known)"))
For some reason it's trying to use host.docker.internal even though the URL is overridden in the .env file.
I also ran docker inspect on the running container and confirmed that the correct env settings have been applied.
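Separately from the override bug, host.docker.internal doesn't resolve on Linux by default, which explains the NameResolutionError itself. A possible workaround sketch (it masks rather than fixes the override issue) is to map the name to the host gateway in compose:

```yaml
# sketch: make host.docker.internal resolvable inside the container on Linux
services:
  letta:
    extra_hosts:
      - "host.docker.internal:host-gateway"  # Docker 20.10+ maps this to the host's IP
```

And a hypothetical check (assuming the service is named letta and curl is available in the image) to confirm the container can reach Ollama at the endpoint from the .env file:

```sh
docker compose exec letta curl http://192.168.xx.xx:11434/api/tags
```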
Please describe your setup
Installed with docker compose
Running on Linux (Ubuntu)
If you're not using OpenAI, please provide additional information on your local LLM setup:
Local LLM details
If you are trying to run Letta with local LLMs, please provide the following information:
Model: llama3.2:3b-instruct-q8_0
Backend: Ollama
OS: Ubuntu Linux VM (CPU only)