
Broken LMStudio integration #1835

Open
sarahwooders opened this issue Oct 7, 2024 · 0 comments
Assignees
Labels
bug Something isn't working

Comments

@sarahwooders
Collaborator

From discord:


pydantic_core._pydantic_core.ValidationError: 1 validation error for LLMConfig
model
  Input should be a valid string [type=string_type, input_value=None, input_type=NoneType]
    For further information visit https://errors.pydantic.dev/2.9/v/string_type

I am using LM Studio and my configuration is this:

? Select LLM inference provider: local
? Select LLM backend (select 'openai' if you have an OpenAI compatible proxy): lmstudio
? Enter default endpoint: http://localhost:1234/v1
? Is your LLM endpoint authenticated? (default no) No
? Select default model wrapper (recommended: chatml): chatml
? Select your model's context window (for Mistral 7B models, this is probably 8k / 8192): 8192
? Select embedding provider: local
? Select storage backend for archival data: chroma
? Select chroma backend: persistent
? Select storage backend for recall data: sqlite
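The traceback suggests the LMStudio setup path leaves the `model` field unset when building the config object. A minimal sketch of how pydantic produces this exact error (this is not Letta's actual `LLMConfig` class, just a stand-in with a required string field):

```python
# Hypothetical stand-in for the real LLMConfig: a pydantic model whose
# `model` field is a required string, so passing None fails validation.
from pydantic import BaseModel, ValidationError

class LLMConfig(BaseModel):
    model: str  # the LMStudio flow apparently leaves this as None

try:
    LLMConfig(model=None)
except ValidationError as e:
    # pydantic v2 reports this as a "string_type" error, matching the
    # "Input should be a valid string" message in the report above
    print(e.errors()[0]["type"])
```

A likely fix would be for the LMStudio configuration flow to prompt for (or auto-detect) a model name before constructing the config, rather than passing `None` through.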
@sarahwooders sarahwooders added the bug Something isn't working label Oct 7, 2024
@sarahwooders sarahwooders self-assigned this Oct 7, 2024