
[BUG]: "Generic OpenAI" LLM Provider can not be edited #2016

Open
cougarten opened this issue Aug 1, 2024 · 2 comments · May be fixed by #2397
Labels: blocked · investigating (Core team or maintainer will or is currently looking into this issue) · possible bug (Bug was reported but is not confirmed or is unable to be replicated)

Comments

@cougarten

How are you running AnythingLLM?

Docker (remote machine)

What happened?

Selecting "Generic OpenAI" in a workspace for the first time shows all the settings/input fields to set it up. Once confirmed, it is impossible to get back to this settings screen to change things.

Ugly workaround: delete the config for "Generic OpenAI" manually.
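A minimal sketch of what that manual workaround might look like, assuming an AnythingLLM Docker setup where provider settings live in the server's `.env` file; the `GENERIC_OPEN_AI_*` variable names are an assumption, not confirmed in this issue:

```shell
# Hypothetical workaround sketch: strip the Generic OpenAI settings from the
# server's .env so the setup dialog appears again on next selection.
# The GENERIC_OPEN_AI_ prefix is an assumed naming convention.
ENV_FILE="server/.env"
cp "$ENV_FILE" "$ENV_FILE.bak"                        # keep a backup first
grep -v '^GENERIC_OPEN_AI_' "$ENV_FILE.bak" > "$ENV_FILE"
```

After editing, the container would need a restart for the change to take effect.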

Are there known steps to reproduce?

Steps to reproduce:

  • start with an installation that has "Generic OpenAI" LLM Provider set
  • create a new workspace
  • go to settings -> "chat settings" of that workspace
  • select "Generic OpenAI" as the LLM Provider
  • enter any details so it allows you to save (does not have to work)
  • (optionally: close the settings, try using the workspace)
  • go to settings -> "chat settings" again
  • (optionally: choose any other LLM Provider)
  • choose "Generic OpenAI" as the LLM Provider again

Expected Outcome:

  • The initial pop-over dialog appears again
  • I see the details I previously entered
  • I can change the connection settings

Actual outcome:

  • I can select "Generic OpenAI", but no pop-over dialog appears
  • even new workspaces don't show it
@cougarten added the "possible bug" label on Aug 1, 2024
@timothycarambat added the "investigating" label on Aug 1, 2024
@linxiaohui

+1

@shatfield4 (Collaborator)

@cougarten @linxiaohui Can you please explain what your use case is for this feature? Implementing this will take significant backend changes because of the way we have providers set up currently; you cannot change things like the base URL, API key, context window, or max tokens easily. We can, however, add a model field to this provider.

Is your goal to use multiple different providers through the Generic OpenAI provider in the workspace? If so, please let us know which providers you are using and we can add them as first-class providers so you don't have to go through the Generic OpenAI provider.

If your goal is to use the same provider but with different models I will be adding an input field to this provider.

@shatfield4 linked a pull request on Sep 30, 2024 that will close this issue

4 participants