
Feat/fireworks integration #2089

Open
somashekhar161 wants to merge 21 commits into main

Conversation

@somashekhar161 somashekhar161 commented Sep 20, 2024

Description

Issue: #1945

FEAT: added Fireworks integration

Based on the template used for the OpenAI integration.

New dependencies (a usage sketch follows the list):
llama-index-llms-fireworks = {version = "^0.1.5", optional = false}
llama-index-embeddings-fireworks = {version = "^0.1.2", optional = false}
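
For reference, a minimal sketch of what these two llama-index packages provide. The model identifiers below are placeholders for illustration only; the defaults used by this PR may differ.

```python
import os

# Provided by llama-index-llms-fireworks and llama-index-embeddings-fireworks
from llama_index.llms.fireworks import Fireworks
from llama_index.embeddings.fireworks import FireworksEmbedding

api_key = os.environ["FIREWORKS_API_KEY"]

# Model names are assumptions for illustration, not necessarily the PR defaults.
llm = Fireworks(
    model="accounts/fireworks/models/llama-v3p1-8b-instruct",
    api_key=api_key,
)
embedding = FireworksEmbedding(
    model_name="nomic-ai/nomic-embed-text-v1.5",
    api_key=api_key,
)

print(llm.complete("Say hello."))
print(len(embedding.get_text_embedding("hello")))
```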

HOW TO RUN (a sketch of the component wiring follows these steps):

Install dependencies:
poetry install --extras "ui llms-fireworks embeddings-fireworks vector-stores-qdrant embeddings-openai"

Set the API key (use export instead of set on Linux/macOS):
set FIREWORKS_API_KEY=YOUR_FIREWORKS_API_KEY

Set the profile:
set PGPT_PROFILES=fireworks

Run:
make run
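
Since the description says the OpenAI integration was used as a template, the fireworks profile presumably plugs into the LLM component roughly along these lines. This is only a hedged sketch, not the PR's actual code; the settings field names (settings.fireworks.model, settings.fireworks.api_key) are assumptions.

```python
# Rough sketch of how a "fireworks" mode could be added to the LLM component,
# mirroring the existing "openai" case. Settings field names are assumptions.
from llama_index.core.llms import LLM


def build_llm(settings) -> LLM:
    match settings.llm.mode:
        case "fireworks":
            try:
                from llama_index.llms.fireworks import Fireworks
            except ImportError as e:
                raise ImportError(
                    "Fireworks dependencies not found, install with "
                    "`poetry install --extras llms-fireworks`"
                ) from e
            return Fireworks(
                model=settings.fireworks.model,
                api_key=settings.fireworks.api_key,
            )
        case _:
            raise ValueError(f"Unsupported LLM mode: {settings.llm.mode}")
```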

Type of Change

Please delete options that are not relevant.

  • [fixed UI where the input overflows on 14-inch laptop screens] Bug fix (non-breaking change which fixes an issue)
  • [added Fireworks integration] New feature (non-breaking change which adds functionality)
  • [dependency update: poetry install --extras "ui llms-fireworks embeddings-fireworks vector-stores-qdrant embeddings-openai"] This change requires a documentation update
  • [updated documentation for the Fireworks installation]

How Has This Been Tested?

Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration

  • I stared at the code and made sure it makes sense

Test Configuration:

  • Firmware version: 0.6.0
  • Hardware: Windows 10 22H2 laptop
  • Toolchain: NONE
  • SDK: NONE

Checklist:

  • [🆗] My code follows the style guidelines of this project
  • [🆗] I have performed a self-review of my code
  • [🆗] I have commented my code, particularly in hard-to-understand areas
  • [🆗] I have made corresponding changes to the documentation
  • [🆗] My changes generate no new warnings


@jaluma (Collaborator) left a comment:

Thank you very much for the input, very useful. I've left a couple of things for you to change. Also, I see that the documentation update with Fireworks is missing :)

private_gpt/settings/settings.py (outdated, resolved)
Collaborator:

I would say that the Dockerfile is not needed in this case. If someone wants to use Fireworks, it's a better idea for them to make the necessary modifications themselves.

Author:

Yes, the Dockerfile is only for running it. Even if users make their own modifications, a Dockerfile is useful because of dependency errors, Python version mismatches, etc.

Collaborator:

I don't like this premise. Adding another Docker/docker-compose profile will mean more code to maintain, when it is not the primary PGPT use case. If Fireworks were a fully local setup, it would probably be worth it.

Reviewed snippet (docker-compose):
- .env
profiles:
- fireworks
Collaborator:

Same as the Dockerfile.

Author:

Here I followed how the other Dockerfiles and docker-compose files are written and used those as templates, so that this one wouldn't be an outsider :)

Collaborator:

I would say the same. It is not the core of PGPT, so we should not provide this kind of support. Our goal is a 100% private solution, and in this area our two main providers are Ollama and Llama-CPP. Of course, this PR adds value for other people with the same needs, but it doesn't make sense to maintain it in docker-compose and the Dockerfile.

@somashekhar161 (Author) commented Sep 23, 2024:

> Thank you very much for the input, very useful. I've left a couple of things for you to change. Also, I see that the documentation update with Fireworks is missing :)

Yeah, I was searching for where to update that; for the time being I added those steps in the Docker files and in this PR description.

Collaborator:

Before the recommended setups, there are a couple of tables with all the available options.

poetry.lock (outdated)

Collaborator:

I'm not sure how the new dependencies were installed, but the lock file bumps all dependencies. Please revert the current poetry.lock and update it using poetry lock --no-update. Many of your changes are version bumps, and you will probably have to revert some formatting changes as well. Anyway, I will try to update llama-index to the latest version, including all the other dependencies; if you prefer to wait and merge when that is ready, I can ping you.

Author:

I bumped it while I was hitting an error when testing on GitHub; I will revert to the old version :)

Reviewed documentation snippet (LLM extras table):

| openai | Adds support for OpenAI LLM, requires OpenAI API key | llms-openai |
| openailike | Adds support for 3rd party LLM providers compatible with OpenAI's API | llms-openai-like |
| azopenai | Adds support for Azure OpenAI LLM, requires Azure endpoints | llms-azopenai |
| gemini | Adds support for Gemini LLM, requires Gemini API key | llms-gemini |

Collaborator:

Why didn't you add fireworks here?

Author:

I have added the Fireworks installation steps next to the OpenAI ones.

Reviewed documentation snippet (embeddings extras table):

| openai | Adds support for OpenAI Embeddings, requires OpenAI API key | embeddings-openai |
| sagemaker | Adds support for Amazon Sagemaker Embeddings, requires Sagemaker endpoints | embeddings-sagemaker |
| azopenai | Adds support for Azure OpenAI Embeddings, requires Azure endpoints | embeddings-azopenai |
| gemini | Adds support for Gemini Embeddings, requires Gemini API key | embeddings-gemini |

Collaborator:

Why didn't you add fireworks here?

Author:

I have added the Fireworks installation steps next to the OpenAI ones.


@somashekhar161 (Author):

> I would say the same. It is not the core of PGPT, so we should not provide this kind of support. Our goal is a 100% private solution, and in this area our two main providers are Ollama and Llama-CPP. Of course, this PR adds value for other people with the same needs, but it doesn't make sense to maintain it in docker-compose and the Dockerfile.

Reply: the Dockerfile exists to make life easier, and I had issues with Python versions.

@muzammilvd:

Will this PR be merged anytime soon?
