0.4.0 - Mixtral for qapairs, qapair prompt suite
What's Changed
- Fix finetuning uploads by @rusenask in #125
- Finetuning improvements by @lukemarsden in #129
Finetuning improvements
Helix now uses Mixtral on Together.ai instead of GPT-4 on OpenAI to generate the question-answer pairs in the dataprep stage of text fine-tuning. It also now uses a suite of qapair generation prompts, defined in qapairs_config.yaml. This is both a step towards fully on-prem qapair generation (we can run Mixtral locally) and results in better quality text fine-tuning.
Upgrade notes
Be sure to set TOGETHER_API_KEY in .env to an API key from together.ai, instead of the OpenAI key.
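As a sketch, the change in .env might look like this (the exact value is your own key from together.ai; the commented-out OpenAI line is illustrative of what you are replacing):

```shell
# .env — switch qapair generation to Together.ai
TOGETHER_API_KEY=your-together-api-key-here

# Previously used for GPT-4 qapair generation; no longer needed for this step
# OPENAI_API_KEY=sk-...
```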
Full Changelog: 0.3.8...0.4.0