0.4.0 - Mixtral for qapairs, qapair prompt suite

@lukemarsden released this 29 Jan 07:05 · cd63737

What's Changed

Finetuning improvements

Helix now uses Mixtral on Together.ai instead of GPT-4 on OpenAI to generate the question-answer pairs in the dataprep stage of text finetuning. It also now uses a suite of qapair generation prompts, defined in qapairs_config.yaml. This is both a step towards fully on-prem qapair generation (we can run Mixtral locally) and an improvement in the quality of the resulting text finetuning.
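
As a rough illustration of the new dataprep flow, the sketch below asks Mixtral for question-answer pairs via Together.ai's OpenAI-compatible endpoint. The prompt, helper function, and model name are hypothetical stand-ins, not Helix's actual dataprep code or the prompts shipped in qapairs_config.yaml.

```python
# Hypothetical sketch only; not Helix's dataprep code.
import os
from openai import OpenAI  # Together.ai exposes an OpenAI-compatible API

client = OpenAI(
    api_key=os.environ["TOGETHER_API_KEY"],
    base_url="https://api.together.xyz/v1",
)

def generate_qapairs(chunk: str) -> str:
    """Ask Mixtral to produce question-answer pairs for one chunk of text."""
    response = client.chat.completions.create(
        model="mistralai/Mixtral-8x7B-Instruct-v0.1",  # assumed model id
        messages=[
            {"role": "system", "content": "You generate question-answer pairs for finetuning."},
            {"role": "user", "content": f"Write 3 question-answer pairs about:\n\n{chunk}"},
        ],
    )
    return response.choices[0].message.content
```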

Upgrade notes

Be sure to set TOGETHER_API_KEY in .env to an API key from together.ai; this is used instead of the OpenAI key.
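
For example, your .env might contain something like the following (the value is a placeholder for your own key):

```
TOGETHER_API_KEY=your-together-api-key
```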

Full Changelog: 0.3.8...0.4.0