Hello,
Thank you for providing these valuable recipes. I appreciate your work.
I'm interested in further pre-training the Llama3.1-8B-base model rather than using the instruct version. To ensure I prepare my data correctly, I'd like some clarification on the tokenization process:
Could you please provide information about how the data should be tokenized?
Specifically, I'm wondering whether the tokenized sequences should include:

- both the BOS and EOS tokens, or
- only one of these special tokens (if so, which one?)
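To make the two options concrete, here is a minimal sketch of what I mean, assuming the Hugging Face tokenizer (the repo id below is illustrative, and access to the official checkpoint is gated):

```python
from transformers import AutoTokenizer

# Illustrative repo id; access to the official checkpoint is gated.
tok = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B")

doc = "An example pretraining document."

# Option 1: wrap every document in both special tokens.
ids_both = (
    [tok.bos_token_id]
    + tok.encode(doc, add_special_tokens=False)
    + [tok.eos_token_id]
)

# Option 2: only one special token, e.g. EOS as a document separator.
ids_eos_only = tok.encode(doc, add_special_tokens=False) + [tok.eos_token_id]
```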
Thank you in advance for your assistance.
@init27
Thank you for your response. I've reviewed the information provided about the special tokens:
- `<|begin_of_text|>`: Specifies the start of the prompt
- `<|end_of_text|>`: Indicates the model should cease generating more tokens (generated only by base models)
I understand that the EOS token is used during pretraining of the base model. However, I'm unclear about the BOS token's usage in the pretraining phase. Since it's defined as "the start of the prompt," I'm wondering whether the BOS token is used during pretraining, or whether it's primarily for fine-tuning and inference.
So in which format should I prepare my pretraining data: with both the BOS and EOS tokens, or with only the EOS token?
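For what it's worth, a quick check of the tokenizer defaults (same illustrative repo id as above; this only reflects the tokenizer's out-of-the-box behavior, not necessarily how the base model was actually pretrained) suggests that BOS is prepended automatically while EOS is not appended:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B")  # illustrative repo id

toks = tok.convert_ids_to_tokens(tok.encode("hello world"))
print(toks[0])   # '<|begin_of_text|>' -- BOS is prepended by default
print(toks[-1])  # not '<|end_of_text|>' -- EOS is NOT appended by default,
                 # so an explicit EOS would need to be added when packing documents
```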