mT5-Small is taking a large amount of RAM while preprocessing. #43

Open
Mohd-Ali-Ansari opened this issue Dec 28, 2020 · 0 comments
Open

Comments

@Mohd-Ali-Ansari
Copy link

I am using mt5-small for a machine translation task with PyTorch and Transformers. I have approximately 3 million parallel sentence pairs and am fine-tuning on a machine with 96 GB of RAM and one P-100 GPU, but fine-tuning fails because RAM is fully exhausted before training even starts.
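Not part of the original report, but one common way to avoid exhausting RAM at this stage is to tokenize lazily inside the Dataset rather than pre-tokenizing all ~3 million pairs into padded tensors up front. Below is a minimal sketch under that assumption; the file names `src.txt`/`tgt.txt` and the `LazyTranslationDataset` class are hypothetical placeholders, not code from this issue.

```python
# Hypothetical sketch: keep only raw strings in memory and tokenize
# one example at a time in __getitem__, inside the DataLoader workers.
from torch.utils.data import Dataset, DataLoader
from transformers import AutoTokenizer

class LazyTranslationDataset(Dataset):
    def __init__(self, src_lines, tgt_lines, tokenizer, max_len=128):
        # src_lines / tgt_lines are plain lists of strings; raw text is
        # far smaller than 3M pre-padded token tensors.
        self.src_lines = src_lines
        self.tgt_lines = tgt_lines
        self.tokenizer = tokenizer
        self.max_len = max_len

    def __len__(self):
        return len(self.src_lines)

    def __getitem__(self, idx):
        # Tokenization happens here, per example, not in a bulk
        # preprocessing pass over the whole corpus.
        src = self.tokenizer(self.src_lines[idx], max_length=self.max_len,
                             truncation=True, padding="max_length",
                             return_tensors="pt")
        tgt = self.tokenizer(self.tgt_lines[idx], max_length=self.max_len,
                             truncation=True, padding="max_length",
                             return_tensors="pt")
        return {
            "input_ids": src["input_ids"].squeeze(0),
            "attention_mask": src["attention_mask"].squeeze(0),
            "labels": tgt["input_ids"].squeeze(0),
        }

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")

# Placeholder paths for the line-aligned parallel corpus.
with open("src.txt") as f:
    src_lines = [line.strip() for line in f]
with open("tgt.txt") as f:
    tgt_lines = [line.strip() for line in f]

loader = DataLoader(LazyTranslationDataset(src_lines, tgt_lines, tokenizer),
                    batch_size=8, num_workers=2)
```

With this setup, peak memory during preprocessing is roughly the size of the raw text plus one batch of tensors, instead of the fully tokenized and padded corpus.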
