This repository has been archived by the owner on Oct 31, 2023. It is now read-only.
I ran into a problem while applying SwAV to a small dataset. My maximum batch size is only 16, but I have 150 classes. Following your suggestion, I set nmb_prototypes to roughly one order of magnitude more than the number of classes, i.e., 1500. During the first 15 epochs, before the queue is used, each batch contains only 16 samples, which is nowhere near enough to spread evenly over 1500 prototypes. Empty prototypes therefore seem inevitable; is that normal? Should I ignore the empty prototypes and continue training?
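To make the mismatch concrete, here is a rough NumPy sketch of Sinkhorn-Knopp-style equipartition assignment (the general idea behind SwAV's cluster assignment, not the repo's exact implementation). With a 16-sample batch and 1500 prototypes, a hard argmax assignment can touch at most 16 prototypes per batch, so most prototypes are necessarily empty in any single step. The function name `sinkhorn` and all numbers are illustrative assumptions.

```python
import numpy as np

def sinkhorn(scores, n_iters=3, eps=0.05):
    # Illustrative Sinkhorn-Knopp: alternately normalize columns
    # (prototypes) and rows (samples) of exp(scores / eps) so that
    # mass is spread as evenly as possible across prototypes.
    Q = np.exp(scores / eps)
    B, K = Q.shape  # batch size, number of prototypes
    for _ in range(n_iters):
        Q /= Q.sum(axis=0, keepdims=True)  # equal mass per prototype
        Q /= K
        Q /= Q.sum(axis=1, keepdims=True)  # equal mass per sample
        Q /= B
    return Q * B  # each row sums to 1

rng = np.random.default_rng(0)
scores = rng.standard_normal((16, 1500))  # batch of 16 vs. 1500 prototypes
Q = sinkhorn(scores)

# Hard assignment: each of the 16 samples picks one prototype,
# so at most 16 of the 1500 prototypes are non-empty this batch.
n_used = len(np.unique(Q.argmax(axis=1)))
print(n_used)  # at most 16
```

Even though the soft assignment `Q` puts some mass on every prototype, the effective (hard) assignment per batch is bounded by the batch size, which is why the queue (or a larger effective batch) matters so much when nmb_prototypes is large.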
@mathildecaron31 Sorry to bother you, but have you encountered this problem before? When the batch size is much smaller than nmb_prototypes, how should empty prototypes be handled?