
A question about batch_size and nmb prototypes #118

Open
mrFocusXin opened this issue Jul 17, 2023 · 1 comment

Comments

@mrFocusXin

I've run into a problem training SwAV on a small dataset. My maximum batch size is only 16, but the dataset has 150 classes. Following your suggestion that nmb_prototypes be roughly an order of magnitude larger than the number of classes, I set it to 1500. With the queue disabled for the first 15 epochs, a batch of only 16 samples is nowhere near enough to spread evenly across 1,500 prototypes, so empty prototypes seem inevitable. Is that normal? Should I just ignore the empty prototypes and keep training?
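To make the mismatch concrete, here is a minimal single-GPU sketch of the Sinkhorn-Knopp code assignment. It is not the repo's exact `distributed_sinkhorn` (the distributed all-reduce is omitted, and the epsilon value and iteration count are illustrative assumptions), but it shows that with a batch of 16 and 1,500 prototypes, at most 16 prototypes can receive a hard assignment in any single step:

```python
import torch

torch.manual_seed(0)

batch_size, feat_dim, num_prototypes = 16, 128, 1500
epsilon, sinkhorn_iters = 0.05, 3  # illustrative values, not my actual settings

# Random L2-normalised features and prototypes stand in for a real forward pass.
z = torch.nn.functional.normalize(torch.randn(batch_size, feat_dim), dim=1)
prototypes = torch.nn.functional.normalize(torch.randn(num_prototypes, feat_dim), dim=1)
scores = z @ prototypes.t()                      # (16, 1500) similarity scores

# Simplified single-GPU Sinkhorn-Knopp, mirroring the structure of the repo's
# distributed_sinkhorn but without the dist.all_reduce calls.
Q = torch.exp(scores / epsilon).t()              # (1500, 16)
Q /= Q.sum()
K, B = Q.shape
for _ in range(sinkhorn_iters):
    Q /= Q.sum(dim=1, keepdim=True)              # normalise rows (prototypes)
    Q /= K
    Q /= Q.sum(dim=0, keepdim=True)              # normalise columns (samples)
    Q /= B
Q *= B                                           # each column sums to 1

# Hard assignments: at most 16 of the 1,500 prototypes can be hit this batch.
assignments = Q.t().argmax(dim=1)
used = assignments.unique().numel()
print(f"prototypes hit this batch: {used} / {num_prototypes}")
```

With B = 16 and K = 1500, at most 16 prototypes can be used per batch no matter what, so per-batch empty prototypes are unavoidable; my question is whether that is harmful over a whole epoch.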

@mrFocusXin
Author

@mathildecaron31 Sorry to bother you, but have you ever encountered this problem? For example, when the batch size is much smaller than nmb_prototypes, how should empty prototypes be handled?
