Conversation
@wouterzwerink wouterzwerink commented Jul 10, 2023

Supports distributed training with the QUASI_RANDOM traversal order.
It relies on the same code as the non-distributed case to generate the order, then divides it into num_ranks contiguous chunks. The last two ranks may have some overlap to ensure that all ranks receive the same amount of data.
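The chunking scheme described above can be sketched as follows. This is a minimal illustration, not the PR's actual implementation: `split_order` is a hypothetical helper, and the overlap arises from anchoring ceil-sized chunks so the last one ends exactly at the end of the order.

```python
import math

def split_order(order, num_ranks):
    # Divide a traversal order into num_ranks contiguous, equal-sized chunks.
    # chunk_size is rounded up, and each chunk's start is clamped so it never
    # runs past the end of the order; as a result, the last chunk can overlap
    # the previous one, keeping every rank's data the same size.
    chunk_size = math.ceil(len(order) / num_ranks)
    chunks = []
    for rank in range(num_ranks):
        start = min(rank * chunk_size, len(order) - chunk_size)
        chunks.append(order[start:start + chunk_size])
    return chunks

chunks = split_order(list(range(10)), 3)
# chunk_size = 4; ranks get order[0:4], order[4:8], and order[6:10],
# so the last two ranks share elements 6 and 7.
```

With 10 samples over 3 ranks, each rank gets 4 samples and only the final two chunks overlap, matching the description above.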

@andrewilyas (Contributor) commented

Thanks so much for this PR, and sorry for the late reply! Did you try training anything with this sampler, and were the results okay? If so, I'll merge it into v1.1.0.
