This repository was archived by the owner on Aug 1, 2024. It is now read-only.

Support for GPUs that can't use bfloat16 and non-distributed training #28

Open

JulesCollenne wants to merge 1 commit into facebookresearch:main from JulesCollenne:main

Conversation

@JulesCollenne

Parts of the code still used bfloat16 even when `use_bfloat` was False. Added a condition that checks, based on the world size, whether distributed training is possible.
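A minimal sketch of the two fixes described above, assuming PyTorch. The helper names (`get_autocast_dtype`, `maybe_init_distributed`) are hypothetical, not the PR's actual function names: the first falls back from bfloat16 when the flag is off or the GPU lacks support, and the second only initializes the process group when the world size exceeds one.

```python
import torch
import torch.distributed as dist


def get_autocast_dtype(use_bfloat16: bool) -> torch.dtype:
    # Hypothetical helper: use bfloat16 only when requested AND the
    # current GPU actually supports it; otherwise fall back to float16.
    if (
        use_bfloat16
        and torch.cuda.is_available()
        and torch.cuda.is_bf16_supported()
    ):
        return torch.bfloat16
    return torch.float16


def maybe_init_distributed(world_size: int) -> bool:
    # Hypothetical helper: skip distributed setup entirely when there is
    # only one process, so single-GPU / CPU runs work without torchrun.
    if world_size > 1:
        dist.init_process_group(backend="nccl")
        return True
    return False
```

On a single-process run, `maybe_init_distributed(1)` returns False without touching the process group, and `get_autocast_dtype(False)` yields `torch.float16` regardless of hardware.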

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Jun 22, 2023

Labels

CLA Signed


2 participants