
Training using DDP and SLURM #5566

@rave78


❓ Questions and Help

What is your question?

The current scenario is two nodes with different numbers of free GPUs. For instance, node1 has 5 free GPUs and node2 has 3 free GPUs. I can request all 8 free GPUs through SLURM without caring about the number of nodes. Is there any way to use PL to train on the 8 available GPUs in this context? I read the documentation, and it looks like one constraint is that every node must have the same number of free GPUs.
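For reference, the homogeneous setup the documentation describes looks roughly like the sketch below. This is a minimal example assuming two nodes with 4 GPUs each; the `gpus`, `num_nodes`, and `accelerator="ddp"` arguments are from the Lightning Trainer API of this era, while `MyModel` and the SLURM directives in the comments are placeholders:

```python
# Minimal sketch of the documented homogeneous setup: 2 nodes x 4 GPUs.
# Assumed to be submitted with an sbatch script along the lines of:
#   #SBATCH --nodes=2
#   #SBATCH --ntasks-per-node=4   # one task per GPU
#   #SBATCH --gres=gpu:4
#   srun python train.py
import pytorch_lightning as pl

model = MyModel()  # placeholder LightningModule

# Lightning reads the SLURM_* environment variables to wire up DDP;
# gpus * num_nodes must match the total task count, which is why the
# documented setup needs the same number of GPUs on every node.
trainer = pl.Trainer(
    gpus=4,             # GPUs per node -- uniform across nodes
    num_nodes=2,
    accelerator="ddp",
)
trainer.fit(model)
```

With 5 GPUs on one node and 3 on the other, there is no single per-node GPU count that satisfies this layout, which is what the question is asking about.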
