
Commit 886f8cb

Remove is_first_microbatch setting after warmup

1 parent: b345941
1 file changed: 0 additions, 5 deletions

transformer_engine/pytorch/graph.py (0 additions, 5 deletions)
```diff
@@ -507,11 +507,6 @@ def hook_fn(
         else:
             grad_inputs = None
         del outputs, grad_inputs
-        # The following code is added specifically for MCore's special requirements,
-        # aimed at preventing warmup from altering the control flow.
-        for module in func.modules():
-            if hasattr(module, "is_first_microbatch"):
-                module.is_first_microbatch = True
         torch.cuda.synchronize()

     # All captures here share a mempool. To avoid replays corrupting each other's memory,
```
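The deleted loop restored `is_first_microbatch = True` on every submodule after the CUDA-graph warmup passes, so that warmup would not "consume" the first-microbatch code path before the first real iteration. The sketch below is a toy model of that behavior, not the real Transformer Engine code: `ToyModule` and `warmup` are hypothetical stand-ins for TE modules and the warmup loop in `graph.py`, with `reset_flag=True` mimicking the removed code.

```python
class ToyModule:
    """Toy layer that does extra one-time work when is_first_microbatch is True,
    loosely imitating modules that cache state on the first microbatch."""

    def __init__(self):
        self.is_first_microbatch = True
        self.first_microbatch_passes = 0

    def forward(self):
        if self.is_first_microbatch:
            # Pretend this builds an expensive cache, done only once.
            self.first_microbatch_passes += 1
            self.is_first_microbatch = False


def warmup(modules, iters=3, reset_flag=False):
    """Run warmup forward passes; optionally restore the flag afterwards,
    mimicking the loop this commit removed."""
    for _ in range(iters):
        for m in modules:
            m.forward()
    if reset_flag:
        # The removed code: undo warmup's effect on module control flow.
        for m in modules:
            if hasattr(m, "is_first_microbatch"):
                m.is_first_microbatch = True


# Before this commit: warmup resets the flag, so the first real microbatch
# still takes the first-microbatch path (two first-microbatch passes total).
before = ToyModule()
warmup([before], reset_flag=True)
before.forward()

# After this commit: warmup consumes the flag, so callers (e.g. MCore) must
# manage is_first_microbatch themselves if they depend on it.
after = ToyModule()
warmup([after], reset_flag=False)
after.forward()

print(before.first_microbatch_passes, after.first_microbatch_passes)  # 2 1
```

With the reset in place, warmup was invisible to flag-dependent control flow; removing it hands responsibility for the flag back to the caller.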
