Training Instability with IQL: Loss Explodes After Initial Epochs #175

@shhmxy2

Description

Hi, thanks for the great work! I'm experiencing training instability when running IQL on my custom dataset (~8000 samples). The loss decreases normally for the first 2-3 epochs but then suddenly increases and diverges. I've tried reducing the learning rate from 3e-4 to 3e-5, but that only delays the issue. Has anyone else encountered similar instability? Any suggestions for hyperparameter tuning or potential implementation issues to check? Thanks!
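For context on the hyperparameters involved: IQL's value function is trained with an asymmetric (expectile) L2 loss, and the expectile parameter `tau` is one knob worth checking alongside the learning rate — with a small dataset, a high expectile can push the value estimate toward the upper tail of noisy Q-targets, which may contribute to divergence. A minimal sketch of the expectile loss (assuming the common default `tau=0.7`; the exact names and defaults in this repo may differ):

```python
import numpy as np

def expectile_loss(diff, tau=0.7):
    """Asymmetric L2 loss used for IQL's value update.

    diff: Q(s, a) - V(s) residuals.
    Positive residuals are weighted by tau, negative ones by (1 - tau),
    so tau > 0.5 biases V(s) toward the upper expectile of Q-targets.
    """
    weight = np.where(diff > 0, tau, 1.0 - tau)
    return weight * diff ** 2

# A residual of +1 is penalized more heavily than -1 when tau = 0.7:
print(expectile_loss(np.array([1.0, -1.0]), tau=0.7))  # [0.7 0.3]
```

Lowering `tau` (e.g. toward 0.5) makes the loss symmetric and more conservative; other common stabilizers to check are gradient-norm clipping and the soft-update rate of the target Q network. These are general IQL tuning suggestions, not confirmed causes of the issue reported above.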
