
How to reproduce DailyMAE & evaluate properly #1


Description

@ultranity

Glad to come across a codebase with customized ffcv support, given that ffcv has not been updated for about 10 months :) A few questions:

  • Is the following the correct command to reproduce the DailyMAE ESSL result? (A sketch of how the gin flags are typically parsed follows after this list.)

torchrun --nproc_per_node 8 main_pretrain.py --data_path=${train_path} --data_set=ffcv \
    --epochs 800 --warmup_epochs 40 --blr 1.5e-4 --weight_decay 0.05 --batch_size 512 \
    --cfgs configs/mae_ffcv.gin configs/dres.gin --gin build_model.model_fn=@base/MaskedAutoencoderViT build_dataset.transform_fn=@SimplePipeline --ckpt_freq=100 --output_dir outputs/IN1K_base_ffcv_dres

  • How should experiment results be evaluated with this codebase? small-dataset.md mentions a finetuning.py script and a vitrun command, but both appear to be missing.
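
For context, the sketch below shows how gin config files and command-line bindings are typically combined using gin-config's public API. apply_gin_config is a hypothetical helper name; the actual wiring inside main_pretrain.py may differ.

import gin

def apply_gin_config(cfgs, overrides):
    # cfgs:      .gin files passed via --cfgs, e.g.
    #            ["configs/mae_ffcv.gin", "configs/dres.gin"]
    # overrides: binding strings passed via --gin, e.g.
    #            ["build_model.model_fn=@base/MaskedAutoencoderViT",
    #             "build_dataset.transform_fn=@SimplePipeline"]
    # Bindings are applied after the files, so --gin overrides can adjust
    # whatever the config files set.
    gin.parse_config_files_and_bindings(cfgs, overrides)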

Other issues:

  • Why not merge main_pretrain_ema.py into main_pretrain.py?

Error report:

  • submitit_pretrain.py:L59 is missing an aug_parse function.
  • mae_ffcv.gin fails with No configurable matching 'SimplePipeline'; there seems to be a missing import in dataset/build_dataset.py (see the sketch after this list).
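
For reference, that gin error usually means the function behind @SimplePipeline was never registered, because the module defining it is not imported before the .gin files are parsed. The snippet below is a self-contained sketch of the mechanism only; the SimplePipeline body and its img_size parameter are placeholders, not the repo's actual implementation.

import gin

@gin.configurable
def SimplePipeline(img_size=224):
    # Placeholder for the real ffcv transform pipeline; only the registration matters here.
    return f"pipeline(img_size={img_size})"

@gin.configurable
def build_dataset(transform_fn=gin.REQUIRED):
    return transform_fn()

# Parsing succeeds only because SimplePipeline was imported (and thereby registered) above.
# If dataset/build_dataset.py never imports the module defining SimplePipeline,
# gin fails with: No configurable matching 'SimplePipeline'.
gin.parse_config("build_dataset.transform_fn = @SimplePipeline")
print(build_dataset())  # -> pipeline(img_size=224)

If that matches the repo layout, adding the missing import in dataset/build_dataset.py (or registering SimplePipeline explicitly via gin.external_configurable) should make the binding resolve.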
