Thanks for your contribution!
I want to train a new model on another dataset, but I don't know how much GPU memory and time it will cost.
I tried to reproduce the training step on DIV2K with the command `python train.py --config configs/train_edsr-sronet.yaml --gpu 0,1,2`, and I found that it would take about 70 hours on three NVIDIA RTX 4090s. However, your work mentions that "All tests were conducted using a single NVIDIA RTX 3090." Is it normal that training an edsr-baseline-SRNO model on DIV2K takes about 70 hours on three 4090s?
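For context, this is a minimal sketch of the kind of extrapolation behind that 70 h figure (the names `run_one_epoch` and `total_epochs` are placeholders, not identifiers from train.py or the config):

```python
import time

def estimate_total_hours(run_one_epoch, total_epochs):
    """Time a single training epoch and extrapolate to the full run."""
    start = time.time()
    run_one_epoch()                      # one full pass over the training set
    epoch_seconds = time.time() - start
    return epoch_seconds * total_epochs / 3600.0

# Hypothetical usage: if one epoch takes ~250 s and the config trains for
# ~1000 epochs, the estimate comes out around 70 h, which matches what I see.
# estimate_total_hours(lambda: train_one_epoch(model, loader), total_epochs)
```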
Thanks!