This repo is for all our experiments related to the CIL project at ETH. It's structured to make it easy to plug in new models, configs, datasets, etc.
Group name: Whizzes
Participants: Vladislav Lomtev, Lucija Tonkovic, Jing Yan, Davit Melikidze
The depth maps generated by the SOTA models and used in some of the experiments can be found at the following link (if one does not wish to generate them from scratch; see the last section): https://drive.google.com/drive/folders/1m9JJAApyQgykQWFnr9w2b-nBkFtFhPMi?usp=drive_link
The uncertainty maps are provided at the following link (again, instructions on how to generate them are in the last section): https://polybox.ethz.ch/index.php/s/YdgKkoiwaKrAGn9
Model checkpoints are saved and available at the following link: https://drive.google.com/drive/folders/126Wf5q8NU6Y8bsLAfVvv6AScB_osnbwL?usp=sharing
Each experiment described in the project report has its own branch, named after the respective model. Each branch contains the following folders:
Contains a file with the training setup:
- Hyperparameters (learning rate, batch size, etc.)
- Augmentations (if they were tested on that model in some runs)
- Model/optimizer/loss initialization
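A training setup of this kind might look like the sketch below. The specific names, values, and the `build_optimizer` helper are illustrative assumptions, not the exact contents of any branch's config file.

```python
# Hypothetical example of a training config file; actual hyperparameters
# and names vary per branch/model.
import torch

CONFIG = {
    "lr": 1e-4,                   # learning rate
    "batch_size": 8,
    "num_epochs": 50,
    "use_augmentations": False,   # some runs enable augmentations
    "device": "cuda:0" if torch.cuda.is_available() else "cpu",
}

def build_optimizer(model):
    """Optimizer initialization driven by the config."""
    return torch.optim.AdamW(model.parameters(), lr=CONFIG["lr"])
```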
A PyTorch Dataset class that defines (train_x, label_y) pairs for most models; for the models using the fusion module, it additionally loads the depth maps and uncertainty maps.
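A minimal sketch of such a Dataset is shown below. The class name, tensor shapes, and the convention of stacking the extra maps as additional channels are assumptions for illustration, not the repo's exact API.

```python
# Sketch of a (train_x, label_y) Dataset that can optionally load
# precomputed depth/uncertainty maps for the fusion models.
import torch
from torch.utils.data import Dataset

class DepthDataset(Dataset):
    def __init__(self, images, labels, depth_maps=None, uncertainty_maps=None):
        self.images = images              # RGB tensors, each (3, H, W)
        self.labels = labels              # ground-truth labels, each (1, H, W)
        self.depth_maps = depth_maps      # optional SOTA depth predictions
        self.uncertainty_maps = uncertainty_maps

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        x, y = self.images[idx], self.labels[idx]
        if self.depth_maps is not None:
            # Fusion models receive the extra maps as additional channels.
            extras = [self.depth_maps[idx]]
            if self.uncertainty_maps is not None:
                extras.append(self.uncertainty_maps[idx])
            x = torch.cat([x, *extras], dim=0)
        return x, y
```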
Model architectures:
- Both custom and baseline versions
- Naming convention:
- <model_name>.py → main architecture
- <model_name>_utils.py → extra blocks if needed
Includes train_utils.py with the train/validation/test loops used by the notebooks to run the experiments.
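The loops in train_utils.py follow the usual PyTorch pattern; the sketch below illustrates their general shape (function names and signatures here are assumptions, not the file's actual interface).

```python
# Illustrative train/validation loops in the style of train_utils.py.
import torch

def train_one_epoch(model, loader, optimizer, loss_fn, device="cpu"):
    model.train()
    total = 0.0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        total += loss.item() * x.size(0)
    return total / len(loader.dataset)   # mean loss over the epoch

@torch.no_grad()
def evaluate(model, loader, loss_fn, device="cpu"):
    model.eval()
    total = 0.0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        total += loss_fn(model(x), y).item() * x.size(0)
    return total / len(loader.dataset)
```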
Contains the notebooks where the experiments were run:
- Import configs, split data, train/evaluate models
- Make sure to set the correct data paths (where the train and test data are stored and, for some models, the precomputed depth maps and uncertainty maps) and the preferred GPU in the configs or notebooks.
- Each branch has a requirements.txt file. Simply run pip install -r requirements.txt before running anything.
- The notebook in every respective branch has the necessary code to load the needed classes and functions to run experiments/training.
This branch contains the code for the experiments on creating a new feature map from the depth maps produced by the SOTA models mentioned in the report, as well as the code used to generate the uncertainty maps. It adds a new PyTorch Dataset class (CombDepthDataset) that can also load these additional maps. To run experiments with them, the depth maps and uncertainty maps must first be generated or downloaded; the links to all the maps are provided at the very top of this README.
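The core fusion idea can be sketched as a small module that stacks the extra maps with the RGB input and mixes them into one feature map. The module name, channel counts, and 1x1-conv design below are illustrative assumptions, not the exact architecture from the report.

```python
# Rough sketch of fusing precomputed depth/uncertainty maps with RGB input
# into a single feature map for downstream blocks.
import torch
import torch.nn as nn

class FusionModule(nn.Module):
    def __init__(self, rgb_channels=3, extra_channels=2, out_channels=16):
        super().__init__()
        # A 1x1 conv mixes RGB with the stacked depth/uncertainty channels.
        self.fuse = nn.Conv2d(rgb_channels + extra_channels,
                              out_channels, kernel_size=1)

    def forward(self, rgb, depth, uncertainty):
        x = torch.cat([rgb, depth, uncertainty], dim=1)  # (B, 5, H, W)
        return self.fuse(x)
```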
However, if one wishes to generate them from scratch, the depth maps can be obtained by cloning the repository of each SOTA model, installing its requirements, and then copying and running the corresponding script from the run_sota_models folder in the fusion_branch. Remember to update the file paths in the scripts: both where the RGB images are located and where the depth maps should be saved. To generate the uncertainty maps, the code is provided in the gen_uncertainty_map folder (also in the fusion_branch), together with a readme file describing the process.
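The scripts in run_sota_models follow the pattern sketched below. The `predict_depth` callable stands in for each SOTA repo's own inference API (which differs between ZoeDepth, Depth Anything V2, etc.), and the RGB/output paths are placeholders to edit.

```python
# Skeleton of a depth-map generation script in the style of run_sota_models/.
# predict_depth is a placeholder for the SOTA model's inference call.
import os
import numpy as np

RGB_DIR = "path/to/rgb_images"   # where the RGB images are located
OUT_DIR = "path/to/depth_maps"   # where to save the generated maps

def generate_depth_maps(predict_depth, rgb_dir=RGB_DIR, out_dir=OUT_DIR):
    os.makedirs(out_dir, exist_ok=True)
    for name in sorted(os.listdir(rgb_dir)):
        depth = predict_depth(os.path.join(rgb_dir, name))  # (H, W) array
        stem = os.path.splitext(name)[0]
        np.save(os.path.join(out_dir, stem + ".npy"), depth)
```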
The respective links to the repositories of all the SOTA models are given below:
- ZoeDepth: https://github.com/isl-org/ZoeDepth
- DepthAnything V2: https://github.com/DepthAnything/Depth-Anything-V2
- UniDepth: https://github.com/lpiccinelli-eth/UniDepth
- DistillAnyDepth: https://github.com/Westlake-AGI-Lab/Distill-Any-Depth