
Incorporated-NAS: An Efficient Zero-Shot Proxy for Lightweight NAS

This repository contains the implementation of Incorporated-NAS. Our paper is published in Electronics.

Comparison with Other Zero-Shot NAS Proxies on CIFAR-10/100

We use the ResNet-like search space and search for models within a parameter budget of 1M. All models are found by the same evolutionary strategy and trained on CIFAR-10/100 for 1440 epochs with auto-augmentation, cosine learning-rate decay, and weight decay 5e-4. We report the top-1 accuracies in the following table:

Proxy                 CIFAR-10   CIFAR-100
Incorporated-NAS-l    96.66%     80.67%
Incorporated-NAS-s    96.86%     81.1%
Zen-NAS               96.2%      80.1%
FLOPs                 93.1%      64.7%
grad-norm             92.8%      65.4%
synflow               95.1%      75.9%
TE-NAS                96.1%      77.2%
NASWOT                96.0%      77.5%
Random                93.5%      71.1%

Please check our paper for more details.
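The cosine learning-rate decay used in the training recipe above can be sketched as follows. The 1440-epoch horizon matches the schedule stated above; `lr_max` and `lr_min` are illustrative placeholders, not values taken from the paper or this repository.

```python
import math

def cosine_lr(epoch, total_epochs=1440, lr_max=0.1, lr_min=0.0):
    """Cosine learning-rate decay: starts at lr_max, ends at lr_min.

    total_epochs matches the 1440-epoch CIFAR schedule above;
    lr_max/lr_min are illustrative assumptions, not the paper's values.
    """
    t = epoch / total_epochs
    return lr_min + 0.5 * (lr_max - lr_min) * (1.0 + math.cos(math.pi * t))

# The rate decays smoothly from lr_max at epoch 0 to lr_min at the end.
schedule = [cosine_lr(e) for e in range(0, 1441, 360)]
```

In PyTorch the same schedule is typically obtained with `torch.optim.lr_scheduler.CosineAnnealingLR`.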

Reproduce Paper Experiments

System Requirements

  • PyTorch >= 1.5, Python >= 3.7
  • By default, the ImageNet dataset is expected under ~/data/imagenet; CIFAR-10 and CIFAR-100 are expected under ~/data/pytorch_cifar10 and ~/data/pytorch_cifar100, respectively

Searching on CIFAR-10/100

Search for CIFAR-10/100 models with a parameter budget < 1M, using different zero-shot proxies:

scripts/Combine_NAS_cifar_params1M.sh
scripts/Flops_NAS_cifar_params1M.sh
scripts/GradNorm_NAS_cifar_params1M.sh
scripts/NASWOT_NAS_cifar_params1M.sh
scripts/Params_NAS_cifar_params1M.sh
scripts/Random_NAS_cifar_params1M.sh
scripts/Syncflow_NAS_cifar_params1M.sh
scripts/TE_NAS_cifar_params1M.sh
scripts/Zen_NAS_cifar_params1M.sh
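The scripts above all drive the same evolutionary strategy; only the zero-shot proxy used to rank candidates changes. The loop below is a minimal, self-contained sketch of that idea, where `proxy_score`, `param_count`, and `mutate` are toy stand-ins (illustrative assumptions, not this repository's code) for the real proxy, parameter counter, and architecture mutation:

```python
import random

def proxy_score(arch):
    # Toy stand-in for a zero-shot proxy: real proxies score an
    # untrained network (e.g. Zen-score, grad-norm, NASWOT).
    return -sum((w - 8) ** 2 for w in arch)

def param_count(arch):
    # Toy parameter count; the real search enforces params < 1M.
    return sum(w * w for w in arch) * 1000

def mutate(arch):
    # Randomly perturb one "layer width" of the architecture.
    arch = list(arch)
    i = random.randrange(len(arch))
    arch[i] = max(1, arch[i] + random.choice([-1, 1]))
    return arch

def evolutionary_search(steps=200, pop_size=16, budget=1_000_000, seed=0):
    random.seed(seed)
    pop = [[random.randint(1, 16) for _ in range(4)] for _ in range(pop_size)]
    pop = [a for a in pop if param_count(a) <= budget] or [[4, 4, 4, 4]]
    for _ in range(steps):
        # Tournament selection by proxy score, then mutate the winner.
        parent = max(random.sample(pop, min(4, len(pop))), key=proxy_score)
        child = mutate(parent)
        if param_count(child) <= budget:   # keep only in-budget candidates
            pop.append(child)
            pop.pop(0)                     # age out the oldest member
    return max(pop, key=proxy_score)

best = evolutionary_search()
```

Because the proxy is evaluated on untrained networks, the whole search runs without any training, which is what makes zero-shot NAS cheap.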

Searching on ImageNet

Search for ImageNet models with FLOPs budgets of 400M and 600M:

scripts/CombineNAS_ImageNet_flops400M.sh
scripts/CombineNAS_ImageNet_flops600M.sh
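The 400M/600M figures in the script names are FLOPs budgets for the searched models. As a rough illustration of how such budgets are checked, convolution FLOPs are commonly estimated as 2 · C_in · C_out · K² · H_out · W_out (counting a multiply-accumulate as two operations); the helper below is an illustrative sketch, not this repository's FLOPs counter.

```python
def conv2d_flops(c_in, c_out, kernel, h_out, w_out):
    """Standard conv FLOPs estimate: one multiply-accumulate = 2 FLOPs."""
    return 2 * c_in * c_out * kernel * kernel * h_out * w_out

# e.g. a 3x3 conv from 64 to 128 channels on a 56x56 output map
# already costs roughly 0.46 GFLOPs, so a 400M budget is tight.
flops = conv2d_flops(64, 128, 3, 56, 56)
```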

Pretrained Models

All our pre-trained EZenNet models are available here!

Open Source

A few files in this repository are modified from the following open-source implementations:

https://github.com/idstcv/ZenNAS
https://github.com/DeepVoltaire/AutoAugment/blob/master/autoaugment.py
https://github.com/VITA-Group/TENAS
https://github.com/SamsungLabs/zero-cost-nas
https://github.com/BayesWatch/nas-without-training
https://github.com/rwightman/gen-efficientnet-pytorch
https://pytorch.org/vision/0.8/_modules/torchvision/models/resnet.html

Citing

If you find this work useful, please cite the following paper:

@Article{electronics13163325,
AUTHOR = {Nguyen, Thi-Trang and Han, Ji-Hyeong},
TITLE = {Zero-Shot Proxy with Incorporated-Score for Lightweight Deep Neural Architecture Search},
JOURNAL = {Electronics},
VOLUME = {13},
YEAR = {2024},
NUMBER = {16},
ARTICLE-NUMBER = {3325},
URL = {https://www.mdpi.com/2079-9292/13/16/3325},
ISSN = {2079-9292},
DOI = {10.3390/electronics13163325}
}
