
📝 Fine-Tuning Convolutional Neural Networks (CNNs)


✅ Step-by-Step Checklist

1. Setup

  • Load a pretrained CNN backbone (e.g., ResNet50 pretrained on ImageNet).

  • Replace the classification head with a new fully connected layer:

    • nn.Linear(2048, num_classes) for ResNet50.
  • Freeze all layers except the classification head.


2. Data Preparation

  • Resize inputs to the resolution expected by the pretrained model (e.g., 224×224 for ImageNet-pretrained ResNet50).

  • Normalize images using pretrained stats (ImageNet mean & std).

  • Apply augmentation:

    • Basic: random crop, horizontal flip, small rotations.

3. Hyperparameters

  • Learning rate: e.g., 1e-3 when training only the new head; lower (1e-4 to 1e-5) if backbone layers are unfrozen.
  • Batch size: e.g., 32–64, as GPU memory allows.
  • Epochs: e.g., 10–30, with early stopping on validation loss.
  • Optimizer:
    • AdamW (fast convergence, but watch for overfitting).
  • Weight decay: 1e-4 (regularization).
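Wiring these hyperparameters together (a sketch; the `nn.Linear` stand-in represents the trainable head, and the learning rate is an illustrative head-only value):

```python
from torch import nn, optim

# Stand-in for the new classification head (the frozen backbone is omitted)
head = nn.Linear(2048, 10)

# AdamW with weight decay 1e-4 as above; lr=1e-3 is illustrative for head-only training
optimizer = optim.AdamW(head.parameters(), lr=1e-3, weight_decay=1e-4)
```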

4. Learning Rate Strategy

  • Warm-up: start small (1e-6) and ramp to base LR over 5–10 epochs.
  • Decay:
    • Step decay (×0.1 every 30 epochs).
  • Scheduler:
    • StepLR, CosineAnnealingLR, or OneCycleLR.
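One way to combine warm-up with cosine decay (a sketch assuming PyTorch ≥ 1.10 for `SequentialLR`; the epoch counts and base LR are illustrative, and `start_factor=1e-3` makes the warm-up begin at 1e-6 from a 1e-3 base):

```python
from torch import nn, optim

head = nn.Linear(2048, 10)  # stand-in for the trainable head
optimizer = optim.AdamW(head.parameters(), lr=1e-3)

warmup_epochs = 5
total_epochs = 50

# Linear warm-up from ~1e-6 to the base LR, then cosine decay to ~0
warmup = optim.lr_scheduler.LinearLR(
    optimizer, start_factor=1e-3, total_iters=warmup_epochs)
decay = optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=total_epochs - warmup_epochs)
scheduler = optim.lr_scheduler.SequentialLR(
    optimizer, schedulers=[warmup, decay], milestones=[warmup_epochs])

for epoch in range(total_epochs):
    # ... train one epoch, then advance the schedule ...
    scheduler.step()
```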

5. Overfitting Prevention

  • Use data augmentation to increase variability.
  • Add dropout (0.2–0.5) in classifier head.
  • Apply early stopping based on validation loss.
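The last two points can be sketched together: a dropout-regularized head plus a minimal early-stopping helper (the class, its `patience` default, and the 0.3 dropout rate are illustrative assumptions, not a fixed recipe):

```python
import torch.nn as nn

# Classifier head with dropout; p=0.3 is illustrative, tune within 0.2-0.5
head = nn.Sequential(nn.Dropout(p=0.3), nn.Linear(2048, 10))

class EarlyStopping:
    """Stop training when validation loss has not improved for `patience` epochs."""

    def __init__(self, patience: int = 5):
        self.patience = patience
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss: float) -> bool:
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience
```

In the training loop, call `stopper.step(val_loss)` once per epoch and break when it returns True.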