This project implements classical Neural Style Transfer (Gatys et al.), where a generated image is directly optimized to preserve the content of one image and the style of another using a fixed pretrained VGG network.
Unlike fast style transfer approaches, this method does not train a new model.
The output image itself is optimized via gradient descent.
- Uses VGG19 pretrained on ImageNet as a fixed feature extractor
- Content is preserved by matching high-level feature representations
- Style is captured via Gram matrices of feature maps
- The generated image is optimized to minimize a weighted sum of content loss and style loss
No model weights are trained or saved.
The final stylized image is the result of the optimization process.
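The pieces above can be sketched in a minimal, self-contained example. This is an illustration, not the code in `src/loss.py`: random tensors stand in for VGG19 activations, and the content/style weights (1.0 and 1e3) are assumptions rather than the project's actual settings.

```python
import torch
import torch.nn.functional as F

def gram_matrix(features):
    """Gram matrix of a (batch, channels, h, w) feature map,
    normalized by the feature map size."""
    b, c, h, w = features.shape
    flat = features.view(b, c, h * w)
    gram = flat @ flat.transpose(1, 2)  # (b, c, c) channel correlations
    return gram / (c * h * w)

def content_loss(gen_feats, content_feats):
    # MSE between feature maps preserves content structure
    return F.mse_loss(gen_feats, content_feats)

def style_loss(gen_feats, style_feats):
    # MSE between Gram matrices matches style statistics
    return F.mse_loss(gram_matrix(gen_feats), gram_matrix(style_feats))

# Toy feature maps stand in for fixed VGG19 activations.
torch.manual_seed(0)
content_feats = torch.randn(1, 64, 32, 32)
style_feats = torch.randn(1, 64, 32, 32)

# The image (here, its features) is the optimization variable — no model weights are trained.
gen = content_feats.clone().requires_grad_(True)
optimizer = torch.optim.Adam([gen], lr=0.05)

for _ in range(50):
    optimizer.zero_grad()
    loss = 1.0 * content_loss(gen, content_feats) + 1e3 * style_loss(gen, style_feats)
    loss.backward()
    optimizer.step()
```

In the real pipeline, gradients flow through the fixed VGG19 network back to the pixels of the generated image; only the image tensor is updated.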
image-style-transfer/
├── notebooks/
│   └── 01_exploration.ipynb
├── src/
│   ├── utils.py
│   ├── model.py
│   ├── loss.py
│   └── run_style_transfer.py
├── experiments/
│   └── exp_01/
│       ├── config.yaml
│       └── notes.md
├── outputs/
│   └── images/
├── scripts/
│   └── run_exp.sh
├── requirements.txt
└── README.md
- Python 3.9+
- PyTorch
Installing PyTorch may take several minutes.
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
python3 src/run_style_transfer.py config.yaml
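The contents of `config.yaml` are not shown in this README. A hypothetical config for an experiment like `exp_01` might look like the following — every key and value here is an illustrative assumption, not the project's actual schema:

```yaml
# Hypothetical config sketch — key names are assumptions, not the project's schema
content_image: data/content.jpg
style_image: data/style.jpg
output_dir: outputs/images
image_size: 512
content_weight: 1.0
style_weight: 1000.0
steps: 300
lr: 0.05
```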