From 3f1b0e5cbf6210958f0a9cae1ea3c37a951aa57d Mon Sep 17 00:00:00 2001
From: Cristian Cerasuolo
Date: Wed, 18 Feb 2026 16:26:07 +0100
Subject: [PATCH] Documentation about RaGAN and generator pretraining

---
 docs/architecture.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/architecture.md b/docs/architecture.md
index 3d4f7fa..046e8fb 100644
--- a/docs/architecture.md
+++ b/docs/architecture.md
@@ -53,7 +53,7 @@ The generator zoo lives under `opensr_srgan/model/generators/` and can be select
 * **Stochastic GAN generator (`cgan_generator.py`).** Extends the flexible generator with conditioning inputs and latent noise,
   enabling experiments where auxiliary metadata influences the super-resolution output.
 * **ESRGAN generator (`esrgan.py`).** Implements the RRDBNet trunk introduced with ESRGAN, exposing `n_blocks`, `growth_channels`,
-  and `res_scale` so you can dial in deeper receptive fields and sharper textures.
+  and `res_scale` so you can dial in deeper receptive fields and sharper textures. The implementation supports features from the original paper such as the Relativistic Average GAN (RaGAN) objective, and the codebase can run the two-stage training scheme proposed by the ESRGAN authors: content-oriented pretraining of the generator followed by adversarial training with the discriminator.
 * **Advanced variants (`SRGAN_advanced.py`).** Provides additional block implementations and compatibility aliases exposed in
   `__init__.py` for backwards compatibility.
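
As a note accompanying this patch: the RaGAN objective the new documentation line refers to can be sketched independently of the codebase. The snippet below is a minimal, framework-free illustration of the relativistic average losses (function names are ours, not from `opensr_srgan`); in RaGAN the discriminator estimates whether a real sample is more realistic than the *average* fake sample, rather than classifying each sample in isolation.

```python
import math

def _sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def ragan_d_loss(real_logits, fake_logits):
    """Relativistic average discriminator loss (illustrative sketch).

    Pushes real logits above the mean fake logit and fake logits
    below the mean real logit.
    """
    mean_real = sum(real_logits) / len(real_logits)
    mean_fake = sum(fake_logits) / len(fake_logits)
    loss_real = -sum(math.log(_sigmoid(r - mean_fake)) for r in real_logits) / len(real_logits)
    loss_fake = -sum(math.log(1.0 - _sigmoid(f - mean_real)) for f in fake_logits) / len(fake_logits)
    return loss_real + loss_fake

def ragan_g_loss(real_logits, fake_logits):
    """Generator loss: the mirror image of the discriminator objective,
    so gradients flow through both real and fake terms."""
    mean_real = sum(real_logits) / len(real_logits)
    mean_fake = sum(fake_logits) / len(fake_logits)
    loss_real = -sum(math.log(1.0 - _sigmoid(r - mean_fake)) for r in real_logits) / len(real_logits)
    loss_fake = -sum(math.log(_sigmoid(f - mean_real)) for f in fake_logits) / len(fake_logits)
    return loss_real + loss_fake
```

When real and fake logits are indistinguishable, each sigmoid term evaluates to 0.5 and both losses sit at `2 * ln 2`; a discriminator that correctly ranks real above fake drives its loss below that value.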