Neural network generating handwritten digits
Custom model trained with a custom data loader
- Fully convolutional, no Linear (fully connected) layers
- Generator with 6x ConvTranspose2d, BatchNorm, ReLU activations
- Discriminator with 6x Conv2d, BatchNorm, LeakyReLU activations with negative_slope=0.2
- Orthogonal weight init with 0.1 gain
- Zero bias init
- Trained on NVIDIA RTX 3060
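The architecture above can be sketched in PyTorch roughly as follows. Only the layer counts, activations, and init scheme come from the list above; the channel widths, kernel sizes, and strides are my assumptions for a 28x28 output:

```python
import torch
import torch.nn as nn

# Sketch of the generator: 6x ConvTranspose2d with BatchNorm + ReLU,
# mapping a 100-dim latent vector to a 1x28x28 image (sizes assumed).
generator = nn.Sequential(
    nn.ConvTranspose2d(100, 512, 4, 1, 0), nn.BatchNorm2d(512), nn.ReLU(),  # 1x1 -> 4x4
    nn.ConvTranspose2d(512, 256, 3, 2, 1), nn.BatchNorm2d(256), nn.ReLU(),  # 4x4 -> 7x7
    nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(),  # 7x7 -> 14x14
    nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(),    # 14x14 -> 28x28
    nn.ConvTranspose2d(64, 32, 3, 1, 1), nn.BatchNorm2d(32), nn.ReLU(),     # 28x28 -> 28x28
    nn.ConvTranspose2d(32, 1, 3, 1, 1), nn.Tanh(),                          # 28x28 output
)

# Sketch of the discriminator: 6x Conv2d with BatchNorm + LeakyReLU(0.2),
# ending in a single raw logit (no sigmoid -- it pairs with BCEWithLogitsLoss).
discriminator = nn.Sequential(
    nn.Conv2d(1, 32, 3, 1, 1), nn.BatchNorm2d(32), nn.LeakyReLU(0.2),     # 28x28
    nn.Conv2d(32, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.LeakyReLU(0.2),    # 14x14
    nn.Conv2d(64, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),  # 7x7
    nn.Conv2d(128, 256, 3, 2, 1), nn.BatchNorm2d(256), nn.LeakyReLU(0.2), # 4x4
    nn.Conv2d(256, 512, 4, 1, 0), nn.BatchNorm2d(512), nn.LeakyReLU(0.2), # 1x1
    nn.Conv2d(512, 1, 1, 1, 0),                                           # 1x1 logit
)

def init_weights(m):
    # Orthogonal weight init with gain 0.1 and zero bias, as listed above.
    if isinstance(m, (nn.Conv2d, nn.ConvTranspose2d)):
        nn.init.orthogonal_(m.weight, gain=0.1)
        if m.bias is not None:
            nn.init.zeros_(m.bias)

generator.apply(init_weights)
discriminator.apply(init_weights)
```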
Instead of the usual BCE loss, I used BCEWithLogitsLoss for better numerical stability.
Loss formula: ℓ(x, y) = -[y · log σ(x) + (1 − y) · log(1 − σ(x))], where x is the raw logit and σ is the sigmoid.
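As a minimal sketch (values are made up for illustration), BCEWithLogitsLoss takes raw discriminator logits and applies the sigmoid internally in a numerically stable way, instead of computing `log(sigmoid(x))` in two separate steps:

```python
import torch
import torch.nn as nn

criterion = nn.BCEWithLogitsLoss()

logits = torch.tensor([4.0, -2.0])   # raw discriminator outputs (no sigmoid)
targets = torch.tensor([1.0, 0.0])   # 1 = real, 0 = fake

loss = criterion(logits, targets)

# Mathematically the same as sigmoid + BCELoss, but stable for large |logits|:
manual = nn.BCELoss()(torch.sigmoid(logits), targets)
assert torch.allclose(loss, manual)
```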
From this figure we can see that 50 epochs are enough for these models; the generator does not improve further.
For better results you could try, for example, a Wasserstein loss, but I think this is enough for this application.
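If you do try a Wasserstein loss, the core change is small: the discriminator becomes a critic that outputs an unbounded score, and the BCE terms are replaced by differences of mean scores. This sketch shows only the loss functions; full WGAN training also needs weight clipping or a gradient penalty, which is omitted here:

```python
import torch

def critic_loss(real_scores: torch.Tensor, fake_scores: torch.Tensor) -> torch.Tensor:
    # The critic wants to maximize E[D(real)] - E[D(fake)],
    # so we minimize the negative of that quantity.
    return fake_scores.mean() - real_scores.mean()

def generator_loss(fake_scores: torch.Tensor) -> torch.Tensor:
    # The generator wants the critic to score its samples highly.
    return -fake_scores.mean()
```

For example, `critic_loss(torch.tensor([1.0, 3.0]), torch.tensor([0.0, 2.0]))` evaluates to -1.0: the critic is rewarded when real scores exceed fake scores.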
Handwritten digits generated entirely by the Generator network
Download the dataset here: Dataset
In run_training.py, change the path to the dataset.
Run run_training.py.