This BE aims to implement a Deep Convolutional Generative Adversarial Network (DC-GAN) and a Conditional Generative Adversarial Network (cGAN).
## Description
### Part 1: DC-GAN
In the first part of this BE, I implemented a DC-GAN to generate images of handwritten digits from the MNIST dataset.
The GAN is composed of a generator and a discriminator. The generator takes a random noise vector as input and outputs an image of a handwritten digit. The discriminator takes an image of a handwritten digit as input and outputs the probability that the image is real (drawn from the dataset) rather than fake (produced by the generator).
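The two networks can be sketched as follows. This is a minimal illustration in the style of the DCGAN architecture, not the exact notebook code; the layer widths (`nz`, `ngf`, `ndf`) and the 32x32 output size are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a noise vector (nz, 1, 1) to a 32x32 grayscale image."""
    def __init__(self, nz=100, ngf=64):
        super().__init__()
        self.main = nn.Sequential(
            nn.ConvTranspose2d(nz, ngf * 4, 4, 1, 0, bias=False),      # 1x1 -> 4x4
            nn.BatchNorm2d(ngf * 4), nn.ReLU(True),
            nn.ConvTranspose2d(ngf * 4, ngf * 2, 4, 2, 1, bias=False), # -> 8x8
            nn.BatchNorm2d(ngf * 2), nn.ReLU(True),
            nn.ConvTranspose2d(ngf * 2, ngf, 4, 2, 1, bias=False),     # -> 16x16
            nn.BatchNorm2d(ngf), nn.ReLU(True),
            nn.ConvTranspose2d(ngf, 1, 4, 2, 1, bias=False),           # -> 32x32
            nn.Tanh(),
        )

    def forward(self, z):
        return self.main(z)

class Discriminator(nn.Module):
    """Maps a 32x32 grayscale image to a real/fake probability."""
    def __init__(self, ndf=64):
        super().__init__()
        self.main = nn.Sequential(
            nn.Conv2d(1, ndf, 4, 2, 1, bias=False),                    # -> 16x16
            nn.LeakyReLU(0.2, True),
            nn.Conv2d(ndf, ndf * 2, 4, 2, 1, bias=False),              # -> 8x8
            nn.BatchNorm2d(ndf * 2), nn.LeakyReLU(0.2, True),
            nn.Conv2d(ndf * 2, ndf * 4, 4, 2, 1, bias=False),          # -> 4x4
            nn.BatchNorm2d(ndf * 4), nn.LeakyReLU(0.2, True),
            nn.Conv2d(ndf * 4, 1, 4, 1, 0, bias=False),                # -> 1x1
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.main(x).view(-1)

z = torch.randn(2, 100, 1, 1)
fake = Generator()(z)          # (2, 1, 32, 32)
score = Discriminator()(fake)  # (2,) probabilities in [0, 1]
```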
The generator and the discriminator are trained simultaneously: the generator tries to produce images realistic enough to fool the discriminator, while the discriminator tries to distinguish real images from generated ones.
This implementation follows the steps of the [DCGAN Tutorial](https://pytorch.org/tutorials/beginner/dcgan_faces_tutorial.html).
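The simultaneous training described above alternates two optimization steps per batch. Below is a hedged sketch of one such step, using tiny linear stand-ins for the real convolutional networks; the sizes, learning rate, and variable names are illustrative, not the notebook's actual values.

```python
import torch
import torch.nn as nn

# Tiny stand-ins for G and D, just to illustrate the alternating updates.
G = nn.Linear(16, 784)                           # noise -> flattened "image"
D = nn.Sequential(nn.Linear(784, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

real = torch.rand(8, 784)    # stands in for a batch of real images
noise = torch.randn(8, 16)

# 1) Discriminator step: push D(real) toward 1 and D(fake) toward 0.
#    .detach() stops gradients from flowing into G during this step.
opt_d.zero_grad()
loss_d = (bce(D(real), torch.ones(8, 1))
          + bce(D(G(noise).detach()), torch.zeros(8, 1)))
loss_d.backward()
opt_d.step()

# 2) Generator step: push D(G(z)) toward 1, i.e. try to fool D.
opt_g.zero_grad()
loss_g = bce(D(G(noise)), torch.ones(8, 1))
loss_g.backward()
opt_g.step()
```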
### Part 2: cGAN
In the second part of this BE, I completed the notebook to implement a Conditional Generative Adversarial Network (cGAN) that generates images of building facades from the CMP Facade Database, conditioned on a segmentation mask of the facade.
The cGAN is composed of a generator and a discriminator. The generator takes a facade mask as input and outputs an image of the facade. The discriminator takes both an image of the facade and the corresponding mask as input, and outputs the probability that the image is real and matches the mask.
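A common way for the discriminator to see the image and its mask together (used in pix2pix-style cGANs) is channel-wise concatenation; the sketch below only illustrates this conditioning trick, with assumed 256x256 RGB shapes.

```python
import torch

image = torch.randn(1, 3, 256, 256)     # facade photo (N, C, H, W)
mask = torch.randn(1, 3, 256, 256)      # facade label map, same spatial size
pair = torch.cat([image, mask], dim=1)  # (1, 6, 256, 256): fed to the discriminator
```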
The generator architecture is a U-Net, a convolutional neural network with a contracting path (encoder) and an expanding path (decoder) linked by skip connections. This lets it generate images of the same size as the input image (the facade mask).
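A minimal two-level sketch of this encoder/decoder structure is given below; the class name, channel counts, and depth are illustrative assumptions (a real pix2pix U-Net is much deeper), but the skip connection from encoder to decoder is the essential feature.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Toy U-Net: one downsampling level, one skip connection."""
    def __init__(self, in_ch=3, out_ch=3):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU())
        self.down = nn.Conv2d(16, 32, 4, stride=2, padding=1)          # halve H, W
        self.up = nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1)   # restore H, W
        self.dec1 = nn.Sequential(nn.Conv2d(32, out_ch, 3, padding=1), nn.Tanh())

    def forward(self, x):
        e1 = self.enc1(x)                            # (N, 16, H, W)
        b = torch.relu(self.up(torch.relu(self.down(e1))))
        # Skip connection: concatenate encoder features with decoder features.
        return self.dec1(torch.cat([b, e1], dim=1))

mask = torch.randn(1, 3, 256, 256)
out = TinyUNet()(mask)   # same spatial size as the input mask
```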
The model was then trained for 200 epochs. The results on a few samples are displayed in the notebook.
We can see that while images transformed into masks and then transformed back into images are very similar to the originals, the facades generated from the validation set do not look like real facades. This is likely because 200 epochs is still too few training for this harder task.