diff --git a/TD2_Deep_Learning.ipynb b/TD2_Deep_Learning.ipynb
index defd5b05c2a5cbd85ac4bccce6df4a42921fba6d..29ef64ca6e00520af8ad33061400e51ce95613ad 100644
--- a/TD2_Deep_Learning.ipynb
+++ b/TD2_Deep_Learning.ipynb
@@ -2630,6 +2630,13 @@
     "**Second network based on Resnet18 CNN model, with a classification layer composed of two layers and dimension of hidden layer is 256. We use \"dropout mechanism for both layers**"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Dropout is a regularization technique commonly employed in Convolutional Neural Network (CNN) models to prevent overfitting. During training, random neurons or units within the network are temporarily \"dropped out\" or deactivated with a certain probability. This forces the network to learn more robust features and reduces its reliance on specific neurons, enhancing generalization to new data."
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": 41,
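
For context, here is a minimal PyTorch sketch of the setup the surrounding cells describe: a ResNet18 backbone whose `fc` layer is replaced by a two-layer classification head with a 256-unit hidden layer and dropout applied in both layers. The dropout probability (`p=0.5`), the frozen backbone, and `num_classes` are illustrative assumptions, not values taken from the notebook.

```python
# Hedged sketch (not the notebook's exact cell): ResNet18 with a two-layer
# classifier head (hidden dim 256) and dropout on both layers.
import torch.nn as nn
from torchvision import models

num_classes = 10  # assumption: depends on the dataset used in the notebook

# Load an ImageNet-pretrained ResNet18 (older torchvision uses pretrained=True).
model = models.resnet18(weights="IMAGENET1K_V1")

# Assumption: the backbone is frozen and only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully-connected layer with a two-layer head.
# model.fc.in_features (512 for ResNet18) is read before the assignment replaces it.
model.fc = nn.Sequential(
    nn.Dropout(p=0.5),                          # dropout before the hidden layer
    nn.Linear(model.fc.in_features, 256),       # hidden layer of dimension 256
    nn.ReLU(),
    nn.Dropout(p=0.5),                          # dropout before the output layer
    nn.Linear(256, num_classes),
)

model.train()  # dropout is active: random units are zeroed each forward pass
model.eval()   # dropout is disabled at evaluation/inference time
```

Note that `nn.Dropout` only deactivates units while the model is in training mode; calling `model.eval()` turns it off, which is why the mode switch matters when comparing training and validation accuracy.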