diff --git a/TD2 Deep Learning.ipynb b/TD2 Deep Learning.ipynb
index 0b87d715e6550fc71fc42ea6db1aec92191091b2..9a0cf1b7bff6c53e8654d192eb7149cf0c105163 100644
--- a/TD2 Deep Learning.ipynb
+++ b/TD2 Deep Learning.ipynb
@@ -2278,14 +2278,6 @@
     " Now, let's modify the code to replace the current classification layer with a set of two layers using a \"relu\" activation function for the middle layer and the \"dropout\" mechanism for both layers. Afterward, we will rerun the experiments and examine the obtained results."
    ]
   },
-  {
-   "cell_type": "markdown",
-   "id": "347d6172",
-   "metadata": {},
-   "source": [
-    "#to modify : After implementing this change, we observe a slight decrease in training accuracy to 94%, compared to the 97% achieved in the previous model. This suggests that the initial classification layer may be a better fit for this task."
-   ]
-  },
   {
    "cell_type": "code",
    "execution_count": 151,
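
For reference, here is a minimal sketch (not part of the patch) of what the two-layer classification head described in the retained markdown cell could look like in PyTorch. The hunk above only touches markdown cells, so the base network (resnet18 here), the hidden width, the number of classes, and the dropout rate are all assumptions for illustration.

```python
import torch.nn as nn
from torchvision import models

# Assumed pretrained backbone; the notebook's actual base model is not shown in this hunk.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

num_features = model.fc.in_features   # size of the penultimate representation
hidden_size = 256                     # assumed width of the intermediate layer
num_classes = 2                       # assumed number of target classes

# Replace the single classification layer with two linear layers:
# a ReLU after the middle layer and dropout applied to both layers.
model.fc = nn.Sequential(
    nn.Dropout(p=0.5),
    nn.Linear(num_features, hidden_size),
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.5),
    nn.Linear(hidden_size, num_classes),
)
```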