From a4eb6915e2379bd158888a2a8be444c8dde9eeed Mon Sep 17 00:00:00 2001
From: "MSI\\alber" <alberto.cavallo@synesthesia.it>
Date: Fri, 10 Nov 2023 16:52:48 +0100
Subject: [PATCH] Final commit

---
 README.md | 19 ++++++++++++++-----
 1 file changed, 14 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 6a4fb38..66b9d02 100644
--- a/README.md
+++ b/README.md
@@ -55,15 +55,24 @@ m
 
 5. In the `results` folder there are three plot images:
    - `knn.png`: refers to the KNN algorithm; it plots the evolution of accuracy as the value of 'k' increases from 1 to 20
-   <div style="text-align:center;">
-      <img src="results/knn.png" alt="knn" width="300" height="200">
-   </div>
+
+   The KNN reaches an accuracy of around 35%. This low result was expected: KNN classifies an image solely by finding the training images whose pixels are closest in terms of Euclidean distance, which is a weak criterion for image classification.
+As the graph shows, accuracy tends to decrease as 'k' grows. With only a few neighbours there is little margin of comparison: either the nearest image belongs to the correct class or the classification simply fails. Taking too many neighbours, on the other hand, carries the risk of including images that are not representative of the test image in question.
+  <div style="text-align:center;">
+     <img src="results/knn.png" alt="knn" width="300" height="200">
+  </div>
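The distance-based classification described above can be sketched as a small brute-force k-NN (an illustrative sketch, not the project's actual code; it assumes images have already been flattened to float vectors, and the function name is invented):

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k):
    """Classify each test vector by majority vote among its k
    Euclidean-nearest training vectors (brute force)."""
    # Pairwise squared Euclidean distances, shape (n_test, n_train)
    d2 = ((test_X[:, None, :] - train_X[None, :, :]) ** 2).sum(axis=2)
    # Indices of the k closest training samples for each test sample
    nn = np.argsort(d2, axis=1)[:, :k]
    preds = []
    for row in train_y[nn]:
        vals, counts = np.unique(row, return_counts=True)
        preds.append(vals[np.argmax(counts)])  # majority vote
    return np.array(preds)
```

Because the vote is taken over raw pixel distances, two images of the same class that differ in lighting or position can end up far apart, which is why this method plateaus at a low accuracy.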
    
-   - `mlp.png`: refers to the MLP neural network, it represents the plot of the training accuracies evolution along 100 epochs 
-   - `loss.png`: refers to the MLP neural network, it represents the plot of the loss evolution along 100 epochs (further proof that the network works)
+  - `mlp.png`: refers to the MLP neural network; it plots the evolution of the training accuracy over 100 epochs
 
+The model's accuracy shows an increasing trend, starting at about 10%, which is understandable given the presence of 10 classes: at the beginning the network is essentially guessing the class. As the epochs advance and the layer weights and biases are updated step by step, accuracy improves up to 18% at the 100th epoch, an acceptable result given the low complexity of our network.
 <div style="text-align:center;">
   <img src="results/mlp.png" alt="mlp" width="300" height="200">
+</div>
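The step-by-step weight and bias updates described above can be illustrated with a minimal NumPy MLP trained on toy data (a sketch under stated assumptions: the layer sizes, learning rate, and random data are invented stand-ins, not the project's actual network):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the real dataset: 60 samples, 8 features, 3 classes
X = rng.normal(size=(60, 8))
y = rng.integers(0, 3, size=60)

# One hidden layer; weights and biases are the parameters updated each epoch
W1 = rng.normal(scale=0.1, size=(8, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 3)); b2 = np.zeros(3)

def forward(X):
    h = np.maximum(0, X @ W1 + b1)                 # ReLU hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)     # softmax probabilities

losses = []
for epoch in range(100):
    h, p = forward(X)
    losses.append(-np.log(p[np.arange(len(y)), y]).mean())  # cross-entropy
    # Backpropagation: gradient of softmax + cross-entropy, then chain rule
    dlogits = p.copy(); dlogits[np.arange(len(y)), y] -= 1; dlogits /= len(y)
    dW2 = h.T @ dlogits; db2 = dlogits.sum(0)
    dh = dlogits @ W2.T; dh[h <= 0] = 0            # ReLU backward
    dW1 = X.T @ dh; db1 = dh.sum(0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.1 * grad                        # plain gradient-descent step
```

With untrained weights the predicted probabilities are close to uniform, so the initial loss sits near ln(number of classes), which mirrors the ~10% starting accuracy the text describes for 10 classes.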
+  
+  - `loss.png`: refers to the MLP neural network; it plots the evolution of the loss over 100 epochs (further proof that the network is learning)
+
+Another way to verify that the network is training is to look at the trend of the loss: its decreasing trend confirms the accuracy results described above.
+<div style="text-align:center;">
   <img src="results/loss.png" alt="loss" width="300" height="200">
 </div>
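The "decreasing trend" sanity check mentioned above can be made concrete with a small helper that compares the start and the end of the loss curve (illustrative only; the window size is an arbitrary choice, not from the project):

```python
def is_decreasing_trend(losses, window=10):
    """Return True when the mean of the last `window` values is lower
    than the mean of the first `window` values, i.e. the curve trends down.

    Averaging over a window smooths out epoch-to-epoch noise, so a single
    spike near the end does not mask an overall downward trend.
    """
    head = sum(losses[:window]) / window
    tail = sum(losses[-window:]) / window
    return tail < head
```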
 
-- 
GitLab