diff --git a/README.md b/README.md
index 6d6383dcee77f8cf092b4b7a98f8fb176ce9deb1..f4ea8afb55eb9872bfc6afe17ff55c29e013f701 100644
--- a/README.md
+++ b/README.md
@@ -23,11 +23,25 @@ Here we can conclude that the best K is 5, (if we don't use k = 1)  with a perfo
 
 ### Maths 
 1° 
-
+![knn](mlp_maths/q1.png)
 
 ​
 
 ### Code
 
+All the code can be found in the Python file `mlp.py`.
+
+Below is the graph of accuracy as a function of the number of epochs. We used a learning rate of 0.1 and a 90/10 split between the training and testing datasets.
+
+![mlp](results/mlp.png)
+
+First, we observe that accuracy increases with the number of epochs.
+
+However, after 100 epochs the accuracy is only around 16.2%, roughly half of what the KNN method achieves.
+
+In conclusion, the MLP results are somewhat disappointing. They might be improved by training for more epochs or by tuning the learning rate. We also observed that the MLP method ran faster than the KNN method.
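+The experimental setup described above (learning rate 0.1, 90/10 train/test split, accuracy tracked over epochs) can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the actual contents of `mlp.py`; the network size, sigmoid activation, and squared-error loss are assumptions:
+
+```python
+import numpy as np
+
+rng = np.random.default_rng(0)
+
+# Synthetic two-class dataset standing in for the real data.
+X = rng.normal(size=(500, 4))
+y = (X[:, 0] + X[:, 1] > 0).astype(int)
+
+# 90/10 split between training and testing, as in the experiment.
+split = int(0.9 * len(X))
+X_train, X_test = X[:split], X[split:]
+y_train, y_test = y[:split], y[split:]
+
+# One hidden layer with sigmoid activations; learning rate 0.1.
+lr, hidden = 0.1, 8
+W1 = rng.normal(scale=0.5, size=(4, hidden))
+W2 = rng.normal(scale=0.5, size=(hidden, 1))
+
+def sigmoid(z):
+    return 1.0 / (1.0 + np.exp(-z))
+
+accuracies = []
+for epoch in range(100):
+    # Forward pass over the full training set.
+    h = sigmoid(X_train @ W1)
+    p = sigmoid(h @ W2)[:, 0]
+    # Backward pass: squared-error gradients, computed before updating.
+    d_out = (p - y_train) * p * (1 - p)
+    d_hid = (d_out[:, None] @ W2.T) * h * (1 - h)
+    grad_W2 = h.T @ d_out[:, None] / len(X_train)
+    grad_W1 = X_train.T @ d_hid / len(X_train)
+    W2 -= lr * grad_W2
+    W1 -= lr * grad_W1
+    # Record test accuracy after each epoch (the curve plotted above).
+    pred = (sigmoid(sigmoid(X_test @ W1) @ W2)[:, 0] > 0.5).astype(int)
+    accuracies.append((pred == y_test).mean())
+
+print("final accuracy:", accuracies[-1])
+```
+
+Plotting `accuracies` against the epoch index reproduces the shape of the curve discussed above: it rises with training, then flattens once the network stops improving.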
+