Commit e7e66886 authored by Danjou Pierre

Update README.md

parent 1c3d45ef
@@ -23,11 +23,25 @@ Here we can conclude that the best K is 5 (if we don't use k = 1), with a perfo…
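To illustrate how such a K sweep can be run, here is a minimal sketch assuming a scikit-learn style setup and the `load_digits` dataset; the repository's own KNN code and data loading may differ.

```python
# Hypothetical sketch: sweeping K for a KNN classifier (not the repo's exact code).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, train_size=0.9, random_state=0)

for k in range(1, 11):
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(X_train, y_train)
    acc = knn.score(X_test, y_test)          # test-set accuracy for this K
    print(f"k={k}: accuracy={acc:.3f}")
```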
### Maths
![knn](mlp_maths/q1.png)
### Code
All the code can be found in the Python file mlp.py.
Below, you will find the graph of accuracy as a function of the number of epochs. We used a learning rate of 0.1 and a split ratio of 0.9 between the training and testing datasets.
![mlp](results/mlp.png)
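For context, the sketch below shows one way such an accuracy-per-epoch curve could be produced with a single-hidden-layer MLP in plain NumPy, reusing the learning rate (0.1) and split ratio (0.9) quoted above. It assumes the `load_digits` dataset and is not the exact code of mlp.py.

```python
# Hypothetical sketch of an accuracy-per-epoch curve for a small MLP
# (one hidden layer, plain NumPy); the actual mlp.py may be organised differently.
import numpy as np
from sklearn.datasets import load_digits            # assumed dataset; the repo's data loading may differ
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X = X / 16.0                                         # scale pixel values to [0, 1]
Y = np.eye(10)[y]                                    # one-hot targets
X_tr, X_te, Y_tr, Y_te, y_tr, y_te = train_test_split(X, Y, y, train_size=0.9, random_state=0)

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (64, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 10)); b2 = np.zeros(10)
lr, epochs = 0.1, 100                                # same hyperparameters as in the README

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for epoch in range(epochs):
    # forward pass
    h = np.tanh(X_tr @ W1 + b1)
    p = softmax(h @ W2 + b2)
    # backward pass (softmax cross-entropy, full-batch gradient descent)
    d2 = (p - Y_tr) / len(X_tr)
    dW2, db2 = h.T @ d2, d2.sum(axis=0)
    d1 = (d2 @ W2.T) * (1 - h ** 2)
    dW1, db1 = X_tr.T @ d1, d1.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
    # track test accuracy per epoch (this is the quantity plotted in results/mlp.png)
    pred = softmax(np.tanh(X_te @ W1 + b1) @ W2 + b2).argmax(axis=1)
    print(f"epoch {epoch + 1}: test accuracy = {np.mean(pred == y_te):.3f}")
```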
Firstly, we observe that accuracy increases with each epoch.
However, after 100 epochs, the accuracy is around 16.2%, which is about half the accuracy achieved by the KNN method.
In conclusion, the MLP method is somewhat disappointing. It might be improved by increasing the number of epochs or adjusting the learning rate. We also observed that the MLP method ran faster than the KNN method.
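As a rough illustration of those two suggestions, a small sweep over learning rates and epoch counts could look like the sketch below, using scikit-learn's `MLPClassifier` as a stand-in for mlp.py; the hyperparameter values are arbitrary examples.

```python
# Hypothetical sweep over the two suggested knobs: learning rate and number of epochs.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.9, random_state=0)

for lr in (0.01, 0.1, 0.5):
    for epochs in (100, 500):
        clf = MLPClassifier(hidden_layer_sizes=(32,), learning_rate_init=lr,
                            max_iter=epochs, random_state=0)
        clf.fit(X_tr, y_tr)
        print(f"lr={lr}, epochs={epochs}: accuracy={clf.score(X_te, y_te):.3f}")
```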