Here we can conclude that the best K is 5 (if we don't use k = 1), with a performance …
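A K sweep like the one summarised above could look roughly like the sketch below. This is a minimal illustration only: the toy dataset, the 90/10 split, and the helper names (`knn_predict`, `accuracy_for_k`) are assumptions and are not taken from the project's code.

```python
import numpy as np
from collections import Counter

def knn_predict(x, train_X, train_y, k):
    """Majority vote among the k nearest training points (Euclidean distance)."""
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(train_y[nearest]).most_common(1)[0][0]

def accuracy_for_k(train_X, train_y, test_X, test_y, k):
    preds = np.array([knn_predict(x, train_X, train_y, k) for x in test_X])
    return float(np.mean(preds == test_y))

if __name__ == "__main__":
    # Toy two-class data standing in for the real dataset (an assumption, not the project's data)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2)) + np.repeat([[0.0, 0.0], [3.0, 3.0]], 100, axis=0)
    y = np.repeat([0, 1], 100)
    idx = rng.permutation(len(X))
    split = int(0.9 * len(X))              # 90/10 train/test split
    tr, te = idx[:split], idx[split:]
    for k in range(1, 11):
        print(k, accuracy_for_k(X[tr], y[tr], X[te], y[te], k))
```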
### Maths
![knn](mlp_maths/q1.PNG)
![knn](mlp_maths/q2.PNG)
![knn](mlp_maths/q3.PNG)
![knn](mlp_maths/q4.PNG)
![knn](mlp_maths/q5.PNG)
![knn](mlp_maths/q6.PNG)
![knn](mlp_maths/q7.PNG)
![knn](mlp_maths/q8.PNG)
![knn](mlp_maths/q9.PNG)
### Code
All the code can be found in the Python file mlp.py.
Below, you will find the graph of accuracy as a function of the number of epochs. We used a learning rate of 0.1 and a split ratio of 0.9 between the training and testing datasets.
![mlp](results/mlp.png)
Firstly, we observe that accuracy increases with each epoch.
However, after 100 epochs, the accuracy is around 16.2%, which is about half the accuracy achieved by the KNN method.
In conclusion, the MLP method is somewhat disappointing here. It might be improved by training for more epochs or tuning the learning rate. We also observed that the MLP method runs faster than the KNN method.
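The actual training loop lives in mlp.py; the sketch below only illustrates the general shape of such an experiment with a learning rate of 0.1 and a 90/10 split. It is a minimal sketch assuming a one-hidden-layer network with sigmoid activations and an MSE loss; these choices, the hidden size `d_h`, and all array names are assumptions, not taken from mlp.py.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def one_hot(labels, n_classes):
    out = np.zeros((labels.size, n_classes))
    out[np.arange(labels.size), labels] = 1.0
    return out

def train_mlp(X_train, y_train, X_test, y_test, d_h=64, lr=0.1, epochs=100):
    """One-hidden-layer MLP trained by full-batch gradient descent on an MSE loss.

    Returns the test accuracy after each epoch (the quantity plotted above)."""
    n_in, n_out = X_train.shape[1], int(y_train.max()) + 1
    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.1, size=(n_in, d_h)); b1 = np.zeros(d_h)
    W2 = rng.normal(scale=0.1, size=(d_h, n_out)); b2 = np.zeros(n_out)
    T = one_hot(y_train, n_out)
    accuracies = []
    for _ in range(epochs):
        # Forward pass
        A1 = sigmoid(X_train @ W1 + b1)
        A2 = sigmoid(A1 @ W2 + b2)
        # Backward pass (chain rule for MSE through the sigmoids)
        dZ2 = (A2 - T) * A2 * (1 - A2)
        dZ1 = (dZ2 @ W2.T) * A1 * (1 - A1)
        W2 -= lr * (A1.T @ dZ2) / len(X_train); b2 -= lr * dZ2.mean(axis=0)
        W1 -= lr * (X_train.T @ dZ1) / len(X_train); b1 -= lr * dZ1.mean(axis=0)
        # Test accuracy at the end of the epoch
        preds = sigmoid(sigmoid(X_test @ W1 + b1) @ W2 + b2).argmax(axis=1)
        accuracies.append(float(np.mean(preds == y_test)))
    return accuracies
```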