Commit c83464da authored by corentin's avatar corentin

Add of graphs

parent 11ad7e22
@@ -126,7 +126,7 @@ Here is the graph of the accuracy vs K for the whole Cifar dataset with a split
![Image](results/knn.png)
Here we can conclude that the best K is 9 (if we don't use k = 1), with an accuracy of 35%. (I tried to update the graph, but my kernel kept dying at each run, so I kept the first version that could execute.)
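
As a reference, here is a minimal sketch of how such an accuracy-vs-k curve can be produced. It is not the repository's exact code: the helper names `knn_accuracy` and `plot_knn_accuracy` are illustrative, and it assumes `data_*` are flattened float image arrays and `labels_*` are integer class vectors. Note that it builds the full test/train distance matrix in memory, which is heavy for the whole Cifar dataset.

```python
# Illustrative sketch (not the project's actual functions).
import numpy as np
import matplotlib.pyplot as plt

def knn_accuracy(data_train, labels_train, data_test, labels_test, k):
    # Squared Euclidean distances between every test and every train sample
    dists = (
        np.sum(data_test ** 2, axis=1, keepdims=True)
        - 2 * data_test @ data_train.T
        + np.sum(data_train ** 2, axis=1)
    )
    # Indices of the k nearest training samples for each test sample
    nearest = np.argpartition(dists, k, axis=1)[:, :k]
    # Majority vote among the k neighbours
    votes = labels_train[nearest]
    predictions = np.array([np.bincount(v).argmax() for v in votes])
    return np.mean(predictions == labels_test)

def plot_knn_accuracy(data_train, labels_train, data_test, labels_test, k_max=20):
    ks = np.arange(1, k_max + 1)
    accuracies = [knn_accuracy(data_train, labels_train, data_test, labels_test, k) for k in ks]
    plt.figure(figsize=(8, 6))
    plt.plot(ks, accuracies, marker='x', color='b', label='Test Accuracy')
    plt.xlabel('k')
    plt.ylabel('Accuracy')
    plt.title('KNN accuracy vs k')
    plt.grid(True)
    plt.legend()
    plt.savefig('image-classification/results/knn.png')
    plt.show()
```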
## Artificial Neural Network
@@ -326,12 +326,11 @@ def run_mlp_training(data_train, labels_train, data_test, labels_test, d_h,learn
d_in = data_train.shape[1]
d_out = 10  # we can hard-code it here, or use len(np.unique(labels_train))
# Xavier initialisation of the weights: scale by 1/sqrt(fan_in) so the
# activations keep a comparable variance from layer to layer; biases start at zero
w1 = np.random.randn(d_in, d_h) / np.sqrt(d_in)
b1 = np.zeros((1, d_h))
w2 = np.random.randn(d_h, d_out) / np.sqrt(d_h)
b2 = np.zeros((1, d_out))
# Train MLP
w1, b1, w2, b2, train_accuracies = train_mlp(w1, b1, w2, b2, data_train, labels_train, learning_rate, num_epoch)
@@ -339,4 +338,29 @@ def run_mlp_training(data_train, labels_train, data_test, labels_test, d_h,learn
# Test MLP
test_accuracy = test_mlp(w1, b1, w2, b2, data_test, labels_test)
return train_accuracies, test_accuracy
```
#### 16-
```python
import numpy as np
import matplotlib.pyplot as plt

def plot_graph(data_train, labels_train, data_test, labels_test, d_h, learning_rate, num_epoch):
# Run MLP training
train_accuracies, test_accuracy = run_mlp_training(data_train, labels_train, data_test, labels_test, d_h, learning_rate, num_epoch)
# Plot and save the learning accuracy graph
plt.figure(figsize=(8, 6))
epochs = np.arange(1, num_epoch + 1)
plt.plot(epochs, train_accuracies, marker='x', color='b', label='Train Accuracy')
plt.xlabel('Epochs')
plt.ylabel('Accuracy')
plt.title('MLP Train Accuracy')
plt.legend()
plt.grid(True)
plt.savefig('image-classification/results/mlp.png')
plt.show()
```
![Image](results/mlp.png)
The accuracy increases with each epoch without converging; we could increase the learning rate to speed up training and increase the number of epochs to see what our maximum accuracy would be.
For 100 epochs and a learning rate of 0.1, we got a test accuracy of 0.13.
For 300 epochs and a learning rate of 0.1, we increased the training accuracy to 0.15991 and got a test accuracy of 0.155.
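
For reference, the runs reported above could be launched with calls like the following. This is only a sketch: the data-loading helpers (named here `read_cifar` and `split_dataset`), the split factor and the hidden size `d_h=64` are assumptions, not taken from this diff.

```python
# Hypothetical usage sketch: read_cifar, split_dataset, the split factor and
# d_h=64 are assumed names/values, not taken from the project's actual code.
if __name__ == "__main__":
    data, labels = read_cifar("data/cifar-10-batches-py")  # assumed helper
    data_train, labels_train, data_test, labels_test = split_dataset(data, labels, split=0.9)  # assumed helper

    # First run reported above: 100 epochs, learning rate 0.1
    plot_graph(data_train, labels_train, data_test, labels_test,
               d_h=64, learning_rate=0.1, num_epoch=100)

    # Second run: 300 epochs, same learning rate
    plot_graph(data_train, labels_train, data_test, labels_test,
               d_h=64, learning_rate=0.1, num_epoch=300)
```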