{ "cells": [ { "cell_type": "markdown", "id": "7edf7168", "metadata": {}, "source": [ "# TD2: Deep learning" ] }, { "cell_type": "markdown", "id": "fbb8c8df", "metadata": {}, "source": [ "In this TD, you must modify this notebook to answer the questions. To do this,\n", "\n", "1. Fork this repository\n", "2. Clone your forked repository on your local computer\n", "3. Answer the questions\n", "4. Commit and push regularly\n", "\n", "The last commit is due on Sunday, December 1, 11:59 PM. Later commits will not be taken into account." ] }, { "cell_type": "markdown", "id": "3d167a29", "metadata": {}, "source": [ "Install and test PyTorch from https://pytorch.org/get-started/locally." ] }, { "cell_type": "code", "execution_count": 3, "id": "330a42f5", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Requirement already satisfied: torch in c:\\users\\lucil\\anaconda3\\lib\\site-packages (2.1.0)Note: you may need to restart the kernel to use updated packages.\n", "\n", "Requirement already satisfied: torchvision in c:\\users\\lucil\\anaconda3\\lib\\site-packages (0.16.0)\n", "Requirement already satisfied: filelock in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from torch) (3.9.0)\n", "Requirement already satisfied: typing-extensions in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from torch) (4.7.1)\n", "Requirement already satisfied: sympy in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from torch) (1.11.1)\n", "Requirement already satisfied: networkx in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from torch) (3.1)\n", "Requirement already satisfied: jinja2 in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from torch) (3.1.2)\n", "Requirement already satisfied: fsspec in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from torch) (2023.4.0)\n", "Requirement already satisfied: numpy in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from torchvision) (1.24.3)\n", "Requirement already satisfied: requests in 
c:\\users\\lucil\\anaconda3\\lib\\site-packages (from torchvision) (2.31.0)\n", "Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from torchvision) (9.4.0)\n", "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from jinja2->torch) (2.1.1)\n", "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from requests->torchvision) (2.0.4)\n", "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from requests->torchvision) (3.4)\n", "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from requests->torchvision) (1.26.16)\n", "Requirement already satisfied: certifi>=2017.4.17 in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from requests->torchvision) (2023.7.22)\n", "Requirement already satisfied: mpmath>=0.19 in c:\\users\\lucil\\anaconda3\\lib\\site-packages (from sympy->torch) (1.3.0)\n" ] } ], "source": [ "%pip install torch torchvision" ] }, { "cell_type": "markdown", "id": "0882a636", "metadata": {}, "source": [ "\n", "To test the installation, run the following code:" ] }, { "cell_type": "code", "execution_count": 4, "id": "b1950f0a", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "tensor([[ 0.3653,  0.6776,  1.4290,  1.3045, -0.1440, -1.9016,  0.1427,  0.6754,\n", "          0.0791,  0.6423],\n", "        [-1.3009,  0.1227,  0.4001,  0.6688,  0.1672, -0.5949,  0.3957, -0.6071,\n", "         -0.7747,  0.6197],\n", "        [-0.7347, -1.5540,  2.3525,  0.1084,  0.1178,  0.5596,  0.6267,  2.1786,\n", "         -0.5310, -0.6559],\n", "        [ 0.6326, -1.0263,  0.3332, -0.1291,  0.1675, -0.1014,  1.3175,  0.3264,\n", "         -0.1400,  0.7431],\n", "        [ 0.4699,  0.9845, -1.4050,  1.1468,  0.7983,  1.0263, -1.6672,  0.1562,\n", "         -0.0875, -1.9664],\n", "        [-0.3761, -0.8523,  1.5731, -2.0885, -1.5779,  0.6759,  0.4770,  1.5133,\n", "         -1.4350, -0.5716],\n", "        [ 0.0985, -0.1337, -0.3850, 
0.3503, -0.4130, -0.7820, -1.1305, 1.0061,\n", " 0.0298, -1.4626],\n", " [-0.0387, -1.7999, -2.1245, 0.2555, 0.1214, 0.5655, 0.5005, 1.0409,\n", " 0.8113, -0.2322],\n", " [ 2.1456, 0.3775, 0.8248, 0.8468, 0.8631, -0.0429, -1.5679, -0.6221,\n", " -1.1605, 0.5963],\n", " [ 0.1601, 0.2023, -0.9813, 0.1316, 0.1114, -1.8421, 0.6188, -0.3290,\n", " 0.6238, 0.3155],\n", " [-0.3864, -0.5559, 0.4249, -1.0155, -0.9137, 0.1228, -0.3569, 1.1107,\n", " -0.5542, 1.2470],\n", " [-0.6112, -0.5138, 1.1420, -0.0729, 1.1220, -0.1792, 1.0880, 0.8450,\n", " 0.6158, -0.9575],\n", " [ 0.9272, 0.1329, 0.4858, -0.5643, -0.1636, -0.2209, 0.9413, 0.1729,\n", " 0.4400, 0.2477],\n", " [-0.2307, 2.0693, 0.0898, 1.8634, 0.1166, 0.2212, 0.9382, -0.6915,\n", " -1.9567, 0.2097]])\n", "AlexNet(\n", " (features): Sequential(\n", " (0): Conv2d(3, 64, kernel_size=(11, 11), stride=(4, 4), padding=(2, 2))\n", " (1): ReLU(inplace=True)\n", " (2): MaxPool2d(kernel_size=3, stride=2, padding=0, dilation=1, ceil_mode=False)\n", " (3): Conv2d(64, 192, kernel_size=(5, 5), stride=(1, 1), padding=(2, 2))\n", " (4): ReLU(inplace=True)\n", " (5): MaxPool2d(kernel_size=3, stride=2, padding=0, dilation=1, ceil_mode=False)\n", " (6): Conv2d(192, 384, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", " (7): ReLU(inplace=True)\n", " (8): Conv2d(384, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", " (9): ReLU(inplace=True)\n", " (10): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", " (11): ReLU(inplace=True)\n", " (12): MaxPool2d(kernel_size=3, stride=2, padding=0, dilation=1, ceil_mode=False)\n", " )\n", " (avgpool): AdaptiveAvgPool2d(output_size=(6, 6))\n", " (classifier): Sequential(\n", " (0): Dropout(p=0.5, inplace=False)\n", " (1): Linear(in_features=9216, out_features=4096, bias=True)\n", " (2): ReLU(inplace=True)\n", " (3): Dropout(p=0.5, inplace=False)\n", " (4): Linear(in_features=4096, out_features=4096, bias=True)\n", " (5): ReLU(inplace=True)\n", " (6): 
Linear(in_features=4096, out_features=1000, bias=True)\n", "  )\n", ")\n" ] } ], "source": [ "import torch\n", "\n", "N, D = 14, 10\n", "x = torch.randn(N, D)  # torch.randn already returns a float32 (FloatTensor) tensor\n", "print(x)\n", "\n", "from torchvision import models\n", "\n", "alexnet = models.alexnet()\n", "print(alexnet)" ] }, { "cell_type": "markdown", "id": "23f266da", "metadata": {}, "source": [ "## Exercise 1: CNN on CIFAR10\n", "\n", "The goal is to apply a Convolutional Neural Network (CNN) to the CIFAR10 image dataset and to measure its image-classification accuracy. Compare this accuracy with that of the neural network implemented during TD1.\n", "\n", "Have a look at the following documentation to get familiar with PyTorch:\n", "\n", "https://pytorch.org/tutorials/beginner/pytorch_with_examples.html\n", "\n", "https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html" ] }, { "cell_type": "markdown", "id": "4ba1c82d", "metadata": {}, "source": [ "You can test whether a GPU is available on your machine and, if so, train on it to speed up the process." ] }, { "cell_type": "code", "execution_count": 5, "id": "6e18f2fd", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "CUDA is not available. Training on CPU ...\n" ] } ], "source": [ "import torch\n", "\n", "# check if CUDA is available\n", "train_on_gpu = torch.cuda.is_available()\n", "\n", "if not train_on_gpu:\n", "    print(\"CUDA is not available. Training on CPU ...\")\n", "else:\n", "    print(\"CUDA is available! 
Training on GPU ...\")" ] }, { "cell_type": "markdown", "id": "5cf214eb", "metadata": {}, "source": [ "Next we load the CIFAR10 dataset" ] }, { "cell_type": "code", "execution_count": 6, "id": "462666a2", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Files already downloaded and verified\n", "Files already downloaded and verified\n" ] } ], "source": [ "import numpy as np\n", "from torchvision import datasets, transforms\n", "from torch.utils.data.sampler import SubsetRandomSampler\n", "\n", "# number of subprocesses to use for data loading\n", "num_workers = 0\n", "# how many samples per batch to load\n", "batch_size = 20\n", "# percentage of training set to use as validation\n", "valid_size = 0.2\n", "\n", "# convert data to a normalized torch.FloatTensor\n", "transform = transforms.Compose(\n", " [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]\n", ")\n", "\n", "# choose the training and test datasets\n", "train_data = datasets.CIFAR10(\"data\", train=True, download=True, transform=transform)\n", "test_data = datasets.CIFAR10(\"data\", train=False, download=True, transform=transform)\n", "\n", "# obtain training indices that will be used for validation\n", "num_train = len(train_data)\n", "indices = list(range(num_train))\n", "np.random.shuffle(indices)\n", "split = int(np.floor(valid_size * num_train))\n", "train_idx, valid_idx = indices[split:], indices[:split]\n", "\n", "# define samplers for obtaining training and validation batches\n", "train_sampler = SubsetRandomSampler(train_idx)\n", "valid_sampler = SubsetRandomSampler(valid_idx)\n", "\n", "# prepare data loaders (combine dataset and sampler)\n", "train_loader = torch.utils.data.DataLoader(\n", " train_data, batch_size=batch_size, sampler=train_sampler, num_workers=num_workers\n", ")\n", "valid_loader = torch.utils.data.DataLoader(\n", " train_data, batch_size=batch_size, sampler=valid_sampler, num_workers=num_workers\n", ")\n", 
"test_loader = torch.utils.data.DataLoader(\n", " test_data, batch_size=batch_size, num_workers=num_workers\n", ")\n", "\n", "# specify the image classes\n", "classes = [\n", " \"airplane\",\n", " \"automobile\",\n", " \"bird\",\n", " \"cat\",\n", " \"deer\",\n", " \"dog\",\n", " \"frog\",\n", " \"horse\",\n", " \"ship\",\n", " \"truck\",\n", "]" ] }, { "cell_type": "markdown", "id": "58ec3903", "metadata": {}, "source": [ "CNN definition (this one is an example)" ] }, { "cell_type": "code", "execution_count": 7, "id": "317bf070", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Net(\n", " (conv1): Conv2d(3, 6, kernel_size=(5, 5), stride=(1, 1))\n", " (pool): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", " (conv2): Conv2d(6, 16, kernel_size=(5, 5), stride=(1, 1))\n", " (fc1): Linear(in_features=400, out_features=120, bias=True)\n", " (fc2): Linear(in_features=120, out_features=84, bias=True)\n", " (fc3): Linear(in_features=84, out_features=10, bias=True)\n", ")\n" ] } ], "source": [ "import torch.nn as nn\n", "import torch.nn.functional as F\n", "\n", "# define the CNN architecture\n", "\n", "\n", "class Net(nn.Module):\n", " def __init__(self):\n", " super(Net, self).__init__()\n", " self.conv1 = nn.Conv2d(3, 6, 5)\n", " self.pool = nn.MaxPool2d(2, 2)\n", " self.conv2 = nn.Conv2d(6, 16, 5)\n", " self.fc1 = nn.Linear(16 * 5 * 5, 120)\n", " self.fc2 = nn.Linear(120, 84)\n", " self.fc3 = nn.Linear(84, 10)\n", "\n", " def forward(self, x):\n", " x = self.pool(F.relu(self.conv1(x)))\n", " x = self.pool(F.relu(self.conv2(x)))\n", " x = x.view(-1, 16 * 5 * 5)\n", " x = F.relu(self.fc1(x))\n", " x = F.relu(self.fc2(x))\n", " x = self.fc3(x)\n", " return x\n", "\n", "\n", "# create a complete CNN\n", "model = Net()\n", "print(model)\n", "# move tensors to GPU if CUDA is available\n", "if train_on_gpu:\n", " model.cuda()" ] }, { "cell_type": "markdown", "id": "a2dc4974", "metadata": {}, "source": [ "Loss 
function and training using SGD (Stochastic Gradient Descent) optimizer" ] }, { "cell_type": "code", "execution_count": 8, "id": "4b53f229", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch: 0 \tTraining Loss: 43.453638 \tValidation Loss: 38.117901\n", "Validation loss decreased (inf --> 38.117901). Saving model ...\n", "Epoch: 1 \tTraining Loss: 33.786905 \tValidation Loss: 30.608687\n", "Validation loss decreased (38.117901 --> 30.608687). Saving model ...\n", "Epoch: 2 \tTraining Loss: 29.978750 \tValidation Loss: 28.626190\n", "Validation loss decreased (30.608687 --> 28.626190). Saving model ...\n", "Epoch: 3 \tTraining Loss: 27.777584 \tValidation Loss: 27.198099\n", "Validation loss decreased (28.626190 --> 27.198099). Saving model ...\n", "Epoch: 4 \tTraining Loss: 26.117933 \tValidation Loss: 26.415911\n", "Validation loss decreased (27.198099 --> 26.415911). Saving model ...\n", "Epoch: 5 \tTraining Loss: 24.786261 \tValidation Loss: 24.554481\n", "Validation loss decreased (26.415911 --> 24.554481). Saving model ...\n", "Epoch: 6 \tTraining Loss: 23.703873 \tValidation Loss: 24.357461\n", "Validation loss decreased (24.554481 --> 24.357461). Saving model ...\n", "Epoch: 7 \tTraining Loss: 22.748076 \tValidation Loss: 24.332178\n", "Validation loss decreased (24.357461 --> 24.332178). Saving model ...\n", "Epoch: 8 \tTraining Loss: 21.790853 \tValidation Loss: 23.261406\n", "Validation loss decreased (24.332178 --> 23.261406). Saving model ...\n", "Epoch: 9 \tTraining Loss: 20.925274 \tValidation Loss: 23.353505\n", "Epoch: 10 \tTraining Loss: 20.174014 \tValidation Loss: 22.972180\n", "Validation loss decreased (23.261406 --> 22.972180). Saving model ...\n", "Epoch: 11 \tTraining Loss: 19.419566 \tValidation Loss: 22.647662\n", "Validation loss decreased (22.972180 --> 22.647662). 
Saving model ...\n", "Epoch: 12 \tTraining Loss: 18.719525 \tValidation Loss: 22.919457\n" ] }, { "ename": "KeyboardInterrupt", "evalue": "", "output_type": "error", "traceback": [ "KeyboardInterrupt: training was interrupted manually during epoch 13 (full traceback omitted)" ] } ], "source": [ "import torch.optim as optim\n", "\n", "criterion = nn.CrossEntropyLoss()  # specify loss function\n", "optimizer = optim.SGD(model.parameters(), lr=0.01)  # specify optimizer\n", "\n", "n_epochs = 30  # number of epochs to train the model\n", "train_loss_list = []  # list to store loss to visualize\n", "valid_loss_min = np.inf  # track change in validation loss (np.Inf is a deprecated alias)\n", "\n", "for epoch in range(n_epochs):\n", "    # Keep track of training and validation loss\n", "    train_loss = 0.0\n", "    valid_loss = 0.0\n", "\n", "    # Train the model\n", "    model.train()\n", "    for data, target in train_loader:\n", "        # Move tensors to GPU if CUDA is available\n", "        if train_on_gpu:\n", "            data, target = data.cuda(), target.cuda()\n", "        # Clear the gradients of all 
optimized variables\n", "        optimizer.zero_grad()\n", "        # Forward pass: compute predicted outputs by passing inputs to the model\n", "        output = model(data)\n", "        # Calculate the batch loss\n", "        loss = criterion(output, target)\n", "        # Backward pass: compute gradient of the loss with respect to model parameters\n", "        loss.backward()\n", "        # Perform a single optimization step (parameter update)\n", "        optimizer.step()\n", "        # Update training loss\n", "        train_loss += loss.item() * data.size(0)\n", "\n", "    # Validate the model (no gradients are needed during evaluation)\n", "    model.eval()\n", "    with torch.no_grad():\n", "        for data, target in valid_loader:\n", "            # Move tensors to GPU if CUDA is available\n", "            if train_on_gpu:\n", "                data, target = data.cuda(), target.cuda()\n", "            # Forward pass: compute predicted outputs by passing inputs to the model\n", "            output = model(data)\n", "            # Calculate the batch loss\n", "            loss = criterion(output, target)\n", "            # Update average validation loss\n", "            valid_loss += loss.item() * data.size(0)\n", "\n", "    # Calculate average losses per sample (divide by the number of sampled items, not batches)\n", "    train_loss = train_loss / len(train_loader.sampler)\n", "    valid_loss = valid_loss / len(valid_loader.sampler)\n", "    train_loss_list.append(train_loss)\n", "\n", "    # Print training/validation statistics\n", "    print(\n", "        \"Epoch: {} \\tTraining Loss: {:.6f} \\tValidation Loss: {:.6f}\".format(\n", "            epoch, train_loss, valid_loss\n", "        )\n", "    )\n", "\n", "    # Save model if validation loss has decreased\n", "    if valid_loss <= valid_loss_min:\n", "        print(\n", "            \"Validation loss decreased ({:.6f} --> {:.6f}). Saving model ...\".format(\n", "                valid_loss_min, valid_loss\n", "            )\n", "        )\n", "        torch.save(model.state_dict(), \"model_cifar.pt\")\n", "        valid_loss_min = valid_loss" ] }, { "cell_type": "markdown", "id": "13e1df74", "metadata": {}, "source": [ "Does overfitting occur? If so, implement early stopping."
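 ] }, { "cell_type": "markdown", "id": "es-note-1", "metadata": {}, "source": [ "One possible sketch (not part of the original TD): early stopping can be triggered when the validation loss has not improved for a fixed number of epochs, often called the patience. The helper below is a minimal, hypothetical criterion; `history` is the list of validation losses logged so far, and `patience=5` is an assumed hyperparameter, not a value prescribed by the assignment." ] }, { "cell_type": "code", "execution_count": null, "id": "es-code-1", "metadata": {}, "outputs": [], "source": [ "def should_stop(history, patience=5):\n", "    \"\"\"Return True when none of the last `patience` validation losses\n", "    improved on the best loss seen before them.\"\"\"\n", "    if len(history) <= patience:\n", "        return False\n", "    best_before = min(history[:-patience])\n", "    return min(history[-patience:]) >= best_before\n", "\n", "\n", "# Example with the validation losses logged above (epochs 0-12, rounded):\n", "# the loss is still improving, so should_stop returns False\n", "print(should_stop([38.12, 30.61, 28.63, 27.20, 26.42, 24.55, 24.36,\n", "                   24.33, 23.26, 23.35, 22.97, 22.65, 22.92]))"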
] }, { "cell_type": "code", "execution_count": 9, "id": "d39df818", "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjMAAAHFCAYAAAAHcXhbAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAABJ1UlEQVR4nO3deXxU1d3H8e9km+wBErJBCAHCDgHZ90VBFHHBqigq6PO4IkLVipVatwpqrY+tKEpbqVYp1KoIWhBUVgEBWQVlXwIhJISEhOzLef4IGWdkMYQkdyb5vF+vedXce2fml4lNvt5zfufYjDFGAAAAHsrL6gIAAAAuBWEGAAB4NMIMAADwaIQZAADg0QgzAADAoxFmAACARyPMAAAAj0aYAQAAHo0wAwAAPBphBqiif/zjH7LZbI6Hj4+PmjZtqrvuuktHjx6t1vcqKirS/fffr5iYGHl7e6tLly7V+vo4v3nz5qlDhw4KCAiQzWbTli1bznnd8uXLHf8u/OMf/zjnNUOHDpXNZlPz5s2rtcbmzZtr/PjxVXquzWbTM88884vXvffeexozZozatGkjLy+vav8egEvhY3UBgKebPXu22rZtq/z8fK1cuVLTp0/XihUrtH37dgUFBVXLe8ycOVNvv/22Xn/9dXXr1k3BwcHV8rq4sPT0dN1xxx0aMWKE3nzzTdntdrVu3fqCzwkJCdHf//73s8LFgQMHtHz5coWGhtZgxTXnn//8p1JTU9WzZ0+VlZWpuLjY6pIAB8IMcIk6duyo7t27S5KGDBmi0tJSPf/885o/f77Gjh17Sa+dl5enwMBAff/99woICNBDDz1UHSVLkvLz8xUQEFBtr1cX7d69W8XFxbr99ts1aNCgSj3nlltu0d/+9jft2bNHiYmJjuPvvPOOmjRpok6dOmnnzp01VXKN+eKLL+TlVX4z/5prrtH3339vcUXATxhmAqpZ7969JUmHDh2SJBlj9Oabb6pLly4KCAhQw4YN9atf/Ur79+93ed7gwYPVsWNHrVy5Un379lVgYKDuvvtu2Ww2/e1vf1N+fv5ZwxgFBQX67W9/q4SEBPn5+alJkyaaMGGCsrKyXF67efPmuuaaa/Txxx+ra9eu8vf317PPPusYGpkzZ46mTJmimJgYBQcHa9SoUTp+/LhycnJ07733KiIiQhEREbrrrrt0+vRpl9d+4403NHDgQEVGRiooKEidOnXSyy+/fNZ/uVd8fxs2bNCAAQMUGBioFi1a6MUXX1RZWZnLtVlZWXr00UfVokUL2e12RUZG6uqrr9aPP/7ouKaoqEh/+MMf1LZtW9ntdjVu3Fh33XWX0tPTK/VzWrBggfr06aPAwECFhIRo2LBhWrt2reP8+PHj1b9/f0nlAcVms2nw4MG/+LrDhg1TXFyc3nnnHcexsrIyvfvuuxo3bpwjEDir7M+xuLhYjz/+uKKjoxUYGKj+/ftr/fr156wjNTVV9913n5o2bSo/Pz8lJCTo2WefVUlJSSU+nbOdq27AbRgAVTJ79mwjyWzYsMHl+J///GcjycyaNcsYY8w999xjfH19zaOPPmoWL15s5syZY9q2bWuioqJMamqq43mDBg0yjRo1MnFxceb11183y5YtMytWrDBr1641V199tQkICDBr1641a9euNWlpaaasrMxceeWVxsfHxzz11FNmyZIl5pVXXjFBQUGma9eupqCgwPHa8fHxJiYmxrRo0cK88847ZtmyZWb9+vVm2bJlRpKJj48348ePN4sXLzZvvfWWCQ4ONkOGDDHDhg0zjz32mFmyZIl56aWXjLe3t5k4caLL9/vrX//azJw50yxevNh8/fXX5v/+7/9MRESEueuuu1yuGzRokAkPDzeJiYnmr
bfeMkuXLjUPPvigkWTeffddx3XZ2dmmQ4cOJigoyDz33HPmiy++MB999JGZNGmS+frrr40xxpSWlpoRI0aYoKAg8+yzz5qlS5eav/3tb6ZJkyamffv2Ji8v74I/uw8++MBIMsOHDzfz58838+bNM926dTN+fn5m1apVxhhj9u7da9544w0jyUybNs2sXbvW7Nix47yvWfFZfvjhh+app54ysbGxpqSkxBhjzKJFi4zNZjN79+41I0eONPHx8Y7nXczPcdy4ccZms5nf/OY3ZsmSJebVV181TZo0MaGhoWbcuHGO644dO2bi4uJMfHy8efvtt82XX35pnn/+eWO328348eNd6pZknn766Qt+Xj/38+8BsBphBqiiijCzbt06U1xcbHJycsxnn31mGjdubEJCQkxqaqpZu3atkWT+9Kc/uTw3OTnZBAQEmMcff9xxbNCgQUaS+eqrr856r3HjxpmgoCCXY4sXLzaSzMsvv+xyfN68eS5hypjyMOPt7W127drlcm3FH+BRo0a5HJ88ebKRZB5++GGX49dff71p1KjReT+T0tJSU1xcbN577z3j7e1tTp48edb39+2337o8p3379ubKK690fP3cc88ZSWbp0qXnfZ9//etfRpL56KOPXI5v2LDBSDJvvvnmBWuMjY01nTp1MqWlpY7jOTk5JjIy0vTt29dxzDmg/BLna/fv329sNpv57LPPjDHG3HTTTWbw4MHGmLODQGV/jj/88IORZH7961+7XFcRzJzDzH333WeCg4PNoUOHXK595ZVXjCSXUEaYQV3AfUPgEvXu3Vu+vr4KCQnRNddco+joaC1atEhRUVH67LPPZLPZdPvtt6ukpMTxiI6OVlJSkpYvX+7yWg0bNtTQoUMr9b5ff/21JJ010fSmm25SUFCQvvrqK5fjnTt3Pu/k1Wuuucbl63bt2kmSRo4cedbxkydPugw1bd68Wddee63Cw8Pl7e0tX19f3XnnnSotLdXu3btdnh8dHa2ePXueVVfFkJwkLVq0SK1bt9YVV1xxvm9dn332mRo0aKBRo0a5fK5dunRRdHT0WZ+rs127diklJUV33HGHy9BJcHCwbrzxRq1bt055eXnnfX5lJCQkaPDgwXrnnXeUkZGhTz/9VHffffc5r63sz3HZsmWSdNY8rJtvvlk+Pq7THz/77DMNGTJEsbGxLp/PVVddJUlasWLFJX1/gLthAjBwid577z21a9dOPj4+ioqKUkxMjOPc8ePHZYxRVFTUOZ/bokULl6+dn/tLMjIy5OPjo8aNG7sct9lsio6OVkZGRqVfu1GjRi5f+/n5XfB4QUGBgoODdfjwYQ0YMEBt2rTRn//8ZzVv3lz+/v5av369JkyYoPz8fJfnh4eHn/Xedrvd5br09HQ1a9bsvLVK5Z9rVlaWo56fO3HixHmfW/G5nOvziI2NVVlZmTIzMxUYGHjBGn7J//zP/+iuu+7Sq6++qoCAAP3qV786bz2V+TlW/G90dLTLdT4+Pmd9rsePH9fChQvl6+t7zve80OcDeCLCDHCJ2rVr5+hm+rmIiAjZbDatWrVKdrv9rPM/P2az2Sr9vuHh4SopKVF6errLH0JjjFJTU9WjR48qv3ZlzZ8/X7m5ufr4448VHx/vOH6+tVgqo3Hjxjpy5MgFr4mIiFB4eLgWL158zvMhISHnfW7FH/5jx46ddS4lJUVeXl5q2LDhRVR8bqNHj9aECRP04osv6p577jlv51hlf44VdaempqpJkyaO60pKSs4KrhEREercubNeeOGFc75nbGzsJX1vgLthmAmoQddcc42MMTp69Ki6d+9+1qNTp05Vfu3LL79ckvT++++7HP/oo4+Um5vrOF+TKgKScygzxuivf/1rlV/zqquu0u7dux3DL+dyzTXXKCMjQ6Wlpef8XNu0aXPe57Zp00ZNmjTRnDlzZIxxHM/NzdVHH33k6HC6VAEBAfr973+vUaNG6YEHHjjvdZX9OVZ0Un3wwQcu1
/373/8+q0OponW6ZcuW5/x8CDOoa7gzA9Sgfv366d5779Vdd92ljRs3auDAgQoKCtKxY8e0evVqderU6YJ/6C5k2LBhuvLKKzVlyhRlZ2erX79+2rZtm55++ml17dpVd9xxRzV/N+euwc/PT7feeqsef/xxFRQUaObMmcrMzKzya06ePFnz5s3TddddpyeeeEI9e/ZUfn6+VqxYoWuuuUZDhgzRmDFj9MEHH+jqq6/WpEmT1LNnT/n6+urIkSNatmyZrrvuOt1www3nfH0vLy+9/PLLGjt2rK655hrdd999Kiws1B//+EdlZWXpxRdfrHLtP/fII4/okUceueA1lf05tmvXTrfffrtee+01+fr66oorrtD333+vV1555ayF+J577jktXbpUffv21cMPP6w2bdqooKBABw8e1H//+1+99dZbatq06UV9Lzt37nSsj5Oamqq8vDz95z//kSS1b99e7du3v6jXA6qVlbOPAU92vtbsc3nnnXdMr169TFBQkAkICDAtW7Y0d955p9m4caPjmkGDBpkOHTqc8/nn6mYyxpj8/HwzZcoUEx8fb3x9fU1MTIx54IEHTGZmpst18fHxZuTIkWc9/3zdOuf73p5++mkjyaSnpzuOLVy40CQlJRl/f3/TpEkT85vf/MYsWrTISDLLli37xe9v3LhxZ3XGZGZmmkmTJplmzZoZX19fExkZaUaOHGl+/PFHxzXFxcXmlVdecbx3cHCwadu2rbnvvvvMnj17znqfn5s/f77p1auX8ff3N0FBQebyyy8333zzTaU+n3Op7LXn6gSq7M+xsLDQPProoyYyMtL4+/ub3r17m7Vr15r4+HiXbiZjjElPTzcPP/ywSUhIML6+vqZRo0amW7duZurUqeb06dOO61TJbqaKn/25HhfbDQVUN5sxTvdZAQAAPAxzZgAAgEcjzAAAAI9GmAEAAB6NMAMAADwaYQYAAHg0wgwAAPBodX7RvLKyMqWkpCgkJKRGlnMHAADVzxijnJwcxcbGumwKey51PsykpKQoLi7O6jIAAEAVJCcn/+KK1XU+zFRsOJecnHzWkt8AAMA9ZWdnKy4u7oIbx1ao82GmYmgpNDSUMAMAgIepzBQRJgADAACPRpgBAAAejTADAAA8GmEGAAB4NMIMAADwaIQZAADg0QgzAADAoxFmAACARyPMAAAAj0aYAQAAHo0wAwAAPBphBgAAeDTCzCU4cCJXKVn5VpcBAEC9Rpipouc/26khryzXu2sPWl0KAAD1GmGmijo3DZMkLf8x3eJKAACo3wgzVTQwsbG8bNKu4zkMNQEAYCHCTBU1DPJTUlwDSdKK3dydAQDAKoSZSzCkTaQkafmuNIsrAQCg/iLMXILBbRpLklbvOaGikjKLqwEAoH4izFyCjrFhigj2U25RqTYeOml1OQAA1EuEmUvg5WXTwMTyuzMrdjFvBgAAKxBmLtHgthXzZggzAABYgTBziQYmRtCiDQCAhQgzl6hBoJ+6nGnR5u4MAAC1jzBTDQbTog0AgGUIM9WgYr2Zb/bSog0AQG0jzFSDDrGhP7VoH6RFGwCA2kSYqQZeXjYNbF3eor2crQ0AAKhVhJlqwrwZAACsQZipJhUt2ruPn9ZRWrQBAKg1hJlq0iDQT12bNZTE3RkAAGoTYaYaDa6YN8N6MwAA1BrCTDWqmDezhhZtAABqDWGmGpW3aNtp0QYAoBYRZqqRl5dNg84MNS1j3gwAALWCMFPNBrdh3gwAALWJMFPNBpxp0d6TRos2AAC1gTBTzRoE+ukyWrQBAKg1hJkaUDHUtOxHhpoAAKhphJka4GjR3ndChSWlFlcDAEDdRpipAe1jylu084pKtfFgptXlAABQpxFmaoCXl82pq4l5MwAA1CTCTA1xzJuhRRsAgBpFmKkhA1o1lpdN2pt2Wkcy86wuBwCAOoswU0PCAn2dWrS5OwMAQE0hzNSgIW3Lu5oIMwAA1BzCTA2q2KeJFm0AAGoOYaYGd
YgNVeOQ8hbtDQdo0QYAoCYQZmqQzfbTLtq0aAMAUDMIMzVsyJnVgJcRZgAAqBGEmRrWPzFC3l427UvPVfJJWrQBAKhuhJkaFhbgq8uaNZAkLd9NVxMAANWNMFMLKjaeXMFQEwAA1Y4wUwsqtjb4Zm+GCopp0QYAoDoRZmpB+5hQRYbYlV9cqg0HT1pdDgAAdQphpha4tmgzbwYAgOpEmKklP21twLwZAACqk9uEmenTp8tms2ny5MmOY8YYPfPMM4qNjVVAQIAGDx6sHTt2WFfkJejXihZtAABqgluEmQ0bNmjWrFnq3Lmzy/GXX35Zr776qmbMmKENGzYoOjpaw4YNU05OjkWVVl1YgK+6OXbR5u4MAADVxfIwc/r0aY0dO1Z//etf1bBhQ8dxY4xee+01TZ06VaNHj1bHjh317rvvKi8vT3PmzLGw4qob1IZ5MwAAVDfLw8yECRM0cuRIXXHFFS7HDxw4oNTUVA0fPtxxzG63a9CgQVqzZs15X6+wsFDZ2dkuD3dRsbXBmn20aAMAUF0sDTNz587Vpk2bNH369LPOpaamSpKioqJcjkdFRTnOncv06dMVFhbmeMTFxVVv0ZegXUyIokLLW7TXH6BFGwCA6mBZmElOTtakSZP0/vvvy9/f/7zX2Ww2l6+NMWcdc/bb3/5Wp06dcjySk5OrreZLRYs2AADVz7Iw89133yktLU3dunWTj4+PfHx8tGLFCv3lL3+Rj4+P447Mz+/CpKWlnXW3xpndbldoaKjLw51UbG2wfDeTgAEAqA6WhZnLL79c27dv15YtWxyP7t27a+zYsdqyZYtatGih6OhoLV261PGcoqIirVixQn379rWq7EtWsYv2/vRcHc6gRRsAgEvlY9Ubh4SEqGPHji7HgoKCFB4e7jg+efJkTZs2TYmJiUpMTNS0adMUGBio2267zYqSq0Wov6+6xTfU+gMntXx3mu7s09zqkgAA8GiWhZnKePzxx5Wfn68HH3xQmZmZ6tWrl5YsWaKQkBCrS7skg9s0Lg8zu9IJMwAAXCKbMcZYXURNys7OVlhYmE6dOuU282d2pmTr6r+skr+vl7b8frj8fb2tLgkAALdyMX+/LV9npj6qaNEuKC6jRRsAgEtEmLGAzWbT4NblXU3L2NoAAIBLQpixyOAzWxusYL0ZAAAuCWHGIv0SI+TjZdP+E7RoAwBwKQgzFqlo0ZZYQA8AgEtBmLFQxWrAy34kzAAAUFWEGQtVzJtZu59dtAEAqCrCjIXaRocoOtRfBcVl+pYWbQAAqoQwYyGbzea4O7OcFm0AAKqEMGOxn8IMLdoAAFQFYcZi/VqVt2gfOJGrQxm5VpcDAIDHIcxYLMS5RZu7MwAAXDTCjBsY0ra8RZt5MwAAXDzCjBuomDezZh8t2gAAXCzCjBtoE1Xeol1YUqZ1+zOsLgcAAI9CmHEDri3azJsBAOBiEGbcRMXWBit2E2YAALgYhBk30a9VuKNF++AJWrQBAKgswoybCPH3VffmFS3adDUBAFBZhBk3UjHUtJyhJgAAKo0w40aGnAkza2nRBgCg0ggzbqR1VLBiwspbtNfSog0AQKUQZtyIc4v2Clq0AQCoFMKMmxnUmq0NAAC4GIQZN9OvVbh8vW06mJGnA7RoAwDwiwgzbibE31fd4xtJ4u4MAACVQZhxQ2xtAABA5RFm3FDFejPr9tOiDQDALyHMuKHWUcGKpUUbAIBKIcy4IZvNpkEVqwH/yLwZAAAuhDDjphzzZtjaAACACyLMuKl+rSLk623TIVq0AQC4IMKMmwq2+6hHc1q0AQD4JYQZN1Yx1LSMFm0AAM6LMOPGnFu084to0QYA4FwIM24sMbK8RbuopEzraNEGAOCcCDNuzGazaXBbNp4EAOBCCDNubnDrn+bNGGMsrgYAAPdDmHFzfc+0aB8+SYs2AADnQphxc64t2nQ1AQDwc4QZDzCkYmsDVgMGAOAshBkPULHeDC3aAACcjTDjAVpFBqtJg
wAVlZRp7f4TVpcDAIBbIcx4gPJdtM9sPMm8GQAAXBBmPIRj3gwt2gAAuCDMeIi+LcPl5+2lwyfztJ8WbQAAHAgzHiLI7qMeCQ0lMdQEAIAzwowH+Wmoia0NAACoQJjxIBUt2t8eOKm8ohKLqwEAwD0QZjxIy8ZOLdr72EUbAACJMONRbDab4+4M82YAAChHmPEwP21tkEaLNgAAIsx4nL6tylu0k0/m06INAIAIMx4n0M9HPRPKd9Fe9iNdTQAAEGY8UMW8mRXsog0AAGHGEw0+M2/m2/20aAMAYGmYmTlzpjp37qzQ0FCFhoaqT58+WrRokeP8+PHjZbPZXB69e/e2sGL30LJxkJo2DFBRKS3aAABYGmaaNm2qF198URs3btTGjRs1dOhQXXfdddqxY4fjmhEjRujYsWOOx3//+18LK3YPzi3ay1gNGABQz/lY+eajRo1y+fqFF17QzJkztW7dOnXo0EGSZLfbFR0dbUV5bm1w60i9v+6wYxdtm81mdUkAAFjCbebMlJaWau7cucrNzVWfPn0cx5cvX67IyEi1bt1a99xzj9LSuBMh/dSifSQzX/vSadEGANRflt6ZkaTt27erT58+KigoUHBwsD755BO1b99eknTVVVfppptuUnx8vA4cOKCnnnpKQ4cO1XfffSe73X7O1yssLFRhYaHj6+zs7Fr5PmpboJ+PerVopFV7Tmj5rjS1igy2uiQAACxh+Z2ZNm3aaMuWLVq3bp0eeOABjRs3Tjt37pQk3XLLLRo5cqQ6duyoUaNGadGiRdq9e7c+//zz877e9OnTFRYW5njExcXV1rdS6wa1ZmsDAAAsDzN+fn5q1aqVunfvrunTpyspKUl//vOfz3ltTEyM4uPjtWfPnvO+3m9/+1udOnXK8UhOTq6p0i1X0aK9/sBJ5RbSog0AqJ8sDzM/Z4xxGSZylpGRoeTkZMXExJz3+Xa73dHqXfGoq1o2DlJcI1q0AQD1m6Vh5sknn9SqVat08OBBbd++XVOnTtXy5cs1duxYnT59Wo899pjWrl2rgwcPavny5Ro1apQiIiJ0ww03WFm227DZbBrcuvzuDC3aAID6ytIJwMePH9cdd9yhY8eOKSwsTJ07d9bixYs1bNgw5efna/v27XrvvfeUlZWlmJgYDRkyRPPmzVNISIiVZbuVwW0a65/rDtGiDQCotywNM3//+9/Pey4gIEBffPFFLVbjmfq0LG/RPpqVr33pp9UqkqAHAKhf3G7ODC5ORYu2RFcTAKB+IszUARVdTcybAQDUR4SZOqBin6YNBzJp0QYA1DuEmTqgRcRPLdpraNEGANQzhJk6wGazaciZoablDDUBAOoZwkwdUTHUVNGiDQBAfUGYqSP6tIiQn095i/betNNWlwMAQK0hzNQRAX7e6pVAizYAoP4hzNQhjnkzu5k3AwCoPwgzdUjFvJn1B04qu6DY4moAAKgdhJk6JCEiSK0ig1VcavTsgp1WlwMAQK0gzNQhNptNL1zfUV426aNNRzR/81GrSwIAoMYRZuqYXi3CNXFooiTpd/O/16GMXIsrAgCgZhFm6qCJQ1upR/OGOl1Yoof/tVlFJWVWlwQAQI0hzNRBPt5eem1MV4X6+2jrkVN6deluq0sCAKDGEGbqqCYNAvTSjZ0lSW+t2KdVe1h7BgBQNxFm6rCrOsXotl7NJEmP/HurMk4XWlwRAADVjzBTxz01sr0SI4OVnlOoxz7cyr5NAIA6hzBTxwX4eev127rKz8dLy3ala/Y3B60uCQCAakWYqQfaRofqqZHtJEkvLvpR3x89ZXFFAABUH8JMPXF773gNax+lotIyPfyvzcotLLG6JAAAqgVhpp6w2Wx6+cbOig711/4TuXp24Q6rSwIAoFoQZuqRhkF+em1MF9ls0r83HtGCrSlWlwQAwCUjzNQzvVuEa+KQVpKkqR9vV/LJPIsrAgDg0hBm6qGHL09Ut/iGyiks0cNzN6u4lO0OAACeizBTD/l4e
+nPY7ooxN9Hmw9n6bUv2e4AAOC5CDP1VNOGgXpxdPl2B28u36c1e09YXBEAAFVDmKnHRnaO0ZgecTJGmjxvi07mFlldEgAAF40wU8/9flR7tWwcpLScQv2G7Q4AAB6IMFPPBfr56PVbL5Oft5e++jFN7645aHVJAABcFMIM1D42VE9e3VaSNO2/P2pnSrbFFQEAUHmEGUiSxvVtrsvbRqqotEwT/7VJeUVsdwAA8AyEGUgq3+7gjzclKTLErn3puXpu4U6rSwIAoFIIM3BoFOSn124p3+5g7oZkfb7tmNUlAQDwiwgzcNG3VYQeHNxSkvTEx9t0JJPtDgAA7q1KYSY5OVlHjhxxfL1+/XpNnjxZs2bNqrbCYJ3JV7RW12YNlFNQoklzt6iE7Q4AAG6sSmHmtttu07JlyyRJqampGjZsmNavX68nn3xSzz33XLUWiNrn6+2lv4zpqhC7j747lKm/fLXH6pIAADivKoWZ77//Xj179pQk/fvf/1bHjh21Zs0azZkzR//4xz+qsz5YJK5RoKaN7iRJen3ZXq3dl2FxRQAAnFuVwkxxcbHsdrsk6csvv9S1114rSWrbtq2OHWPSaF0xKilWN3dvKmOkX8/boky2OwAAuKEqhZkOHTrorbfe0qpVq7R06VKNGDFCkpSSkqLw8PBqLRDWeubaDmrROEip2QV6/KNtbHcAAHA7VQozL730kt5++20NHjxYt956q5KSkiRJCxYscAw/oW4I9PPRX8Z0lZ+3l5buPK731x2yuiQAAFzYTBX/U7u0tFTZ2dlq2LCh49jBgwcVGBioyMjIaivwUmVnZyssLEynTp1SaGio1eV4rHdWH9Bzn+2Un4+XFjzUT22j+SwBADXnYv5+V+nOTH5+vgoLCx1B5tChQ3rttde0a9cutwoyqD539WuuoW0jVVRSpolzNiu/qNTqkgAAkFTFMHPdddfpvffekyRlZWWpV69e+tOf/qTrr79eM2fOrNYC4R5sNpv++KvOahxi156003r+c7Y7AAC4hyqFmU2bNmnAgAGSpP/85z+KiorSoUOH9N577+kvf/lLtRYI9xEebHdsdzDn28NatJ3ONQCA9aoUZvLy8hQSEiJJWrJkiUaPHi0vLy/17t1bhw4xQbQu69cqQvcPKt/uYMpH23Q0K9/iigAA9V2VwkyrVq00f/58JScn64svvtDw4cMlSWlpaUyyrQceGdZaSXENlF1QoslzN7PdAQDAUlUKM7///e/12GOPqXnz5urZs6f69OkjqfwuTdeuXau1QLgfX28vvT6mq4LtPtpwMFOvf73X6pIAAPVYlVuzU1NTdezYMSUlJcnLqzwTrV+/XqGhoWrbtm21FnkpaM2uOZ9uOapJc7fIyybNvbePeiY0srokAEAdcTF/v6scZiocOXJENptNTZo0uZSXqTGEmZr12Idb9Z/vjig2zF//nTRADQL9rC4JAFAH1Pg6M2VlZXruuecUFham+Ph4NWvWTA0aNNDzzz+vsjLmT9Qnz17bQQkRQUo5VaAnPtrOdgcAgFpXpTAzdepUzZgxQy+++KI2b96sTZs2adq0aXr99df11FNPVXeNcGNBdh+9fmtX+XrbtHhHquasP2x1SQCAeqZKw0yxsbF66623HLtlV/j000/14IMP6ujRo9VW4KVimKl2/G3Vfv3h8x9k9/HSwon91ToqxOqSAAAerMaHmU6ePHnOSb5t27bVyZMnq/KS8HB390vQ4DaNVXhmu4OCYrY7AADUjiqFmaSkJM2YMeOs4zNmzFDnzp0vuSh4Hi8vm165KUkRwXbtOp6jFz7/weqSAAD1hE9VnvTyyy9r5MiR+vLLL9WnTx/ZbDatWbNGycnJ+u9//1vdNcJDRATb9erNSbrznfX657pD6p8YoSs7RFtdFgCgjqvSnZlBgwZp9+7duuGGG5SVlaWTJ09q9OjR2rFjh2bPnl3dNcKDDGzdWPcNbCFJevw/25TCdgcAgBpWpTAjlU8Cf
uGFF/TRRx/p448/1h/+8AdlZmbq3XffrfRrzJw5U507d1ZoaKhCQ0PVp08fLVq0yHHeGKNnnnlGsbGxCggI0ODBg7Vjx46qloxa8ujwNurcNEyn8os1ed4WlZbRrg0AqDlVDjPVoWnTpnrxxRe1ceNGbdy4UUOHDtV1113nCCwvv/yyXn31Vc2YMUMbNmxQdHS0hg0bppycHCvLxi/w8/HSX8Z0VZCft9YfOKk3lrHdAQCg5lgaZkaNGqWrr75arVu3VuvWrfXCCy8oODhY69atkzFGr732mqZOnarRo0erY8eOevfdd5WXl6c5c+ZYWTYqoXlEkP5wQ0dJ0mtf7tbGg3S5AQBqhqVhxllpaanmzp2r3Nxc9enTRwcOHFBqaqpjR25JstvtGjRokNasWWNhpaisG7o21eiuTVRmpElzt+hUXrHVJQEA6qCL6mYaPXr0Bc9nZWVddAHbt29Xnz59VFBQoODgYH3yySdq3769I7BERUW5XB8VFaVDhw6d9/UKCwtVWFjo+Do7O/uia0L1ee76jtp0OFMHM/J03/sbNXNsNzUMYv8mAED1uag7M2FhYRd8xMfH684777yoAtq0aaMtW7Zo3bp1euCBBzRu3Djt3LnTcd5ms7lcb4w565iz6dOnu9QUFxd3UfWgegXbffT6rZcp0M9b6/af1KgZq/X90VNWlwUAqEMuedfs6nbFFVeoZcuWmjJlilq2bKlNmzapa9eujvPXXXedGjRocN6uqXPdmYmLi2M7A4vtSs3Rvf/cqEMZebL7eOmlGzvr+q7uudM6AMB6Nb6dQU0yxqiwsFAJCQmKjo7W0qVLHeeKioq0YsUK9e3b97zPt9vtjlbviges1yY6RAsm9HdseTB53hY9t3CnikvZZR0AcGmqtAJwdXnyySd11VVXKS4uTjk5OZo7d66WL1+uxYsXy2azafLkyZo2bZoSExOVmJioadOmKTAwULfddpuVZaOKwgJ99fdxPfTal7v1+td79c43B7Qj5ZTeGHuZIoLtVpcHAPBQloaZ48eP64477tCxY8cUFhamzp07a/HixRo2bJgk6fHHH1d+fr4efPBBZWZmqlevXlqyZIlCQtiR2VN5e9n06PA26hAbpkf/vUXfHjipUa+v1lu3d1NSXAOrywMAeCC3mzNT3S5mzA21a29aju7953fan54rPx8vvXB9R93UnQnbAAAPnzOD+qNVZIjmT+inK9pFqaikTL/5zzY9Nf97FZUwjwYAUHmEGVgq1N9Xs+7opkeGtZbNJv1z3SHd9td1SsspsLo0AICHIMzAcl5eNj18eaL+dmd3hdh9tPFQpka9vlqbDmdaXRoAwAMQZuA2Lm8XpU8f6qfEyGAdzy7ULW+v1ZxvD1tdFgDAzRFm4FZaNA7WJxP66aqO0SouNXryk+367cfbVFhSanVpAAA3RZiB2wm2++jNsZfp8RFtZLNJ/1qfrFveXqfUU8yjAQCcjTADt2Sz2fTg4FaaPb6HwgJ8tSU5S9e8vlobDp60ujQAgJshzMCtDW4TqYUP9Vfb6BCdOF2oW2et03trD6qOL48EALgIhBm4vWbhgfr4wb4alRSrkjKj33+6Q499uE0FxcyjAQAQZuAhAv189JcxXfS7ke3kZZM+2nREN721Vkez8q0uDQBgMcIMPIbNZtP/Dmihf/5PLzUM9NX2o6c06vXVWrPvhNWlAQAsRJiBx+nXKkILJ/ZXh9hQncwt0h1/X6+/rdrPPBoAqKcIM/BITRsG6qMH+mp01yYqLTP6w+c/aPK8LcovYh4NANQ3hBl4LH9fb/3p5iQ9M6q9vL1s+nRLim6cuUbJJ/OsLg0AUIsIM/BoNptN4/sl6IP/7aWIYD/tPJatUTNWa9WedKtLAwDUEsIM6oTeLcK1cGJ/JcU1UFZesca9s15vrdjHPBoAqAcIM6gzYsICNO/e3rq5e1OVGenFRT/qoTmblVtYYnVpAIAaRJhBneLv662XbuysP1zfUb7eN
n2+/ZhGv7lGB0/kWl0aAKCGEGZQ59hsNt3eO15z7+2txiF27Tqeo2tnrNayH9OsLg0AUAMIM6izusU30mcT++uyZg2UXVCiu9/doBlf71FZGfNoAKAuIcygTosK9dfce/tobK9mMkZ6Zclu3f/+d8opKLa6NABANSHMoM7z8/HSCzd00ks3dpKft5eW7Dyu69/4RvvST1tdGgCgGhBmUG/c0qOZ/n1/H0WH+mtfeq6um/GNlu48bnVZAIBLRJhBvdIlroEWTuyvngmNdLqwRPe8t1GP/nur0nMKrS4NAFBFhBnUO41D7Prgf3vp7n4JkqSPNh3R0D8t17trDqqktMzi6gAAF4swg3rJ19tLvx/VXh8/2Fcdm4Qqp6BETy/YoVEzvtHGgyetLg8AcBFspo6v956dna2wsDCdOnVKoaGhVpcDN1RaZvSv9Yf1xy926VR+eZfTjZc11RNXtVXjELvF1QFA/XQxf7+5M4N6z9urfJG9ZY8N1pgecZIYegIAT8KdGeBnNh3O1O8//V7fH82WJLWLCdXz13VQ9+aNLK4MAOqPi/n7TZgBzoGhJwCwFsNMwCVi6AkAPAd3ZoBKYOgJAGoXw0xOCDOoLgw9AUDtYZgJqAEVQ09fPzpIt3Rn6AkA3AV3ZoAqYugJAGoOw0xOCDOoSQw9AUDNYJgJqCUXGnr6xzcHGHoCgFrAnRmgGjH0BADVg2EmJ4QZ1DaGngDg0jHMBFiIoScAqF3cmQFqGENPAHDxGGZyQpiBOygtM5qz/rBeYegJACqFYSbAzXh72XQHQ08AUCO4MwNYgKEnALgwhpmcEGbgrhh6AoDzY5gJ8AAXGnr668r9yisqsbhCAPAM3JkB3MTPh54aBfnpf/on6I4+8Qr197W4OgCoXQwzOSHMwJOUlhn957tkvbl8nw5l5EmSQvx9NK5Pc93dP0GNgvwsrhAAagdhxglhBp6opLRMn207pjeW7dWetNOSpABfb43t1Uz3DGyhqFB/iysEgJpFmHFCmIEnKyszWrLzuGYs2+MYfvLz9tJN3Zvq/kEtFdco0OIKAaBmEGacEGZQFxhjtGJ3ut5YtlcbDmZKKp9AfH2XJnpwSEu1bBxscYUAUL0IM04IM6hrvt2foRnL9mrVnhOSJJtNurpjjCYMaaX2sfw7DqBuIMw4IcygrtqanKUZy/Zq6c7jjmOXt43UhKGtdFmzhhZWBgCXjjDjhDCDuu7H1Gy9sWyfPt+WorIz/2/u2zJcDw1tpT4twmWz2awtEACqgDDjhDCD+uLAiVzNXL5XH286qpIzqeayZg300NBWGtImklADwKMQZpwQZlDfHMnM06yV+zV3Q7KKSso3sOwQG6oJQ1ppRIdoeXkRagC4P8KME8IM6qu0nAL9fdUB/XPdIeUVlUqSWjYO0oODW+naLrHy9WY3EwDuy2P2Zpo+fbp69OihkJAQRUZG6vrrr9euXbtcrhk/frxsNpvLo3fv3hZVDHiOyBB//fbqdvpmylA9fHmiQv19tC89V49+uFVD/7RcH3x7SIUlpVaXCQCXzNI7MyNGjNCYMWPUo0cPlZSUaOrUqdq+fbt27typoKAgSeVh5vjx45o9e7bjeX5+fmrUqFGl3oM7M0C5nIJivb/usP62ar8ycoskSVGhdt0zoIVu69VMgX4+FlcIAD/x2GGm9PR0RUZGasWKFRo4cKCk8jCTlZWl+fPnV+k1CTOAq/yiUs3dcFizVu7XsVMFktjUEoD78Zhhpp87deqUJJ1112X58uWKjIxU69atdc899ygtLc2K8oA6IcDPW3f1S9Dy3wzWi6M7qVmjQJ3MLdIfv9ilfi9+rT8t2aWTZ+7cAIAncJs7M8YYXXfddcrMzNSqVascx+fNm6fg4GDFx8frwIEDeuqpp1RSUqLvvvtOdrv9rNcpLCxUYWGh4+vs7GzFxcVxZwY4Dza1BOCOPHKYacKECfr888+1evVqNW3a9LzXHTt2TPHx8Zo7d65Gjx591
vlnnnlGzz777FnHCTPAhZVvapmqGcv2sqklAMt5XJiZOHGi5s+fr5UrVyohIeEXr09MTNT//u//asqUKWed484McGkutKnl+L7N1bFJKAvwAahxFxNmLG1fMMZo4sSJ+uSTT7R8+fJKBZmMjAwlJycrJibmnOftdvs5h58AVI7NZtPgNpEa3CbSZVPLjzYd0Uebjqh9TKjG9IzTdV2aKCyAycIArGfpnZkHH3xQc+bM0aeffqo2bdo4joeFhSkgIECnT5/WM888oxtvvFExMTE6ePCgnnzySR0+fFg//PCDQkJCfvE96GYCLt2W5Cy9s/qAFu9IdawqbPfx0shOMbqlR5x6JjTibg2AauUxw0zn++U3e/ZsjR8/Xvn5+br++uu1efNmZWVlKSYmRkOGDNHzzz+vuLi4Sr0HYQaoPll5Rfpk81HNXZ+sXcdzHMdbRATp5h5xuvGypmocwp1RAJfOY8JMbSDMANXPGKOtR05p3obDWrAlRblntkvw8bLp8naRGtOzmQYmNpY3+0ABqCLCjBPCDFCzcgtL9Nm2FM3dkKzNh7Mcx2PC/HVT9zjd3L2pmjakEwrAxSHMOCHMALVnV2qO5m1I1sebjygrr1iSZLNJ/VtFaEyPZhrWPkp+Pm61VicAN0WYcUKYAWpfQXGplu48rrkbDuubvRmO442C/HTjZU10S484tYr85Qn8AOovwowTwgxgrcMZefr3xmR9+F2yjmf/tAZU9/iGuqVHnEZ2jmGTSwBnIcw4IcwA7qGktEwrdqfrX+uTtWxXmkrLyn/1hNh9dG2XWI3p0YwF+QA4EGacEGYA93M8u0D/+e6I5m1I1uGTeY7jjgX5kpooLJAF+YD6jDDjhDADuK+yMqN1BzI0d32yFn+fqqLSnxbku7pTjMawIB9QbxFmnBBmAM+QmVuk+VvOXpAvISJIt7AgH1DvEGacEGYAz/KLC/L1aKaBrVmQD6jrCDNOCDOA5/qlBflu6tZUcY1YkA+oiwgzTggzQN1wvgX5+rWM0LVJsbqyQzSThoE6hDDjhDAD1C0FxaVasvO45v1sQT5fb5sGtY7UqKQYDWsfxdo1gIcjzDghzAB11+GMPC3clqIFW1JcJg0H+Hrr8naRujYpVoPaNJbdx9vCKgFUBWHGCWEGqB92peZo4dYULdyWokMZP61dE+Lvoys7ROvapFj1bRkuH2/2hgI8AWHGCWEGqF+MMdp25JQWbk3RZ9uOKTW7wHEuPMhPV3eK0aikWHWPbygvOqIAt0WYcUKYAeqvsjKjDQdPauG2FP13e6pO5hY5zsWE+euazuXBplOTMBbmA9wMYcYJYQaAJBWXlmnNvgwt2JKiJTtSlVNY4jjXPDxQo5JidW1SrBKj2M0bcAeEGSeEGQA/V1BcquW70rVwW4q++uG4CorLHOfaRodoVFKsRnWOVbNw1rABrEKYcUKYAXAhuYUl+vKH41q4NUUrdqeruPSnX4ld4hpoVFKsrukco6hQfwurBOofwowTwgyAysrKK9IXO1K1YGuK1u7LUNmZ3442m9QroZGuTWqiqzpGq2GQn7WFAvUAYcYJYQZAVaTlFOi/245p4bZj+u5QpuO4j5dNAxIjNCopVsPaRynEn1WHgZpAmHFCmAFwqZJP5unz7ce0YEuKdh7Ldhy3+3hpaNtIjUqK1dC2kfL3ZXE+oLoQZpwQZgBUp71pp8sX59uaov0nch3Hg/y8NfzM4nz9EyPky+J8wCUhzDghzACoCcYY7TyWrQVbU/TZ1mM6mpXvONcg0FdXdYzWqKRY9UoIlzeL8wEXjTDjhDADoKaVlRltTs7Uwq3H9Nm2YzpxutBxLjLErpGdY3RtUqy6xDVgcT6gkggzTggzAGpTSWmZvj1wUgu2pGjR98eUXfDT4nxxjQI0qnOsru0Sq7bR/D4CLoQw44QwA8AqRSVlWrm7fHG+pTuPK6+o1HGudVSwRnWO1aikWDWPCLKwSsA9EWacEGYAuIO8ohJ99
UOaFm5N0fJd6Soq/WnV4c5Nw3RtUqxGdo5RTFiAhVUC7oMw44QwA8DdnMov1pIzi/Ot2Zeh0jOr89lsUo/mjXRtUqyu6hit8GC7xZUC1iHMOCHMAHBnJ04XatH2Y1qwNUUbDv60OJ+3l039W5UvzndlBxbnQ/1DmHFCmAHgKVKy8vXZthQt2Jqi74/+tDifn4+XhrYpX5zv8nYszof6gTDjhDADwBPtTz+thVuPacHWo9qXfvbifKOSYtS/VWP5+bA4H+omwowTwgwAT2aM0Q/HcrRwW/mqw0cyWZwP9QNhxglhBkBdYYzRpsNZWrg1RZ9vP6b0HBbnQ91FmHFCmAFQF5WWGX27P0MLtqZo0fepOpVf7DjnvDhfm6gQgg08EmHGCWEGQF1XVFKmVXvStWDr2YvzJUYG69okFueD5yHMOCHMAKhP8otK9dWPx7Vwa4qW/ei6OF+nJmG6skOUhneIVmJkMHds4NYIM04IMwDqq+yCYn3xfaoWbjumb/aecCzOJ0kJEUEa3j5KwztEqWtcQ3kxeRhuhjDjhDADAFLG6UIt3XlcS3Ye1+q9J1RU8tMdm4hgu4a1j9Tw9tHq0zKcdWzgFggzTggzAODqdGGJVu5O15IdqfrqxzTlOO3sHeTnrcFtIjW8Q5QGt4lUWAArD8MahBknhBkAOL+ikjKtP3BSX+xI1dKdx5WaXeA45+NlU5+W4RreIVrD2kUpOszfwkpR3xBmnBBmAKByysqMth89pSU7U7Vkx3HtSTvtcj4proGGt4/SlR2i1LIxE4hRswgzTggzAFA1+9NPO+bZbDqcKee/Fi0igjSsQ5SGt49W17gGTCBGtSPMOCHMAMClS8sp0Jc707RkZ6rW7M1wafluHGLXFe3KO6P6tgyX3YcJxLh0hBknhBkAqF45BcVasTtdS3Yc17If05RT+NME4mC7jwa3aazhHaI1uE1jhfozgRhVQ5hxQpgBgJpTVFKmdfszHPNs0pz2i/L1tqlPywgNbx+lYe2jFBXKBGJUHmHGCWEGAGpHWZnR1iNZWrLzuJbsSNW+9FyX813iGujKDtEafmYCMXAhhBknhBkAsMbetIoJxKnafDjL5VzLxkEa3iFaw9tHKakpE4hxNsKME8IMAFgvLbtAS384riU7jmvNvhMqLv3pT0+jID/1bxWh/okRGpAYoZiwAAsrhbsgzDghzACAe8kuKNaKXen6Ykeqlu9K12mnCcSS1CoyWAPOBJteCeEKsvtYVCmsRJhxQpgBAPdVXFqmzYeztGpPulbtOaFtR7LktB+mfL1tuqxZQw1s3Vj9W0WoY5MweTMkVS8QZpwQZgDAc5zKK9aafSe0cs8JrdqTriOZ+S7nGwT6ql+rCA04MyzVtGGgRZWiphFmnBBmAMAzGWN0KCNPq/ae0Krd6Vq7L8NlTRupfCXiAYkR6p/YWL1bNFII69rUGYQZJ4QZAKgbSkrLtPVIllbtOaFVe05oS3KWSp3GpHy8bOrarIEGJDZW/8QIdW4SJh9vLwsrxqUgzDghzABA3ZRdUKy1+zK0ak+6Vu85oYMZeS7nQ/191LdlhAa0jtCAVo3VLJwhKU9CmHFCmAGA+iH5ZN6Zuzbp+mbvCWUXuA5JxYcHlg9JtWqsPi3DFRbAkJQ7I8w4IcwAQP1TWma07UiWVp8Zktp0OFMlTkNS3l42JTUN04DExhqQGKEucQ0YknIzHhNmpk+fro8//lg//vijAgIC1LdvX7300ktq06aN4xpjjJ599lnNmjVLmZmZ6tWrl9544w116NChUu9BmAEAnC4s0bp9GVq994RW7knX/p9ttRBi91HvluEaeGYycfPwQNlstIBbyWPCzIgRIzRmzBj16NFDJSUlmjp1qrZv366dO3cqKChIkvTSSy/phRde0D/+8Q+1bt1af/jDH7Ry5Urt2rVLISEhv/gehBkAwM8dzcrX6jNr23yz94Qy84pdzjdtGKABiY01MDFCfVtGKCyQIana5jFh5
ufS09MVGRmpFStWaODAgTLGKDY2VpMnT9aUKVMkSYWFhYqKitJLL72k++677xdfkzADALiQsjKjHSnZWnlmIvHGQyddtlvwskmdmzbQwMQIDWjdWF3iGsiXIaka57FhZu/evUpMTNT27dvVsWNH7d+/Xy1bttSmTZvUtWtXx3XXXXedGjRooHffffcXX5MwAwC4GHlFJfp2/0nHZOI9aaddzgfbfdSHIakadzF/v91mwwtjjB555BH1799fHTt2lCSlpqZKkqKiolyujYqK0qFDh875OoWFhSosLHR8nZ2dXUMVAwDqokA/Hw1pG6khbSMlScdO5TvWtlm9J12ZecVauvO4lu48LokhKXfgNmHmoYce0rZt27R69eqzzv088RpjzpuCp0+frmeffbZGagQA1D8xYQG6uXucbu4e5xiSWrU3Xat2lw9JHcnM17/WH9a/1h9mSMoibjHMNHHiRM2fP18rV65UQkKC43hVhpnOdWcmLi6OYSYAQLWrGJJaeWYy8V6GpKqNxwwzGWM0ceJEffLJJ1q+fLlLkJGkhIQERUdHa+nSpY4wU1RUpBUrVuill14652va7XbZ7fYarx0AAIak3IOld2YefPBBzZkzR59++qnL2jJhYWEKCAiQVN6aPX36dM2ePVuJiYmaNm2ali9fTms2AMCtVaZLKimugQa0YkjqXDymm+l8t9pmz56t8ePHS/pp0by3337bZdG8iknCv4QwAwBwB7mFJVp/gCGpyvKYMFMbCDMAAHeUkpVfvt3C3p+GpJzV9yEpwowTwgwAwN05D0mt2pOu7w5lnnNIql/LCPVpGa5u8Q3l7+ttYcU1jzDjhDADAPA0uYUl+vZAhmMy8c+HpPy8vdSlWQP1aRGu3i3C1bVZgzoXbggzTggzAABPVzEktWbfCa3dn6Hj2YUu5+0+XrqsWUP1aRmuPi3DldS0gfx8PHsyMWHGCWEGAFCXGGN0MCNPa/dlaO3+DK3dl6ETp13Djb+vl7rHN1KfluV3bjo3DfO4TinCjBPCDACgLjPGaF96rtbuz9C6fRlatz9DGblFLtcE+nmre/NG6tOi/M5Nx9hQ+bh5uCHMOCHMAADqE2OM9qSdLr9zsy9D6w5kKOtnnVLBdh/1aH5mWKpFhNrHhsrby73awAkzTggzAID6rKzMaNfxHMew1Lf7M5RdUOJyTai/j3omhKt3i/KhqXbRofKyONwQZpwQZgAA+ElpmdEPx7K17sx8m/UHTiqn0DXcNAj0Va+ERup9ZliqdWRIrYcbwowTwgwAAOdXUlqmHSlnws3+DG04cFK5RaUu1zQK8iu/a3OmFbxVZHCNr05MmHFCmAEAoPKKS8u0/eip8vk2+zO08WCm8otdw01EsN0xJNW7RbhaRARVe7ghzDghzAAAUHVFJWXadiTLMZl448FMFZaUuVwzpkecXryxc7W+78X8/fap1ncGAAB1ip+Pl7o3b6TuzRtpohJVWFKqLYezylvB92do0+EsdWgSZmmNhBkAAFBpdh9v9WoRrl4twiVJBcWlKrN4kIcwAwAAqswd9oRy7+X/AAAAfgFhBgAAeDTCDAAA8GiEGQAA4NEIMwAAwKMRZgAAgEcjzAAAAI9GmAEAAB6NMAMAADwaYQYAAHg0wgwAAPBohBkAAODRCDMAAMCj1flds82Zbcmzs7MtrgQAAFRWxd/tir/jF1Lnw0xOTo4kKS4uzuJKAADAxcrJyVFYWNgFr7GZykQeD1ZWVqaUlBSFhITIZrNV62tnZ2crLi5OycnJCg0NrdbXrmv4rCqPz6ry+Kwqj8+q8visKq8mPytjjHJychQbGysvrwvPiqnzd2a8vLzUtGnTGn2P0NBQ/oWvJD6ryuOzqjw+q8rjs6o8PqvKq6nP6pfuyFRgAjAAAPBohBkAAODRCDOXwG636+mnn5bdbre6FLfHZ1V5fFaVx2dVeXxWlcdnVXnu8lnV+QnAAACgbuPODAAA8GiEGQAA4NEIMwAAw
KMRZgAAgEcjzFTRm2++qYSEBPn7+6tbt25atWqV1SW5nenTp6tHjx4KCQlRZGSkrr/+eu3atcvqsjzC9OnTZbPZNHnyZKtLcVtHjx7V7bffrvDwcAUGBqpLly767rvvrC7L7ZSUlOh3v/udEhISFBAQoBYtWui5555TWVmZ1aVZbuXKlRo1apRiY2Nls9k0f/58l/PGGD3zzDOKjY1VQECABg8erB07dlhTrMUu9FkVFxdrypQp6tSpk4KCghQbG6s777xTKSkptVYfYaYK5s2bp8mTJ2vq1KnavHmzBgwYoKuuukqHDx+2ujS3smLFCk2YMEHr1q3T0qVLVVJSouHDhys3N9fq0tzahg0bNGvWLHXu3NnqUtxWZmam+vXrJ19fXy1atEg7d+7Un/70JzVo0MDq0tzOSy+9pLfeekszZszQDz/8oJdffll//OMf9frrr1tdmuVyc3OVlJSkGTNmnPP8yy+/rFdffVUzZszQhg0bFB0drWHDhjn2/KtPLvRZ5eXladOmTXrqqae0adMmffzxx9q9e7euvfba2ivQ4KL17NnT3H///S7H2rZta5544gmLKvIMaWlpRpJZsWKF1aW4rZycHJOYmGiWLl1qBg0aZCZNmmR1SW5pypQppn///laX4RFGjhxp7r77bpdjo0ePNrfffrtFFbknSeaTTz5xfF1WVmaio6PNiy++6DhWUFBgwsLCzFtvvWVBhe7j55/Vuaxfv95IMocOHaqVmrgzc5GKior03Xffafjw4S7Hhw8frjVr1lhUlWc4deqUJKlRo0YWV+K+JkyYoJEjR+qKK66wuhS3tmDBAnXv3l033XSTIiMj1bVrV/31r3+1uiy31L9/f3311VfavXu3JGnr1q1avXq1rr76aosrc28HDhxQamqqy+96u92uQYMG8bu+Ek6dOiWbzVZrd0vr/EaT1e3EiRMqLS1VVFSUy/GoqCilpqZaVJX7M8bokUceUf/+/dWxY0ery3FLc+fO1aZNm7RhwwarS3F7+/fv18yZM/XII4/oySef1Pr16/Xwww/LbrfrzjvvtLo8tzJlyhSdOnVKbdu2lbe3t0pLS/XCCy/o1ltvtbo0t1bx+/xcv+sPHTpkRUkeo6CgQE888YRuu+22WtuokzBTRTabzeVrY8xZx/CThx56SNu2bdPq1autLsUtJScna9KkSVqyZIn8/f2tLsftlZWVqXv37po2bZokqWvXrtqxY4dmzpxJmPmZefPm6f3339ecOXPUoUMHbdmyRZMnT1ZsbKzGjRtndXluj9/1F6e4uFhjxoxRWVmZ3nzzzVp7X8LMRYqIiJC3t/dZd2HS0tLOSvAoN3HiRC1YsEArV65U06ZNrS7HLX333XdKS0tTt27dHMdKS0u1cuVKzZgxQ4WFhfL29rawQvcSExOj9u3buxxr166dPvroI4sqcl+/+c1v9MQTT2jMmDGSpE6dOunQoUOaPn06YeYCoqOjJZXfoYmJiXEc53f9+RUXF+vmm2/WgQMH9PXXX9faXRmJbqaL5ufnp27dumnp0qUux5cuXaq+fftaVJV7MsbooYce0scff6yvv/5aCQkJVpfkti6//HJt375dW7ZscTy6d++usWPHasuWLQSZn+nXr99Zbf67d+9WfHy8RRW5r7y8PHl5uf6q9/b2pjX7FyQkJCg6Otrld31RUZFWrFjB7/pzqAgye/bs0Zdffqnw8PBafX/uzFTBI488ojvuuEPdu3dXnz59NGvWLB0+fFj333+/1aW5lQkTJmjOnDn69NNPFRIS4ribFRYWpoCAAIurcy8hISFnzSUKCgpSeHg4c4zO4de//rX69u2radOm6eabb9b69es1a9YszZo1y+rS3M6oUaP0wgsvqFmzZurQoYM2b96sV199VXfffbfVpVnu9OnT2rt3r+PrAwcOaMuWLWrUqJGaNWumyZMna9q0aUpMTFRiYqKmTZumwMBA3XbbbRZWbY0LfVaxsbH61a9+pU2bN
umzzz5TaWmp4/d9o0aN5OfnV/MF1krPVB30xhtvmPj4eOPn52cuu+wy2o3PQdI5H7Nnz7a6NI9Aa/aFLVy40HTs2NHY7XbTtm1bM2vWLKtLckvZ2dlm0qRJplmzZsbf39+0aNHCTJ061RQWFlpdmuWWLVt2zt9R48aNM8aUt2c//fTTJjo62tjtdjNw4ECzfft2a4u2yIU+qwMHDpz39/2yZctqpT6bMcbUfGQCAACoGcyZAQAAHo0wAwAAPBphBgAAeDTCDAAA8GiEGQAA4NEIMwAAwKMRZgAAgEcjzACod2w2m+bPn291GQCqCWEGQK0aP368bDbbWY8RI0ZYXRoAD8XeTABq3YgRIzR79myXY3a73aJqAHg67swAqHV2u13R0dEuj4YNG0oqHwKaOXOmrrrqKgUEBCghIUEffvihy/O3b9+uoUOHKiAgQOHh4br33nt1+vRpl2veeecddejQQXa7XTExMXrooYdczp84cUI33HCDAgMDlZiYqAULFtTsNw2gxhBmALidp556SjfeeKO2bt2q22+/Xbfeeqt++OEHSVJeXp5GjBihhg0basOGDfrwww/15ZdfuoSVmTNnasKECbr33nu1fft2LViwQK1atXJ5j2effVY333yztm3bpquvvlpjx47VyZMna/X7BFBNamU7SwA4Y9y4ccbb29sEBQW5PJ577jljTPlu6/fff7/Lc3r16mUeeOABY4wxs2bNMg0bNjSnT592nP/888+Nl5eXSU1NNcYYExsba6ZOnXreGiSZ3/3ud46vT58+bWw2m1m0aFG1fZ8Aag9zZgDUuiFDhmjmzJkuxxo1auT45z59+ric69Onj7Zs2SJJ+uGHH5SUlKSgoCDH+X79+qmsrEy7du2SzWZTSkqKLr/88gvW0LlzZ8c/BwUFKSQkRGlpaVX9lgBYiDADoNYFBQWdNezzS2w2myTJGOP453NdExAQUKnX8/X1Peu5ZWVlF1UTAPfAnBkAbmfdunVnfd22bVtJUvv27bVlyxbl5uY6zn/zzTfy8vJS69atFRISoubNm+urr76q1ZoBWIc7MwBqXWFhoVJTU12O+fj4KCIiQpL04Ycfqnv37urfv78++OADrV+/Xn//+98lSWPHjtXTTz+tcePG6ZlnnlF6eromTpyoO+64Q1FRUZKkZ555Rvfff78iIyN11VVXKScnR998840mTpxYu98ogFpBmAFQ6xYvXqyYmBiXY23atNGPP/4oqbzTaO7cuXrwwQcVHR2tDz74QO3bt5ckBQYG6osvvtCkSZPUo0cPBQYG6sYbb9Srr77qeK1x48apoKBA//d//6fHHntMERER+tWvflV73yCAWmUzxhiriwCACjabTZ988omuv/56q0sB4CGYMwMAADwaYQYAAHg05swAcCuMfAO4WNyZAQAAHo0wAwAAPBphBgAAeDTCDAAA8GiEGQAA4NEIMwAAwKMRZgAAgEcjzAAAAI9GmAEAAB7t/wFQm6Qw92JT5AAAAABJRU5ErkJggg==", "text/plain": [ "<Figure size 640x480 with 1 Axes>" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "\n", "n_epochs_overfit = 13 #Otherwise len(train_lost_list) < n_epochs\n", "plt.plot(range(n_epochs_overfit), train_loss_list)\n", "plt.xlabel(\"Epoch\")\n", "plt.ylabel(\"Loss\")\n", "plt.title(\"Performance of Model 1\")\n", "plt.show()" ] }, { "cell_type": "markdown", "id": "11df8fd4", "metadata": {}, "source": 
[ "Now loading the model with the lowest validation loss value\n" ] }, { "cell_type": "code", "execution_count": 10, "id": "e93efdfc", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Test Loss: 22.235297\n", "\n", "Test Accuracy of airplane: 52% (523/1000)\n", "Test Accuracy of automobile: 84% (849/1000)\n", "Test Accuracy of bird: 34% (341/1000)\n", "Test Accuracy of cat: 43% (432/1000)\n", "Test Accuracy of deer: 66% (662/1000)\n", "Test Accuracy of dog: 44% (448/1000)\n", "Test Accuracy of frog: 74% (746/1000)\n", "Test Accuracy of horse: 64% (647/1000)\n", "Test Accuracy of ship: 83% (836/1000)\n", "Test Accuracy of truck: 64% (649/1000)\n", "\n", "Test Accuracy (Overall): 61% (6133/10000)\n" ] } ], "source": [ "model.load_state_dict(torch.load(\"./model_cifar.pt\"))\n", "\n", "# track test loss\n", "test_loss = 0.0\n", "class_correct = list(0.0 for i in range(10))\n", "class_total = list(0.0 for i in range(10))\n", "\n", "model.eval()\n", "# iterate over test data\n", "for data, target in test_loader:\n", " # move tensors to GPU if CUDA is available\n", " if train_on_gpu:\n", " data, target = data.cuda(), target.cuda()\n", " # forward pass: compute predicted outputs by passing inputs to the model\n", " output = model(data)\n", " # calculate the batch loss\n", " loss = criterion(output, target)\n", " # update test loss\n", " test_loss += loss.item() * data.size(0)\n", " # convert output probabilities to predicted class\n", " _, pred = torch.max(output, 1)\n", " # compare predictions to true label\n", " correct_tensor = pred.eq(target.data.view_as(pred))\n", " correct = (\n", " np.squeeze(correct_tensor.numpy())\n", " if not train_on_gpu\n", " else np.squeeze(correct_tensor.cpu().numpy())\n", " )\n", " # calculate test accuracy for each object class\n", " for i in range(batch_size):\n", " label = target.data[i]\n", " class_correct[label] += correct[i].item()\n", " class_total[label] += 1\n", "\n", "# average test loss\n", 
"test_loss = test_loss / len(test_loader)\n", "print(\"Test Loss: {:.6f}\\n\".format(test_loss))\n", "\n", "for i in range(10):\n", " if class_total[i] > 0:\n", " print(\n", " \"Test Accuracy of %5s: %2d%% (%2d/%2d)\"\n", " % (\n", " classes[i],\n", " 100 * class_correct[i] / class_total[i],\n", " np.sum(class_correct[i]),\n", " np.sum(class_total[i]),\n", " )\n", " )\n", " else:\n", " print(\"Test Accuracy of %5s: N/A (no training examples)\" % (classes[i]))\n", "\n", "print(\n", " \"\\nTest Accuracy (Overall): %2d%% (%2d/%2d)\"\n", " % (\n", " 100.0 * np.sum(class_correct) / np.sum(class_total),\n", " np.sum(class_correct),\n", " np.sum(class_total),\n", " )\n", ")" ] }, { "cell_type": "markdown", "id": "944991a2", "metadata": {}, "source": [ "Build a new network with the following structure.\n", "\n", "- It has 3 convolutional layers of kernel size 3 and padding of 1.\n", "- The first convolutional layer must output 16 channels, the second 32 and the third 64.\n", "- At each convolutional layer output, we apply a ReLU activation then a MaxPool with kernel size of 2.\n", "- Then, three fully connected layers, the first two being followed by a ReLU activation and a dropout whose value you will suggest.\n", "- The first fully connected layer will have an output size of 512.\n", "- The second fully connected layer will have an output size of 64.\n", "\n", "Compare the results obtained with this new network to those obtained previously." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "CNN definition following the structure required above" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Net_1(\n", " (conv1): Conv2d(3, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", " (conv2): Conv2d(16, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", " (conv3): Conv2d(32, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", " (pool): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", " (fc1): Linear(in_features=1024, out_features=512, bias=True)\n", " (fc2): Linear(in_features=512, out_features=64, bias=True)\n", " (fc3): Linear(in_features=64, out_features=10, bias=True)\n", " (dropout): Dropout(p=0.5, inplace=False)\n", ")\n" ] } ], "source": [ "import torch.nn as nn\n", "import torch.nn.functional as F\n", "\n", "# define the CNN architecture\n", "\n", "\n", "class Net_1(nn.Module):\n", " def __init__(self):\n", " super(Net_1, self).__init__()\n", "\n", " #Define the 3 convolutional layers\n", " #First layer : 3 input channels, 16 output channels, kernel size 3, padding 1\n", " self.conv1 = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)\n", " #Second layer : 16 input channels, 32 output channels, kernel size 3, padding 1\n", " self.conv2 = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=3, padding=1)\n", " #Third layer : 32 input channels, 64 output channels, kernel size 3, padding 1\n", " self.conv3 = nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, padding=1)\n", "\n", " #MaxPool with kernel size 2\n", " self.pool = nn.MaxPool2d(2, 2)\n", "\n", " #Define the 3 fully connected layers\n", " #First layer : input of size 64, image dimension 4*4, output of size 512\n", " self.fc1 = nn.Linear(in_features=64 * 4 * 4,out_features=512)\n", " #Second layer : input of size 512, output of size 64\n", " self.fc2 = nn.Linear(512, 
64)\n", " #Third layer : input of size 64, output of size 10\n", " self.fc3 = nn.Linear(64, 10)\n", "\n", " #Dropout\n", " self.dropout = nn.Dropout(p=0.5)\n", "\n", "\n", " def forward(self, x):\n", "\n", " #Through the 3 convolutional layers\n", " x = self.pool(F.relu(self.conv1(x)))\n", " x = self.pool(F.relu(self.conv2(x)))\n", " x = self.pool(F.relu(self.conv3(x)))\n", "\n", " #Linearize the output\n", " x = x.view(-1, 64 * 4 * 4) \n", "\n", " #Through the 3 fully connected layers\n", " x = self.dropout(F.relu(self.fc1(x)))\n", " x = self.dropout(F.relu(self.fc2(x)))\n", " x = self.fc3(x)\n", "\n", " return x\n", "\n", "\n", "# create a complete CNN\n", "model_1 = Net_1()\n", "print(model_1)\n", "# move tensors to GPU if CUDA is available\n", "if train_on_gpu:\n", " model.cuda()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Loss function and training using SGD (Stochastic Gradient Descent) optimizer" ] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch: 0 \tTraining Loss: 45.348058 \tValidation Loss: 41.718214\n", "Validation loss decreased (inf --> 41.718214). Saving model_1 ...\n", "Epoch: 1 \tTraining Loss: 39.649087 \tValidation Loss: 35.754235\n", "Validation loss decreased (41.718214 --> 35.754235). Saving model_1 ...\n", "Epoch: 2 \tTraining Loss: 35.008029 \tValidation Loss: 31.420939\n", "Validation loss decreased (35.754235 --> 31.420939). Saving model_1 ...\n", "Epoch: 3 \tTraining Loss: 32.138094 \tValidation Loss: 28.863286\n", "Validation loss decreased (31.420939 --> 28.863286). Saving model_1 ...\n", "Epoch: 4 \tTraining Loss: 30.218731 \tValidation Loss: 28.003921\n", "Validation loss decreased (28.863286 --> 28.003921). Saving model_1 ...\n", "Epoch: 5 \tTraining Loss: 28.807953 \tValidation Loss: 26.228902\n", "Validation loss decreased (28.003921 --> 26.228902). 
Saving model_1 ...\n", "Epoch: 6 \tTraining Loss: 27.365782 \tValidation Loss: 25.497843\n", "Validation loss decreased (26.228902 --> 25.497843). Saving model_1 ...\n", "Epoch: 7 \tTraining Loss: 26.038266 \tValidation Loss: 23.508494\n", "Validation loss decreased (25.497843 --> 23.508494). Saving model_1 ...\n", "Epoch: 8 \tTraining Loss: 24.863525 \tValidation Loss: 23.421283\n", "Validation loss decreased (23.508494 --> 23.421283). Saving model_1 ...\n", "Epoch: 9 \tTraining Loss: 23.610995 \tValidation Loss: 21.928674\n", "Validation loss decreased (23.421283 --> 21.928674). Saving model_1 ...\n", "Epoch: 10 \tTraining Loss: 22.689530 \tValidation Loss: 21.890606\n", "Validation loss decreased (21.928674 --> 21.890606). Saving model_1 ...\n", "Epoch: 11 \tTraining Loss: 21.605674 \tValidation Loss: 20.122198\n", "Validation loss decreased (21.890606 --> 20.122198). Saving model_1 ...\n", "Epoch: 12 \tTraining Loss: 20.795100 \tValidation Loss: 20.151628\n" ] }, { "ename": "KeyboardInterrupt", "evalue": "", "output_type": "error", "traceback": [ "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", "\u001b[1;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", "\u001b[1;32md:\\Users\\lucil\\Documents\\S9\\Apprentissage profond\\mod_4_6-td2\\TD2 Deep Learning.ipynb Cell 24\u001b[0m line \u001b[0;36m2\n\u001b[0;32m <a href='vscode-notebook-cell:/d%3A/Users/lucil/Documents/S9/Apprentissage%20profond/mod_4_6-td2/TD2%20Deep%20Learning.ipynb#X56sZmlsZQ%3D%3D?line=25'>26</a>\u001b[0m loss \u001b[39m=\u001b[39m criterion(output, target)\n\u001b[0;32m <a href='vscode-notebook-cell:/d%3A/Users/lucil/Documents/S9/Apprentissage%20profond/mod_4_6-td2/TD2%20Deep%20Learning.ipynb#X56sZmlsZQ%3D%3D?line=26'>27</a>\u001b[0m \u001b[39m# Backward pass: compute gradient of the loss with respect to model parameters\u001b[39;00m\n\u001b[1;32m---> <a 
href='vscode-notebook-cell:/d%3A/Users/lucil/Documents/S9/Apprentissage%20profond/mod_4_6-td2/TD2%20Deep%20Learning.ipynb#X56sZmlsZQ%3D%3D?line=27'>28</a>\u001b[0m loss\u001b[39m.\u001b[39mbackward()\n\u001b[0;32m <a href='vscode-notebook-cell:/d%3A/Users/lucil/Documents/S9/Apprentissage%20profond/mod_4_6-td2/TD2%20Deep%20Learning.ipynb#X56sZmlsZQ%3D%3D?line=28'>29</a>\u001b[0m \u001b[39m# Perform a single optimization step (parameter update)\u001b[39;00m\n\u001b[0;32m <a href='vscode-notebook-cell:/d%3A/Users/lucil/Documents/S9/Apprentissage%20profond/mod_4_6-td2/TD2%20Deep%20Learning.ipynb#X56sZmlsZQ%3D%3D?line=29'>30</a>\u001b[0m optimizer\u001b[39m.\u001b[39mstep()\n", "File \u001b[1;32mc:\\Users\\lucil\\anaconda3\\Lib\\site-packages\\torch\\_tensor.py:492\u001b[0m, in \u001b[0;36mTensor.backward\u001b[1;34m(self, gradient, retain_graph, create_graph, inputs)\u001b[0m\n\u001b[0;32m 482\u001b[0m \u001b[39mif\u001b[39;00m has_torch_function_unary(\u001b[39mself\u001b[39m):\n\u001b[0;32m 483\u001b[0m \u001b[39mreturn\u001b[39;00m handle_torch_function(\n\u001b[0;32m 484\u001b[0m Tensor\u001b[39m.\u001b[39mbackward,\n\u001b[0;32m 485\u001b[0m (\u001b[39mself\u001b[39m,),\n\u001b[1;32m (...)\u001b[0m\n\u001b[0;32m 490\u001b[0m inputs\u001b[39m=\u001b[39minputs,\n\u001b[0;32m 491\u001b[0m )\n\u001b[1;32m--> 492\u001b[0m torch\u001b[39m.\u001b[39mautograd\u001b[39m.\u001b[39mbackward(\n\u001b[0;32m 493\u001b[0m \u001b[39mself\u001b[39m, gradient, retain_graph, create_graph, inputs\u001b[39m=\u001b[39minputs\n\u001b[0;32m 494\u001b[0m )\n", "File \u001b[1;32mc:\\Users\\lucil\\anaconda3\\Lib\\site-packages\\torch\\autograd\\__init__.py:251\u001b[0m, in \u001b[0;36mbackward\u001b[1;34m(tensors, grad_tensors, retain_graph, create_graph, grad_variables, inputs)\u001b[0m\n\u001b[0;32m 246\u001b[0m retain_graph \u001b[39m=\u001b[39m create_graph\n\u001b[0;32m 248\u001b[0m \u001b[39m# The reason we repeat the same comment below is that\u001b[39;00m\n\u001b[0;32m 249\u001b[0m 
\u001b[39m# some Python versions print out the first line of a multi-line function\u001b[39;00m\n\u001b[0;32m 250\u001b[0m \u001b[39m# calls in the traceback and some print out the last line\u001b[39;00m\n\u001b[1;32m--> 251\u001b[0m Variable\u001b[39m.\u001b[39m_execution_engine\u001b[39m.\u001b[39mrun_backward( \u001b[39m# Calls into the C++ engine to run the backward pass\u001b[39;00m\n\u001b[0;32m 252\u001b[0m tensors,\n\u001b[0;32m 253\u001b[0m grad_tensors_,\n\u001b[0;32m 254\u001b[0m retain_graph,\n\u001b[0;32m 255\u001b[0m create_graph,\n\u001b[0;32m 256\u001b[0m inputs,\n\u001b[0;32m 257\u001b[0m allow_unreachable\u001b[39m=\u001b[39m\u001b[39mTrue\u001b[39;00m,\n\u001b[0;32m 258\u001b[0m accumulate_grad\u001b[39m=\u001b[39m\u001b[39mTrue\u001b[39;00m,\n\u001b[0;32m 259\u001b[0m )\n", "\u001b[1;31mKeyboardInterrupt\u001b[0m: " ] } ], "source": [ "import torch.optim as optim\n", "\n", "criterion = nn.CrossEntropyLoss() # specify loss function\n", "optimizer = optim.SGD(model_1.parameters(), lr=0.01) # specify optimizer\n", "\n", "n_epochs_1 = 30 # number of epochs to train the model\n", "train_loss_list_1 = [] # list to store loss to visualize\n", "valid_loss_min_1 = np.Inf # track change in validation loss\n", "\n", "for epoch in range(n_epochs_1):\n", " # Keep track of training and validation loss\n", " train_loss = 0.0\n", " valid_loss = 0.0\n", "\n", " # Train the model\n", " model_1.train()\n", " for data, target in train_loader:\n", " # Move tensors to GPU if CUDA is available\n", " if train_on_gpu:\n", " data, target = data.cuda(), target.cuda()\n", " # Clear the gradients of all optimized variables\n", " optimizer.zero_grad()\n", " # Forward pass: compute predicted outputs by passing inputs to the model\n", " output = model_1(data)\n", " # Calculate the batch loss\n", " loss = criterion(output, target)\n", " # Backward pass: compute gradient of the loss with respect to model parameters\n", " loss.backward()\n", " # Perform a single optimization step 
(parameter update)\n", " optimizer.step()\n", " # Update training loss\n", " train_loss += loss.item() * data.size(0)\n", "\n", " # Validate the model\n", " model_1.eval()\n", " for data, target in valid_loader:\n", " # Move tensors to GPU if CUDA is available\n", " if train_on_gpu:\n", " data, target = data.cuda(), target.cuda()\n", " # Forward pass: compute predicted outputs by passing inputs to the model\n", " output = model_1(data)\n", " # Calculate the batch loss\n", " loss = criterion(output, target)\n", " # Update average validation loss\n", " valid_loss += loss.item() * data.size(0)\n", "\n", " # Calculate average losses\n", " train_loss = train_loss / len(train_loader)\n", " valid_loss = valid_loss / len(valid_loader)\n", " train_loss_list_1.append(train_loss)\n", "\n", " # Print training/validation statistics\n", " print(\n", " \"Epoch: {} \\tTraining Loss: {:.6f} \\tValidation Loss: {:.6f}\".format(\n", " epoch, train_loss, valid_loss\n", " )\n", " )\n", "\n", " # Save model if validation loss has decreased\n", " if valid_loss <= valid_loss_min_1:\n", " print(\n", " \"Validation loss decreased ({:.6f} --> {:.6f}). 
Saving model_1 ...\".format(\n", " valid_loss_min_1, valid_loss\n", " )\n", " )\n", " torch.save(model_1.state_dict(), \"model_1_cifar.pt\")\n", " valid_loss_min_1 = valid_loss" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Compare the results with the previous model's results" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [ { "data": { "image/png": "<base64-encoded matplotlib PNG figure omitted>
GzjD28FcogqpNA+yH2D71e3YdHkTtlzZgqz8/2Znb+PeBkMDh+LlBi/DxsRGwiprDoYZDQwzRCSl2w+ysDO6aCyb8JhkaH7j+jlZqINNPRcO0qdLMvIysOHiBvx27jfsvbFXPbWCwkCB3r69MTRwKHrW7QmFAef5qiwMMxoYZohIV9xLz8XuC3ex/Xw8wq4loUBjkD4PO1P1pajGtawhlzPY6Iq49DisjlqNX8/+iqjEKPVyOxM7vNLgFQxtNBQt3VoyjFYwhhkNDDNEpItSs/Kx99JdbD+fgENX7iFXY5A+Z0tjdG9QNJZNC09bDtKnQ84mnMVv537DqqhVSMhIUC+va1sXQwOHYkjgEHjZeElYYfXBMKOBYYaIdF1mbgEOXrmHHecTsO9SIjJy/2uEamumQNd6TggNcEZrHw7SpysKVAXYe30vfjv3GzZc2qDVvqZt7bYYGjgUL9V/ie1rngHDjAaGGSLSJzn5hTh27T52nE/A7gt38UBjkD5zpSE6+TuiRwAH6dMlGXkZWH9xfVH7mut7IVD0s6owUKCPbx8MazQMoT6hbF/zlBhmNDDMEJG+KigepC86ATvOaw/SpzQsGqSvR0MO0qdLbqfdxuqo1fjt3G84n3hevdzOxA4DAgZgaOBQtHBrwfY1ZcAwo4Fhhoiqg+JB+nZGJ2D7+fhSB+nrEeCMrvU5SJ8uKB6c77ezv2H1+dVa7Wt87XwxpOEQtq95AoYZDQwzRFTdlHWQvtAAZ7hykD7JFagKsOf6nqL2NRc3ILvgvyDarna7ovY1DV6CtbG1dEXqIIYZDQwzRFTdcZA+/ZGem65uX7Pvxj51+xqlgRJ9/PpgWGBR+xojA142ZJjRwDBDRDXJ4wbp83UyR0d/R3T0c0QzDxsYscu3pG6n3caqc6vw27nfEH0vWr3c3tQeAxoMwNBGQ2v0/FAMMxoYZoiopnrcIH0WxoZoX9cBHf0dEeLrAAcLtrORihACkQmR+O3cb1gdtRp3M++q1/nZ+WFo4FAMDhwMT2tP6YqUAMOMBoYZIqKiQfoO/nMP+y8l4uCVe0jOzNNa36iWFTr4OaKjvyMC3aw4ArFEClQF2H1tN3479xs2Xtqo1b6mvUd7vFz/ZXSr0w0+tj7V/owNw4wGhhkiIm2FKoFzt1Ow/1Ii9l++h6g72u1s7MwUCPFzQCd/R7Sr68Bu3xJJy01Tt6/Zf2O/un0NAHhae6Kbdzd0q9MNnbw6VcvB+RhmNDDMEBE9XmJaDg5cKTprc/if+1ojEBvIZWhW26aorY2/A/ycOCGmFGJTY/HH+T+w/ep2HL11FPmq/wZTlMvkaO7aHN3qdENX765oVatVtWhAzDCjgWGGiKjs8gpUOB3zAPsvJ2L/pUT8k5ihtd7Vyhgd/B3Ryc8RrX3sOAqxBDLyMnDw5kHsvr4bu67twsX7F7XWWygs0MGzA7rVKTpzU9e2rl4GUIYZDQwzRETlF5uchQOXiy5HHb16X2tCTIWBHC29bdHp3x5SnvZmElZac91Ou43d13Zj1/Vd2H1tN5Kyk7TW17aqrb4k1dm7M2xNbCWq9OkwzGhgmCEiqhg5+YUIu56E/ZcSse9SIm4/yNZa721vhg5+jujk74jmXjacFFMCKqFCZEIkdl3bhV3XduFo7FHkFf7X2FsGGYJcg9RnbVrVaqWzc0YxzGhgmCEiqnhCCFy7l/lvI+JEnLyRrNX121RhgDY+9ujk74gOfg5wseJIxFLIzMvEoZhD2HVtF3Zf3601ng0AmBmZoaNXR3Tz7oaudbrCz85PZy5JMcxoYJghIqp86Tn5OHr1Pvb920PqnsakmABQz8USHf/tIdXY3RqGHLBPEnfS7mD39d1Ft2u7cS/rntZ6d0t3dUPizt6dYW9qL1GlDDNaGGaIiKqWSlU0d1TxWZszsSlaIxFbmRihva8DOvk7oH1dB9hxYkxJqIQKZxPOqhsSH751uMQlqWauzdDVuyu61emG1u6tq/SSFMOM
BoYZIiJpJWfm4dCVe9h/uWjAvpSs/7oVy2RAgKsVWnnbIriOHZp72sLCWP+7FeujrPwsHI45XNTe5vounE88r7XezMgMIZ4h6sbE/vb+lXpJimFGA8MMEZHuKFQJRMY+wP5L97DvUiIuxKdprZfLgIZuVmhVxw6tvIvCjbmS3b+lEJcehz3X96jb2yRmJmqtd7NwU1+S6uLdBQ5mDhV6fIYZDQwzRES6KzEtB2HXk3D8ehLCriXhZlKW1noDuQwN3awQXMcOwd52CPK04dg2ElAJFaLuRqmDzaGYQ8gt/K9d1PBGw/Hz8z9X6DEZZjQwzBAR6Y/41Gx1sDl+PRm3krXDjaFchkbu1kWXpbzt0czDBiYKdgGvatn52Th867B6fJuP2nyEgQ0HVugxGGY0MMwQEemvOynZOH4tCWH/Bpw7Kdpj2xgZyNDY3RrB3kWXpZp62MDYiOGmqgkhKrz9DMOMBoYZIqLqIzY5S31Z6vi1JMSl5mitVxjI0bj2f+GmSW1rhhs9xTCjgWGGiKh6EkLgVnKW+rJU2PUk3E3THt9GYShHs9o2aOVth+A6dmjkbsWRifUEw4wGhhkioppBCIGbSdrh5uHB+4yN5GjmYYNWXkXhJrCWNRSGHMBPF+lNmJk7dy7Wr1+PS5cuwcTEBK1bt8a8efPg5+en3kYIgZkzZ2LZsmV48OABWrZsiUWLFqFBgwZlOgbDDBFRzSSEwPX7mf82Ji663c/I09rGxMgAQZ5FZ25aedshsJYVjDg6sU7QmzATGhqKAQMGoHnz5igoKMDkyZMRFRWFCxcuwMysaPbVefPmYfbs2fj555/h6+uLzz77DIcOHcLly5dhYWHxxGMwzBAREVAUbq4mZhSduble1FsqOVM73JgqDBDkaftvmxtbNHSz4tQLEtGbMPOwe/fuwdHREQcPHkT79u0hhICrqysmTpyISZMmAQByc3Ph5OSEefPm4c0333ziPhlmiIioNCqVwD/F4eZaEo7fSNIanRgAzJWGaO5p8+84N/ao72oJA7luTMRY3T3N77dOjTyUmpoKALC1tQUA3LhxAwkJCejWrZt6G6VSiZCQEBw7dqxMYYaIiKg0crkMfs4W8HO2wPDWnlCpBC7fTVe3tzlxPQlpOQXYf/ke9l8umpDR0tgQLf5tb9PK2xb1nC0hZ7iRnM6EGSEE3n33XbRt2xYBAQEAgISEBACAk5OT1rZOTk6IiYkpdT+5ubnIzf2vwVdaWlqp2xEREWmSy2Wo52KJei6WGNnWC4UqgYvxaeozNydvJCMtpwB7Lt7Fnot3AQDWpkZo6VV0WSq4jj18ncwrdb4iKp3OhJnx48fj3LlzOHLkSIl1D78xHjc4z9y5czFz5sxKqZGIiGoOA7kMAW5WCHCzwuvtvFFQqEJ0XJp6AL/wm8lIycrHzui72BldFG7szBRFjYn/nX6hjoMZw00V0Ik2MxMmTMDGjRtx6NAheHl5qZdfv34dderUQUREBJo0aaJe3rdvX1hbW+OXX34psa/Szsy4u7uzzQwREVWo/EIVou6kqntLhd9MRk6+SmsbBwtl0Rg3/45z42lnynBTRnrTZkYIgQkTJmDDhg04cOCAVpABAC8vLzg7O2P37t3qMJOXl4eDBw9i3rx5pe5TqVRCqVRWeu1ERFSzGRnI0bS2DZrWtsG4jj7IK1Dh7O2UojY315Jw+tYD3EvPxd9n4/D32TgAgLOlsbq9TbC3PdxtTRhuKoCkZ2bGjh2L1atXY9OmTVpjy1hZWcHExARAUdfsuXPnYuXKlahbty7mzJmDAwcOsGs2ERHptJz8QkTGpqgbFEfeSkFeofaZGzdrE/XoxMF17OBmbSJRtbpHb7pmPyqNrly5EiNGjADw36B5S5cu1Ro0r7iR8JMwzBARkS7IzitExK0H6stSkbEpKFBp/wTXtjVVX5Jq5W0HZytjiaqVnt6EmarAMENERLooK68Ap24+UDcojrqTisKHwo2XvZn6zE0r
b1s4WtSccMMwo4FhhoiI9EF6Tj5O3XygHqH4/J1UPJRt4O1ghhaetgjytEVzTxvUtq2+DYoZZjQwzBARkT5Kzc5H+I1k9ZmbiwlpePgX28FCieaeNmjuaYvmnrbwd7aoNtMvMMxoYJghIqLqICUrD6djHiD85gOE30zGudspyC/U/gk3UxigqYcNgjxs0dzLBo3drWGq0Jkh5Z4Kw4wGhhkiIqqOcvILce52KsJvJuPUzWScinmA9JwCrW0M5TI0cLNCcw8bBHnaIsjTBvbm+jF8CcOMBoYZIiKqCYrnljp1M1l99iY+NafEdt4OZmjuURRsmnvawkNHB/JjmNHAMENERDXVnZRshN9I/vfszQNcvpteYpvidjdBHkXtbuq56Ea7G4YZDQwzRERERVKy8hBx6wFO3niAUzeTce52aomB/EwVBmha2wZBnjZo4WmLxrWlaXfDMKOBYYaIiKh0OfmFiLpT1O4m/Ebp7W4M5DIEuFqqu4M387CFg0Xlt7thmNHAMENERFQ2KpXAlcR0hN8sOnMTfiMZcaW1u7E3Q5Cnzb8Bx7ZSJtBkmNHAMENERFR+d1Ky/21U/F+7m4eTw8AW7pjbL7BCj6s3s2YTERGRbnOzNoFbYzf0bewGAEjNysfpW8nqszdnY1NR30XakwUMM0RERFRmVqZG6OTvhE7+TgCK2t2oJL7IwzBDRERE5WZsZCB1CZC+IzkRERHRM2CYISIiIr3GMENERER6jWGGiIiI9BrDDBEREek1hhkiIiLSawwzREREpNcYZoiIiEivMcwQERGRXmOYISIiIr3GMENERER6jWGGiIiI9BrDDBEREem1aj9rtvh3WvK0tDSJKyEiIqKyKv7dLv4df5xqH2bS09MBAO7u7hJXQkRERE8rPT0dVlZWj91GJsoSefSYSqVCXFwcLCwsIJPJKnTfaWlpcHd3R2xsLCwtLSt039UNn6uy43NVdnyuyo7PVdnxuSq7ynyuhBBIT0+Hq6sr5PLHt4qp9mdm5HI5atWqVanHsLS05Bu+jPhclR2fq7Ljc1V2fK7Kjs9V2VXWc/WkMzLF2ACYiIiI9BrDDBEREek1hplnoFQqMX36dCiVSqlL0Xl8rsqOz1XZ8bkqOz5XZcfnqux05bmq9g2AiYiIqHrjmRkiIiLSawwzREREpNcYZoiIiEivMcwQERGRXmOYKacffvgBXl5eMDY2RrNmzXD48GGpS9I5c+fORfPmzWFhYQFHR0c8//zzuHz5stRl6YW5c+dCJpNh4sSJUpeis+7cuYMhQ4bAzs4OpqamaNy4MU6fPi11WTqnoKAAU6ZMgZeXF0xMTODt7Y1Zs2ZBpVJJXZrkDh06hD59+sDV1RUymQwbN27UWi+EwIwZM+Dq6goTExN06NAB0dHR0hQrscc9V/n5+Zg0aRIaNmwIMzMzuLq6YtiwYYiLi6uy+hhmymHt2rWYOHEiJk+ejDNnzqBdu3bo0aMHbt26JXVpOuXgwYMYN24cjh8/jt27d6OgoADdunVDZmam1KXptPDwcCxbtgyBgYFSl6KzHjx4gDZt2sDIyAjbt2/HhQsX8NVXX8Ha2lrq0nTOvHnzsGTJEixcuBAXL17E/Pnz8cUXX+D777+XujTJZWZmolGjRli4cGGp6+fPn48FCxZg4cKFCA8Ph7OzM7p27aqe868medxzlZWVhYiICEydOhURERFYv349rly5gueee67qChT01Fq0aCFGjx6ttczf31989NFHElWkHxITEwUAcfDgQalL0Vnp6emibt26Yvfu3SIkJES8/fbbUpekkyZNmiTatm0rdRl6oVevXmLkyJFay/r16yeGDBkiUUW6CYDYsGGD+m+VSiWcnZ3F559/rl6Wk5MjrKysxJIlSySoUHc8/FyV5uTJkwKAiImJqZKaeGbmKeXl5eH06dPo1q2b1vJu3brh2LFjElWlH1JTUwEAtra2Eleiu8aNG4devXqhS5cuUpei0zZv3oygoCC8
9NJLcHR0RJMmTbB8+XKpy9JJbdu2xd69e3HlyhUAwNmzZ3HkyBH07NlT4sp0240bN5CQkKD1Xa9UKhESEsLv+jJITU2FTCarsrOl1X6iyYp2//59FBYWwsnJSWu5k5MTEhISJKpK9wkh8O6776Jt27YICAiQuhydtGbNGkRERCA8PFzqUnTe9evXsXjxYrz77rv45JNPcPLkSbz11ltQKpUYNmyY1OXplEmTJiE1NRX+/v4wMDBAYWEhZs+ejYEDB0pdmk4r/j4v7bs+JiZGipL0Rk5ODj766CMMGjSoyibqZJgpJ5lMpvW3EKLEMvrP+PHjce7cORw5ckTqUnRSbGws3n77bezatQvGxsZSl6PzVCoVgoKCMGfOHABAkyZNEB0djcWLFzPMPGTt2rX4/fffsXr1ajRo0ACRkZGYOHEiXF1dMXz4cKnL03n8rn86+fn5GDBgAFQqFX744YcqOy7DzFOyt7eHgYFBibMwiYmJJRI8FZkwYQI2b96MQ4cOoVatWlKXo5NOnz6NxMRENGvWTL2ssLAQhw4dwsKFC5GbmwsDAwMJK9QtLi4uqF+/vtayevXq4a+//pKoIt31wQcf4KOPPsKAAQMAAA0bNkRMTAzmzp3LMPMYzs7OAIrO0Li4uKiX87v+0fLz8/Hyyy/jxo0b2LdvX5WdlQHYm+mpKRQKNGvWDLt379Zavnv3brRu3VqiqnSTEALjx4/H+vXrsW/fPnh5eUldks7q3LkzoqKiEBkZqb4FBQVh8ODBiIyMZJB5SJs2bUp0879y5Qo8PDwkqkh3ZWVlQS7X/qo3MDBg1+wn8PLygrOzs9Z3fV5eHg4ePMjv+lIUB5l//vkHe/bsgZ2dXZUen2dmyuHdd9/F0KFDERQUhODgYCxbtgy3bt3C6NGjpS5Np4wbNw6rV6/Gpk2bYGFhoT6bZWVlBRMTE4mr0y0WFhYl2hKZmZnBzs6ObYxK8c4776B169aYM2cOXn75ZZw8eRLLli3DsmXLpC5N5/Tp0wezZ89G7dq10aBBA5w5cwYLFizAyJEjpS5NchkZGbh69ar67xs3biAyMhK2traoXbs2Jk6ciDlz5qBu3bqoW7cu5syZA1NTUwwaNEjCqqXxuOfK1dUV/fv3R0REBLZs2YLCwkL1972trS0UCkXlF1glfaaqoUWLFgkPDw+hUChE06ZN2d24FABKva1cuVLq0vQCu2Y/3t9//y0CAgKEUqkU/v7+YtmyZVKXpJPS0tLE22+/LWrXri2MjY2Ft7e3mDx5ssjNzZW6NMnt37+/1O+o4cOHCyGKumdPnz5dODs7C6VSKdq3by+ioqKkLVoij3uubty48cjv+/3791dJfTIhhKj8yERERERUOdhmhoiIiPQawwwRERHpNYYZIiIi0msMM0RERKTXGGaIiIhIrzHMEBERkV5jmCEiIiK9xjBDRDWOTCbDxo0bpS6DiCoIwwwRVakRI0ZAJpOVuIWGhkpdGhHpKc7NRERVLjQ0FCtXrtRaplQqJaqGiPQdz8wQUZVTKpVwdnbWutnY2AAougS0ePFi9OjRAyYmJvDy8sK6deu07h8VFYVOnTrBxMQEdnZ2eOONN5CRkaG1zU8//YQGDRpAqVTCxcUF48eP11p///59vPDCCzA1NUXdunWxefPmyn3QRFRpGGaISOdMnToVL774Is6ePYshQ4Zg4MCBuHjxIgAgKysLoaGhsLGxQXh4ONatW4c9e/ZohZXFixdj3LhxeOONNxAVFYXNmzfDx8dH6xgzZ87Eyy+/jHPnzqFnz54YPHgwkpOTq/RxElEFqZLpLImI/jV8+HBhYGAgzMzMtG6zZs0SQhTNtj569Git+7Rs2VKMGTNGCCHEsmXLhI2NjcjIyFCv37p1q5DL5SIhIUEIIYSrq6uYPHnyI2sAIKZMmaL+OyMjQ8hkMrF9+/YKe5xEVHXYZoaIqlzHjh2xePFirWW2trbqfwcHB2utCw4ORmRkJADg4sWLaNSo
EczMzNTr27RpA5VKhcuXL0MmkyEuLg6dO3d+bA2BgYHqf5uZmcHCwgKJiYnlfUhEJCGGGSKqcmZmZiUu+zyJTCYDAAgh1P8ubRsTE5My7c/IyKjEfVUq1VPVRES6gW1miEjnHD9+vMTf/v7+AID69esjMjISmZmZ6vVHjx6FXC6Hr68vLCws4Onpib1791ZpzUQkHZ6ZIaIql5ubi4SEBK1lhoaGsLe3BwCsW7cOQUFBaNu2LVatWoWTJ0/ixx9/BAAMHjwY06dPx/DhwzFjxgzcu3cPEyZMwNChQ+Hk5AQAmDFjBkaPHg1HR0f06NED6enpOHr0KCZMmFC1D5SIqgTDDBFVuR07dsDFxUVrmZ+fHy5dugSgqKfRmjVrMHbsWDg7O2PVqlWoX78+AMDU1BQ7d+7E22+/jebNm8PU1BQvvvgiFixYoN7X8OHDkZOTg6+//hrvv/8+7O3t0b9//6p7gERUpWRCCCF1EURExWQyGTZs2IDnn39e6lKISE+wzQwRERHpNYYZIiIi0mtsM0NEOoVXvonoafHMDBEREek1hhkiIiLSawwzREREpNcYZoiIiEivMcwQERGRXmOYISIiIr3GMENERER6jWGGiIiI9BrDDBEREem1/wNuzRKvFEPTkAAAAABJRU5ErkJggg==", "text/plain": [ "<Figure size 640x480 with 1 Axes>" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "\n", "n_epochs_overfit_1 = 13 #Otherwise len(train_lost_list) < n_epochs\n", "plt.plot(range(n_epochs_overfit), train_loss_list, label = \"Model 0\")\n", "plt.plot(range(n_epochs_overfit), train_loss_list_1, color = \"green\", label = \"Model 1\")\n", "plt.xlabel(\"Epoch\")\n", "plt.ylabel(\"Loss\")\n", "plt.title(\"Comparison of Performande for of Model 0 and Model 1\")\n", "plt.show()" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[43.45363824605942, 33.786904842853545, 29.978750259280204, 27.777584496736527, 26.11793281197548, 24.786260991096498, 23.703872640132904, 22.74807582437992, 21.79085268616676, 20.92527386188507, 20.174014331400393, 19.419565526545046, 18.71952503979206]\n", "[45.348057844638824, 39.64908684611321, 35.00802879333496, 32.13809435069561, 30.21873086452484, 28.807953109145163, 27.365781868696214, 26.038266357183456, 24.863524509072302, 23.610995230078696, 22.689530485272407, 21.60567447721958, 20.795099827349187]\n" ] } ], "source": [ "print(train_loss_list)\n", "print(train_loss_list_1)" ] }, { "cell_type": "markdown", "id": "bc381cf4", 
"metadata": {}, "source": [ "## Exercise 2: Quantization: try to compress the CNN to save space\n", "\n", "Quantization doc is available from https://pytorch.org/docs/stable/quantization.html#torch.quantization.quantize_dynamic\n", " \n", "The Exercise is to quantize post training the above CNN model. Compare the size reduction and the impact on the classification accuracy \n", "\n", "\n", "The size of the model is simply the size of the file." ] }, { "cell_type": "code", "execution_count": 15, "id": "ef623c26", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "model: fp32 \t Size (KB): 251.278\n" ] }, { "data": { "text/plain": [ "251278" ] }, "execution_count": 15, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import os\n", "\n", "\n", "def print_size_of_model(model, label=\"\"):\n", " torch.save(model.state_dict(), \"temp.p\")\n", " size = os.path.getsize(\"temp.p\")\n", " print(\"model: \", label, \" \\t\", \"Size (KB):\", size / 1e3)\n", " os.remove(\"temp.p\")\n", " return size\n", "\n", "\n", "print_size_of_model(model, \"fp32\")" ] }, { "cell_type": "markdown", "id": "05c4e9ad", "metadata": {}, "source": [ "Post training quantization example" ] }, { "cell_type": "code", "execution_count": 16, "id": "c4c65d4b", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "model: int8 \t Size (KB): 76.522\n" ] }, { "data": { "text/plain": [ "76522" ] }, "execution_count": 16, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import torch.quantization\n", "\n", "\n", "quantized_model = torch.quantization.quantize_dynamic(model, dtype=torch.qint8)\n", "print_size_of_model(quantized_model, \"int8\")" ] }, { "cell_type": "markdown", "id": "7b108e17", "metadata": {}, "source": [ "For each class, compare the classification test accuracy of the initial model and the quantized model. Also give the overall test accuracy for both models." 
] }, { "cell_type": "markdown", "id": "a0a34b90", "metadata": {}, "source": [ "Try training aware quantization to mitigate the impact on the accuracy (doc available here https://pytorch.org/docs/stable/quantization.html#torch.quantization.quantize_dynamic)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "First for the initial model :" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Test Loss: 22.235297\n", "\n", "Test Accuracy of airplane: 52% (523/1000)\n", "Test Accuracy of automobile: 84% (849/1000)\n", "Test Accuracy of bird: 34% (341/1000)\n", "Test Accuracy of cat: 43% (432/1000)\n", "Test Accuracy of deer: 66% (662/1000)\n", "Test Accuracy of dog: 44% (448/1000)\n", "Test Accuracy of frog: 74% (746/1000)\n", "Test Accuracy of horse: 64% (647/1000)\n", "Test Accuracy of ship: 83% (836/1000)\n", "Test Accuracy of truck: 64% (649/1000)\n", "\n", "Test Accuracy (Overall): 61% (6133/10000)\n" ] } ], "source": [ "# import model\n", "model.load_state_dict(torch.load(\"./model_cifar.pt\"))\n", "\n", "# track test loss\n", "test_loss = 0.0\n", "class_correct = list(0.0 for i in range(10))\n", "class_total = list(0.0 for i in range(10))\n", "\n", "model.eval()\n", "# iterate over test data\n", "for data, target in test_loader:\n", " # move tensors to GPU if CUDA is available\n", " if train_on_gpu:\n", " data, target = data.cuda(), target.cuda()\n", " # forward pass: compute predicted outputs by passing inputs to the model\n", " output = model(data)\n", " # calculate the batch loss\n", " loss = criterion(output, target)\n", " # update test loss\n", " test_loss += loss.item() * data.size(0)\n", " # convert output probabilities to predicted class\n", " _, pred = torch.max(output, 1)\n", " # compare predictions to true label\n", " correct_tensor = pred.eq(target.data.view_as(pred))\n", " correct = (\n", " np.squeeze(correct_tensor.numpy())\n", " if not train_on_gpu\n", " 
else np.squeeze(correct_tensor.cpu().numpy())\n", "    )\n", "    # calculate test accuracy for each object class\n", "    # (len(target) instead of batch_size handles a smaller final batch)\n", "    for i in range(len(target)):\n", "        label = target.data[i]\n", "        class_correct[label] += correct[i].item()\n", "        class_total[label] += 1\n", "\n", "# average test loss\n", "test_loss = test_loss / len(test_loader)\n", "print(\"Test Loss: {:.6f}\\n\".format(test_loss))\n", "\n", "for i in range(10):\n", "    if class_total[i] > 0:\n", "        print(\n", "            \"Test Accuracy of %5s: %2d%% (%2d/%2d)\"\n", "            % (\n", "                classes[i],\n", "                100 * class_correct[i] / class_total[i],\n", "                np.sum(class_correct[i]),\n", "                np.sum(class_total[i]),\n", "            )\n", "        )\n", "    else:\n", "        print(\"Test Accuracy of %5s: N/A (no training examples)\" % (classes[i]))\n", "\n", "print(\n", "    \"\\nTest Accuracy (Overall): %2d%% (%2d/%2d)\"\n", "    % (\n", "        100.0 * np.sum(class_correct) / np.sum(class_total),\n", "        np.sum(class_correct),\n", "        np.sum(class_total),\n", "    )\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Then, for the quantized model:" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Test Loss: 22.242847\n", "\n", "Test Accuracy of airplane: 52% (523/1000)\n", "Test Accuracy of automobile: 85% (853/1000)\n", "Test Accuracy of bird: 34% (342/1000)\n", "Test Accuracy of cat: 43% (430/1000)\n", "Test Accuracy of deer: 66% (660/1000)\n", "Test Accuracy of dog: 45% (452/1000)\n", "Test Accuracy of frog: 74% (749/1000)\n", "Test Accuracy of horse: 64% (649/1000)\n", "Test Accuracy of ship: 83% (835/1000)\n", "Test Accuracy of truck: 64% (645/1000)\n", "\n", "Test Accuracy (Overall): 61% (6138/10000)\n" ] } ], "source": [ "# quantize model\n", "quantized_model = torch.quantization.quantize_dynamic(model, dtype=torch.qint8)\n", "\n", "# track test loss\n", "quantized_test_loss = 0.0\n", "quantized_class_correct = list(0.0 for i in range(10))\n", "quantized_class_total = list(0.0 for i 
in range(10))\n", "\n", "quantized_model.eval()\n", "# iterate over test data\n", "for data, target in test_loader:\n", "    # move tensors to GPU if CUDA is available\n", "    if train_on_gpu:\n", "        data, target = data.cuda(), target.cuda()\n", "    # forward pass: compute predicted outputs by passing inputs to the model\n", "    output = quantized_model(data)\n", "    # calculate the batch loss\n", "    loss = criterion(output, target)\n", "    # update test loss\n", "    quantized_test_loss += loss.item() * data.size(0)\n", "    # convert output probabilities to predicted class\n", "    _, pred = torch.max(output, 1)\n", "    # compare predictions to true label\n", "    correct_tensor = pred.eq(target.data.view_as(pred))\n", "    correct = (\n", "        np.squeeze(correct_tensor.numpy())\n", "        if not train_on_gpu\n", "        else np.squeeze(correct_tensor.cpu().numpy())\n", "    )\n", "    # calculate test accuracy for each object class\n", "    # (len(target) instead of batch_size handles a smaller final batch)\n", "    for i in range(len(target)):\n", "        label = target.data[i]\n", "        quantized_class_correct[label] += correct[i].item()\n", "        quantized_class_total[label] += 1\n", "\n", "# average test loss\n", "quantized_test_loss = quantized_test_loss / len(test_loader)\n", "print(\"Test Loss: {:.6f}\\n\".format(quantized_test_loss))\n", "\n", "for i in range(10):\n", "    if quantized_class_total[i] > 0:\n", "        print(\n", "            \"Test Accuracy of %5s: %2d%% (%2d/%2d)\"\n", "            % (\n", "                classes[i],\n", "                100 * quantized_class_correct[i] / quantized_class_total[i],\n", "                np.sum(quantized_class_correct[i]),\n", "                np.sum(quantized_class_total[i]),\n", "            )\n", "        )\n", "    else:\n", "        print(\"Test Accuracy of %5s: N/A (no training examples)\" % (classes[i]))\n", "\n", "print(\n", "    \"\\nTest Accuracy (Overall): %2d%% (%2d/%2d)\"\n", "    % (\n", "        100.0 * np.sum(quantized_class_correct) / np.sum(quantized_class_total),\n", "        np.sum(quantized_class_correct),\n", "        np.sum(quantized_class_total),\n", "    )\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The result is that the test accuracy is very 
similar for the initial and the quantized model, while the quantized model is roughly three times smaller on disk (76.5 KB vs 251.3 KB)." ] }, { "cell_type": "markdown", "id": "201470f9", "metadata": {}, "source": [ "## Exercise 3: Working with pre-trained models\n", "\n", "PyTorch offers several pre-trained models: https://pytorch.org/vision/0.8/models.html \n", "We will use ResNet50 trained on the ImageNet dataset (https://www.image-net.org/index.php). Use the following code with the file `imagenet-simple-labels.json`, which contains the ImageNet labels, and the image `dog.png`, which we will use as a test.\n" ] }, { "cell_type": "code", "execution_count": null, "id": "b4d13080", "metadata": {}, "outputs": [], "source": [ "import json\n", "\n", "import matplotlib.pyplot as plt\n", "from PIL import Image\n", "from torchvision import models, transforms\n", "\n", "# Choose an image to pass through the model\n", "test_image = \"dog.png\"\n", "\n", "# Configure matplotlib for pretty inline plots\n", "#%matplotlib inline\n", "#%config InlineBackend.figure_format = 'retina'\n", "\n", "# Prepare the labels\n", "with open(\"imagenet-simple-labels.json\") as f:\n", "    labels = json.load(f)\n", "\n", "# First prepare the transformations: resize the image to what the model was trained on and convert it to a tensor\n", "data_transform = transforms.Compose(\n", "    [\n", "        transforms.Resize((224, 224)),\n", "        transforms.ToTensor(),\n", "        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),\n", "    ]\n", ")\n", "# Load the image\n", "\n", "image = Image.open(test_image)\n", "plt.imshow(image), plt.xticks([]), plt.yticks([])\n", "\n", "# Now apply the transformation, expand the batch dimension, and send the image to the GPU\n", "# image = data_transform(image).unsqueeze(0).cuda()\n", "image = data_transform(image).unsqueeze(0)\n", "\n", "# Download the model if it's not there already. 
It will take a while on the first run; after that it is fast\n", "model = models.resnet50(pretrained=True)\n", "# Send the model to the GPU\n", "# model.cuda()\n", "# Set layers such as dropout and batchnorm in evaluation mode\n", "model.eval()\n", "\n", "# Get the 1000-dimensional model output\n", "out = model(image)\n", "# Find the predicted class\n", "print(\"Predicted class is: {}\".format(labels[out.argmax()]))" ] }, { "cell_type": "markdown", "id": "184cfceb", "metadata": {}, "source": [ "Experiments:\n", "\n", "Study the code and the results obtained. Optionally, add other images downloaded from the internet.\n", "\n", "What is the size of the model? Quantize it, then check whether the model is still able to correctly classify the other images.\n", "\n", "Experiment with other pre-trained CNN models.\n" ] }, { "cell_type": "markdown", "id": "5d57da4b", "metadata": {}, "source": [ "## Exercise 4: Transfer Learning\n", "\n", "For this work, we will use a pre-trained model (ResNet18) as a feature extractor and refine the classification by training only the last fully connected layer of the network. The output layer of the pre-trained network will therefore be replaced by a layer adapted to the new classes to be recognized, which in our case are ants and bees.\n", "\n", "Download and unzip in your working directory the dataset available at the following address:\n", "\n", "https://download.pytorch.org/tutorial/hymenoptera_data.zip\n", "\n", "Execute the following code to display some images of the dataset."
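The download-and-unzip step above can also be scripted from the notebook with the standard library alone; the `fetch_dataset` helper and its parameters are illustrative sketches, not part of the TD (only the URL comes from the text above):

```python
import os
import urllib.request
import zipfile

URL = "https://download.pytorch.org/tutorial/hymenoptera_data.zip"


def fetch_dataset(url=URL, archive="hymenoptera_data.zip", root="."):
    """Download and extract the ants/bees dataset unless it is already present."""
    if os.path.isdir(os.path.join(root, "hymenoptera_data")):
        return  # nothing to do: the dataset folder already exists
    urllib.request.urlretrieve(url, archive)  # fetch the zip archive
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(root)  # extract next to the notebook
    os.remove(archive)  # clean up the zip once extracted
```

Calling `fetch_dataset()` once from a cell leaves a `hymenoptera_data/` folder with `train/` and `val/` subfolders, which is the layout the `ImageFolder` code that follows expects.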
] }, { "cell_type": "code", "execution_count": null, "id": "be2d31f5", "metadata": {}, "outputs": [], "source": [ "import os\n", "\n", "import matplotlib.pyplot as plt\n", "import numpy as np\n", "import torch\n", "import torchvision\n", "from torchvision import datasets, transforms\n", "\n", "# Data augmentation and normalization for training\n", "# Just normalization for validation\n", "data_transforms = {\n", " \"train\": transforms.Compose(\n", " [\n", " transforms.RandomResizedCrop(\n", " 224\n", " ), # ImageNet models were trained on 224x224 images\n", " transforms.RandomHorizontalFlip(), # flip horizontally 50% of the time - increases train set variability\n", " transforms.ToTensor(), # convert it to a PyTorch tensor\n", " transforms.Normalize(\n", " [0.485, 0.456, 0.406], [0.229, 0.224, 0.225]\n", " ), # ImageNet models expect this norm\n", " ]\n", " ),\n", " \"val\": transforms.Compose(\n", " [\n", " transforms.Resize(256),\n", " transforms.CenterCrop(224),\n", " transforms.ToTensor(),\n", " transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),\n", " ]\n", " ),\n", "}\n", "\n", "data_dir = \"hymenoptera_data\"\n", "# Create train and validation datasets and loaders\n", "image_datasets = {\n", " x: datasets.ImageFolder(os.path.join(data_dir, x), data_transforms[x])\n", " for x in [\"train\", \"val\"]\n", "}\n", "dataloaders = {\n", " x: torch.utils.data.DataLoader(\n", " image_datasets[x], batch_size=4, shuffle=True, num_workers=0\n", " )\n", " for x in [\"train\", \"val\"]\n", "}\n", "dataset_sizes = {x: len(image_datasets[x]) for x in [\"train\", \"val\"]}\n", "class_names = image_datasets[\"train\"].classes\n", "device = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")\n", "\n", "# Helper function for displaying images\n", "def imshow(inp, title=None):\n", " \"\"\"Imshow for Tensor.\"\"\"\n", " inp = inp.numpy().transpose((1, 2, 0))\n", " mean = np.array([0.485, 0.456, 0.406])\n", " std = np.array([0.229, 0.224, 
0.225])\n", "\n", "    # Un-normalize the images\n", "    inp = std * inp + mean\n", "    # Clip just in case\n", "    inp = np.clip(inp, 0, 1)\n", "    plt.imshow(inp)\n", "    if title is not None:\n", "        plt.title(title)\n", "    plt.pause(0.001)  # pause a bit so that plots are updated\n", "    plt.show()\n", "\n", "\n", "# Get a batch of training data\n", "inputs, classes = next(iter(dataloaders[\"train\"]))\n", "\n", "# Make a grid from batch\n", "out = torchvision.utils.make_grid(inputs)\n", "\n", "imshow(out, title=[class_names[x] for x in classes])\n", "\n" ] }, { "cell_type": "markdown", "id": "bbd48800", "metadata": {}, "source": [ "Now execute the following code, which uses a pre-trained ResNet18 model whose output layer has been replaced for the ants/bees classification, and which trains the model by updating only the weights of this new output layer." ] }, { "cell_type": "code", "execution_count": null, "id": "572d824c", "metadata": {}, "outputs": [], "source": [ "import copy\n", "import os\n", "import time\n", "\n", "import matplotlib.pyplot as plt\n", "import numpy as np\n", "import torch\n", "import torch.nn as nn\n", "import torch.optim as optim\n", "import torchvision\n", "from torch.optim import lr_scheduler\n", "from torchvision import datasets, transforms\n", "\n", "# Data augmentation and normalization for training\n", "# Just normalization for validation\n", "data_transforms = {\n", "    \"train\": transforms.Compose(\n", "        [\n", "            transforms.RandomResizedCrop(\n", "                224\n", "            ),  # ImageNet models were trained on 224x224 images\n", "            transforms.RandomHorizontalFlip(),  # flip horizontally 50% of the time - increases train set variability\n", "            transforms.ToTensor(),  # convert it to a PyTorch tensor\n", "            transforms.Normalize(\n", "                [0.485, 0.456, 0.406], [0.229, 0.224, 0.225]\n", "            ),  # ImageNet models expect this norm\n", "        ]\n", "    ),\n", "    \"val\": transforms.Compose(\n", "        [\n", "            transforms.Resize(256),\n", "            transforms.CenterCrop(224),\n", "            transforms.ToTensor(),\n", "            
transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),\n", " ]\n", " ),\n", "}\n", "\n", "data_dir = \"hymenoptera_data\"\n", "# Create train and validation datasets and loaders\n", "image_datasets = {\n", " x: datasets.ImageFolder(os.path.join(data_dir, x), data_transforms[x])\n", " for x in [\"train\", \"val\"]\n", "}\n", "dataloaders = {\n", " x: torch.utils.data.DataLoader(\n", " image_datasets[x], batch_size=4, shuffle=True, num_workers=4\n", " )\n", " for x in [\"train\", \"val\"]\n", "}\n", "dataset_sizes = {x: len(image_datasets[x]) for x in [\"train\", \"val\"]}\n", "class_names = image_datasets[\"train\"].classes\n", "device = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")\n", "\n", "# Helper function for displaying images\n", "def imshow(inp, title=None):\n", " \"\"\"Imshow for Tensor.\"\"\"\n", " inp = inp.numpy().transpose((1, 2, 0))\n", " mean = np.array([0.485, 0.456, 0.406])\n", " std = np.array([0.229, 0.224, 0.225])\n", "\n", " # Un-normalize the images\n", " inp = std * inp + mean\n", " # Clip just in case\n", " inp = np.clip(inp, 0, 1)\n", " plt.imshow(inp)\n", " if title is not None:\n", " plt.title(title)\n", " plt.pause(0.001) # pause a bit so that plots are updated\n", " plt.show()\n", "\n", "\n", "# Get a batch of training data\n", "# inputs, classes = next(iter(dataloaders['train']))\n", "\n", "# Make a grid from batch\n", "# out = torchvision.utils.make_grid(inputs)\n", "\n", "# imshow(out, title=[class_names[x] for x in classes])\n", "# training\n", "\n", "\n", "def train_model(model, criterion, optimizer, scheduler, num_epochs=25):\n", " since = time.time()\n", "\n", " best_model_wts = copy.deepcopy(model.state_dict())\n", " best_acc = 0.0\n", "\n", " epoch_time = [] # we'll keep track of the time needed for each epoch\n", "\n", " for epoch in range(num_epochs):\n", " epoch_start = time.time()\n", " print(\"Epoch {}/{}\".format(epoch + 1, num_epochs))\n", " print(\"-\" * 10)\n", "\n", " # Each epoch has 
a training and validation phase\n", " for phase in [\"train\", \"val\"]:\n", " if phase == \"train\":\n", " scheduler.step()\n", " model.train() # Set model to training mode\n", " else:\n", " model.eval() # Set model to evaluate mode\n", "\n", " running_loss = 0.0\n", " running_corrects = 0\n", "\n", " # Iterate over data.\n", " for inputs, labels in dataloaders[phase]:\n", " inputs = inputs.to(device)\n", " labels = labels.to(device)\n", "\n", " # zero the parameter gradients\n", " optimizer.zero_grad()\n", "\n", " # Forward\n", " # Track history if only in training phase\n", " with torch.set_grad_enabled(phase == \"train\"):\n", " outputs = model(inputs)\n", " _, preds = torch.max(outputs, 1)\n", " loss = criterion(outputs, labels)\n", "\n", " # backward + optimize only if in training phase\n", " if phase == \"train\":\n", " loss.backward()\n", " optimizer.step()\n", "\n", " # Statistics\n", " running_loss += loss.item() * inputs.size(0)\n", " running_corrects += torch.sum(preds == labels.data)\n", "\n", " epoch_loss = running_loss / dataset_sizes[phase]\n", " epoch_acc = running_corrects.double() / dataset_sizes[phase]\n", "\n", " print(\"{} Loss: {:.4f} Acc: {:.4f}\".format(phase, epoch_loss, epoch_acc))\n", "\n", " # Deep copy the model\n", " if phase == \"val\" and epoch_acc > best_acc:\n", " best_acc = epoch_acc\n", " best_model_wts = copy.deepcopy(model.state_dict())\n", "\n", " # Add the epoch time\n", " t_epoch = time.time() - epoch_start\n", " epoch_time.append(t_epoch)\n", " print()\n", "\n", " time_elapsed = time.time() - since\n", " print(\n", " \"Training complete in {:.0f}m {:.0f}s\".format(\n", " time_elapsed // 60, time_elapsed % 60\n", " )\n", " )\n", " print(\"Best val Acc: {:4f}\".format(best_acc))\n", "\n", " # Load best model weights\n", " model.load_state_dict(best_model_wts)\n", " return model, epoch_time\n", "\n", "\n", "# Download a pre-trained ResNet18 model and freeze its weights\n", "model = 
torchvision.models.resnet18(pretrained=True)\n", "for param in model.parameters():\n", "    param.requires_grad = False\n", "\n", "# Replace the final fully connected layer\n", "# Parameters of newly constructed modules have requires_grad=True by default\n", "num_ftrs = model.fc.in_features\n", "model.fc = nn.Linear(num_ftrs, 2)\n", "# Send the model to the GPU\n", "model = model.to(device)\n", "# Set the loss function\n", "criterion = nn.CrossEntropyLoss()\n", "\n", "# Observe that only the parameters of the final layer are being optimized\n", "optimizer_conv = optim.SGD(model.fc.parameters(), lr=0.001, momentum=0.9)\n", "exp_lr_scheduler = lr_scheduler.StepLR(optimizer_conv, step_size=7, gamma=0.1)\n", "model, epoch_time = train_model(\n", "    model, criterion, optimizer_conv, exp_lr_scheduler, num_epochs=10\n", ")\n" ] }, { "cell_type": "markdown", "id": "bbd48801", "metadata": {}, "source": [ "Experiments:\n", "\n", "Study the code and the results obtained.\n", "\n", "Modify the code by adding an \"eval_model\" function that evaluates the model on a test set (different from the training and validation sets used during the learning phase). Study the results obtained.\n", "\n", "Now modify the code to replace the current classification layer with a stack of two layers, using a \"relu\" activation function for the middle layer and the \"dropout\" mechanism for both layers. Rerun the experiments and study the results obtained.\n", "\n", "Apply quantization (both post-training and quantization-aware) and evaluate its impact on model size and accuracy." ] }, { "cell_type": "markdown", "id": "04a263f0", "metadata": {}, "source": [ "## Optional\n", "\n", "Try this at home!\n", "\n", "PyTorch offers a framework to export a given CNN to your smartphone (either Android or iOS). 
Have a look at the tutorial https://pytorch.org/mobile/home/\n", "\n", "The exercise consists in deploying the CNN of Exercise 4 on your phone and then testing it live.\n", "\n" ] }, { "cell_type": "markdown", "id": "fe954ce4", "metadata": {}, "source": [ "## Author\n", "\n", "Alberto BOSIO - Ph.D." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3.8.5 ('base')", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.5" }, "vscode": { "interpreter": { "hash": "9e3efbebb05da2d4a1968abe9a0645745f54b63feb7a85a514e4da0495be97eb" } } }, "nbformat": 4, "nbformat_minor": 5 }