
CIFAR-10 Image Classification with numpy only

Example of image classification on the CIFAR-10 dataset with a Convolutional Neural Network implemented in numpy only.

Test online here

Content

A short description of the content is given below. The full code can be found inside the course via the link above:


In this example we test a CNN for image classification on the CIFAR-10 dataset.
The following standard and most common parameters can be used and tested:

Parameter                  Description
Weights Initialization     HE Normal
Weights Update Policy      Vanilla SGD, Momentum SGD, RMSProp, Adam
Activation Functions       ReLU, Sigmoid
Regularization             L2, Dropout
Pooling                    Max, Average
Loss Functions             Softmax, SVM


Abbreviations:

  • Vanilla SGD - Vanilla Stochastic Gradient Descent
  • Momentum SGD - Stochastic Gradient Descent with Momentum
  • RMSProp - Root Mean Square Propagation
  • Adam - Adaptive Moment Estimation
  • SVM - Support Vector Machine


For the current example the following architecture is used:
Input --> Conv --> ReLU --> Pool --> Affine --> ReLU --> Affine --> Softmax


For the current example the following parameters are used:

Parameter                  Description
Weights Initialization     HE Normal
Weights Update Policy      Vanilla SGD
Activation Functions       ReLU
Regularization             L2
Pooling                    Max
Loss Functions             Softmax

The first step is to prepare the data from the CIFAR-10 dataset.
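As a hedged illustration, a minimal sketch of loading and concatenating the CIFAR-10 batches is shown below. It assumes the standard Python version of the dataset unpacked into a 'cifar-10-batches-py' directory; the helper name load_cifar10_batch and the file paths are illustrative, not the exact course code.

```python
import pickle
import numpy as np

def load_cifar10_batch(file_path):
    # Each CIFAR-10 batch file is a pickled dictionary with 'data' and 'labels'
    with open(file_path, 'rb') as f:
        batch = pickle.load(f, encoding='latin1')
    # 'data' has shape (10000, 3072); reshape to (10000, 32, 32, 3), channels last
    x = batch['data'].reshape(10000, 3, 32, 32).transpose(0, 2, 3, 1).astype(np.float64)
    y = np.array(batch['labels'])
    return x, y

# Concatenating the five training batches into one training set
x_list, y_list = [], []
for i in range(1, 6):
    x_batch, y_batch = load_cifar10_batch('cifar-10-batches-py/data_batch_%d' % i)
    x_list.append(x_batch)
    y_list.append(y_batch)
x_train = np.concatenate(x_list)
y_train = np.concatenate(y_list)

# Loading the test batch
x_test, y_test = load_cifar10_batch('cifar-10-batches-py/test_batch')
```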


After all batches are loaded and concatenated together, it is possible to show examples of the training images. The result can be seen in the image below.

CIFAR-10_examples


Next, creating a function for preprocessing the CIFAR-10 datasets for further use in the classifier (a sketch is shown after this list):

  • Normalizing data by dividing by 255.0 (optional, up to the researcher)
  • Normalizing data by subtracting the mean image and dividing by the standard deviation (optional, up to the researcher)
  • Transposing every dataset to make channels come first
  • Returning the result as a dictionary
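A minimal sketch of such a preprocessing function is given below. It uses the mean/standard-deviation normalization variant, assumes channels-last input arrays as produced by the loading sketch above, and the function name and split sizes are illustrative.

```python
import numpy as np

def preprocess_cifar10(x_train, y_train, x_test, y_test,
                       number_of_training=49000, number_of_validation=1000, number_of_test=1000):
    # Splitting a validation set off the end of the training data
    x_validation = x_train[number_of_training:number_of_training + number_of_validation]
    y_validation = y_train[number_of_training:number_of_training + number_of_validation]
    x_train = x_train[:number_of_training]
    y_train = y_train[:number_of_training]
    x_test = x_test[:number_of_test]
    y_test = y_test[:number_of_test]

    # Normalizing by subtracting the mean image and dividing by the standard deviation
    # (dividing by 255.0 instead is the other option mentioned above)
    mean_image = np.mean(x_train, axis=0)
    std = np.std(x_train, axis=0)
    x_train = (x_train - mean_image) / std
    x_validation = (x_validation - mean_image) / std
    x_test = (x_test - mean_image) / std

    # Transposing every dataset so that channels come first: (N, 3, 32, 32)
    x_train = x_train.transpose(0, 3, 1, 2)
    x_validation = x_validation.transpose(0, 3, 1, 2)
    x_test = x_test.transpose(0, 3, 1, 2)

    # Returning the result as a dictionary
    return {'x_train': x_train, 'y_train': y_train,
            'x_validation': x_validation, 'y_validation': y_validation,
            'x_test': x_test, 'y_test': y_test}
```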

As a result, the datasets have the following shapes:

  • x_train: (49000, 3, 32, 32)
  • y_train: (49000,)
  • x_validation: (1000, 3, 32, 32)
  • y_validation: (1000,)
  • x_test: (1000, 3, 32, 32)
  • y_test: (1000,)

The loaded, prepared and preprocessed CIFAR-10 datasets are then saved into a pickle file.
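For illustration, saving and reloading the dictionary of datasets with pickle can look like the following; the file name 'data.pickle' is an assumption, not necessarily the one used in the course code.

```python
import pickle

# Saving the dictionary returned by the preprocessing function
with open('data.pickle', 'wb') as f:
    pickle.dump(data_dictionary, f)

# Later the prepared datasets can be loaded back with:
with open('data.pickle', 'rb') as f:
    data_dictionary = pickle.load(f)
```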


Creating functions for the CNN layers (a sketch of the naive convolutional forward pass is shown after this list):

  • Naive Forward Pass for Convolutional layer
  • Naive Backward Pass for Convolutional layer
  • Naive Forward Pass for Max Pooling layer
  • Naive Backward Pass for Max Pooling layer
  • Forward Pass for Affine layer
  • Backward Pass for Affine layer
  • Forward Pass for ReLU layer
  • Backward Pass for ReLU layer
  • Softmax Classification loss
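As a hedged illustration of the first item, below is a minimal sketch of what a naive forward pass for the convolutional layer can look like. The function name, the conv_param dictionary with 'stride' and 'pad' keys, and the cache convention are assumptions and may differ from the exact course code.

```python
import numpy as np

def conv_forward_naive(x, w, b, conv_param):
    # x: input of shape (N, C, H, W); w: filters of shape (F, C, HH, WW); b: biases of shape (F,)
    stride, pad = conv_param['stride'], conv_param['pad']
    N, C, H, W = x.shape
    F, _, HH, WW = w.shape
    H_out = 1 + (H + 2 * pad - HH) // stride
    W_out = 1 + (W + 2 * pad - WW) // stride

    # Zero-padding the input along the spatial dimensions only
    x_padded = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)), mode='constant')
    out = np.zeros((N, F, H_out, W_out))

    for n in range(N):                   # every image
        for f in range(F):               # every filter
            for i in range(H_out):       # every output row
                for j in range(W_out):   # every output column
                    window = x_padded[n, :,
                                      i * stride:i * stride + HH,
                                      j * stride:j * stride + WW]
                    out[n, f, i, j] = np.sum(window * w[f]) + b[f]

    # Caching the inputs for the backward pass
    cache = (x, w, b, conv_param)
    return out, cache
```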

Creating the model of the CNN classifier (a sketch of the network initialization is shown after this list):

  • Creating class for ConvNet1
  • Initializing new Network
  • Evaluating loss for training ConvNet1
  • Calculating scores for predicting ConvNet1
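As a hedged illustration of the initialization step, below is a minimal sketch of what the ConvNet1 constructor can look like with HE Normal weight initialization. The parameter names (number_of_filters, size_of_filter, hidden_dimension) and the assumption that the convolutional layer preserves the spatial size before 2x2 max pooling are illustrative, not the exact course code.

```python
import numpy as np

class ConvNet1(object):
    # Architecture: Input --> Conv --> ReLU --> Pool --> Affine --> ReLU --> Affine --> Softmax

    def __init__(self, input_dimension=(3, 32, 32), number_of_filters=32,
                 size_of_filter=7, hidden_dimension=100, number_of_classes=10,
                 regularization=0.0):
        c, h, w = input_dimension
        self.regularization = regularization
        self.params = {}

        # HE Normal initialization: weights scaled by sqrt(2 / fan_in)
        fan_in_conv = c * size_of_filter * size_of_filter
        self.params['w1'] = np.random.randn(
            number_of_filters, c, size_of_filter, size_of_filter) * np.sqrt(2.0 / fan_in_conv)
        self.params['b1'] = np.zeros(number_of_filters)

        # Assumption: the convolution preserves spatial size and 2x2 max pooling halves it
        fan_in_affine = number_of_filters * (h // 2) * (w // 2)
        self.params['w2'] = np.random.randn(
            fan_in_affine, hidden_dimension) * np.sqrt(2.0 / fan_in_affine)
        self.params['b2'] = np.zeros(hidden_dimension)

        self.params['w3'] = np.random.randn(
            hidden_dimension, number_of_classes) * np.sqrt(2.0 / hidden_dimension)
        self.params['b3'] = np.zeros(number_of_classes)
```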

Different types of optimization rules can be used to update the parameters of the model.

The rule for updating parameters in the current example is the following:

Vanilla SGD
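A minimal sketch of the Vanilla SGD update, w = w - learning_rate * dw, is given below; the config-dictionary convention and the default learning rate value are assumptions for illustration.

```python
def sgd(w, dw, config=None):
    # Vanilla SGD: move the parameter against the gradient direction
    if config is None:
        config = {}
    config.setdefault('learning_rate', 1e-2)
    w -= config['learning_rate'] * dw
    return w, config
```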


Creating the Solver class for training classification models and for predicting (a compact sketch is shown after this list):

  • Creating and initializing the Solver class
  • Creating a 'reset' function for defining variables for optimization
  • Creating a 'step' function for making a single gradient update
  • Creating a function for checking the accuracy of the model on the currently provided data
  • Creating a function for training the model
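Below is a compact, hedged sketch of such a Solver class. It assumes the model exposes a params dictionary and a loss method that returns (loss, gradients) when labels are given and class scores otherwise, and that the update rule follows the (w, dw, config) convention shown above; the names and defaults are illustrative, not the exact course code.

```python
import numpy as np

class Solver(object):
    # Minimal sketch of a Solver that trains a model on mini-batches with a given update rule

    def __init__(self, model, data, update_rule, learning_rate=1e-3,
                 batch_size=100, number_of_epochs=10):
        self.model = model
        self.update_rule = update_rule
        self.x_train, self.y_train = data['x_train'], data['y_train']
        self.x_validation, self.y_validation = data['x_validation'], data['y_validation']
        self.batch_size = batch_size
        self.number_of_epochs = number_of_epochs
        self._reset(learning_rate)

    def _reset(self, learning_rate):
        # Defining variables for optimization: loss history and one config per parameter
        self.loss_history = []
        self.configs = {name: {'learning_rate': learning_rate} for name in self.model.params}

    def _step(self):
        # Making a single gradient update on one random mini-batch
        batch_mask = np.random.choice(self.x_train.shape[0], self.batch_size)
        x_batch, y_batch = self.x_train[batch_mask], self.y_train[batch_mask]
        loss, gradients = self.model.loss(x_batch, y_batch)
        self.loss_history.append(loss)
        for name, parameter in self.model.params.items():
            updated, config = self.update_rule(parameter, gradients[name], self.configs[name])
            self.model.params[name] = updated
            self.configs[name] = config

    def check_accuracy(self, x, y):
        # Checking the accuracy of the model on the currently provided data
        scores = self.model.loss(x)
        predictions = np.argmax(scores, axis=1)
        return np.mean(predictions == y)

    def train(self):
        # Training the model for the given number of epochs
        iterations_per_epoch = max(self.x_train.shape[0] // self.batch_size, 1)
        for epoch in range(self.number_of_epochs):
            for _ in range(iterations_per_epoch):
                self._step()
            print('epoch %d, loss %f, validation accuracy %f' %
                  (epoch + 1, self.loss_history[-1],
                   self.check_accuracy(self.x_validation, self.y_validation)))
```

With such a Solver, overfitting a small subset of the data, as in the figures below, is simply a matter of passing a few training examples and a large number of epochs.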

Overfitting small data with 100 training examples and 500 epochs is shown in the figure below. Overfitting Small Data

Overfitting small data with 10 training examples and 20 epochs is shown in the figure below. Overfitting Small Data


The training process of Model #1 with 50,000 iterations is shown in the figure below:

Training Model 1

Initialized filters and trained filters of the convolutional layer are shown in the figure below:

Filters Cifar10


MIT License

Copyright (c) 2018 Valentyn N Sichkar

github.com/sichkar-valentyn

Reference to:

Valentyn N Sichkar. Neural Networks for computer vision in autonomous vehicles and robotics // GitHub platform. DOI: 10.5281/zenodo.1317904