Programmed a multilayer perceptron (MLP) from scratch using built-in Python functions and NumPy.
- Modeling: Created each component of the multilayer perceptron: layers (input, hidden, output), activation layers, and loss functions.
- Training and Testing: Trained and tested the MLP on both regression and classification problems.
- Evaluation: On the classification problem, the MLP achieved 88.69% accuracy on the MNIST dataset using cross-entropy loss; on the regression problem, it successfully solved XOR using MSE as the error function.
- Layer (input, hidden, output): Each layer stores its weights and bias as instance variables and implements forward- and back-propagation functions.
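A minimal sketch of such a layer, assuming a `Dense` class name, plain SGD updates inside `backward`, and caching of the forward input (all assumptions, not the project's exact interface):

```python
import numpy as np

class Dense:
    """Fully connected layer: weights and bias held as instance variables."""

    def __init__(self, n_in, n_out):
        # Small random weights; bias starts at zero.
        self.weights = np.random.randn(n_in, n_out) * 0.1
        self.bias = np.zeros((1, n_out))

    def forward(self, x):
        # Cache the input so the backward pass can compute weight gradients.
        self.input = x
        return x @ self.weights + self.bias

    def backward(self, grad_output, lr):
        # Gradients w.r.t. parameters and w.r.t. the layer input.
        grad_weights = self.input.T @ grad_output
        grad_bias = grad_output.sum(axis=0, keepdims=True)
        grad_input = grad_output @ self.weights.T
        # In-place SGD update.
        self.weights -= lr * grad_weights
        self.bias -= lr * grad_bias
        return grad_input
```

`backward` returns the gradient with respect to the layer's input so layers can be chained back to front.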
- Activation layer: Applies a specified activation function to a layer's output (forward and back propagation).
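One way to sketch such an activation layer, assuming it is parameterized by a function and its derivative (the `Activation` name and sigmoid helpers are illustrative, not the project's actual identifiers):

```python
import numpy as np

class Activation:
    """Applies an element-wise activation in forward, its derivative in backward."""

    def __init__(self, fn, fn_prime):
        self.fn = fn
        self.fn_prime = fn_prime

    def forward(self, x):
        self.input = x
        return self.fn(x)

    def backward(self, grad_output):
        # Chain rule: upstream gradient times the activation's derivative.
        return grad_output * self.fn_prime(self.input)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)
```

Keeping the activation as its own layer means the dense layer's backward pass never needs to know which nonlinearity follows it.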
- Loss functions: MSE for regression problems and cross-entropy for classification problems.
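The two losses can be sketched as plain NumPy functions with matching gradient helpers; the function names and the one-hot/clipping conventions for cross-entropy are assumptions for this sketch:

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error over all elements.
    return np.mean((y_true - y_pred) ** 2)

def mse_prime(y_true, y_pred):
    # Gradient of MSE w.r.t. the predictions.
    return 2 * (y_pred - y_true) / y_true.size

def cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true is one-hot; clip predictions to avoid log(0).
    p = np.clip(y_pred, eps, 1.0 - eps)
    return -np.sum(y_true * np.log(p)) / y_true.shape[0]

def cross_entropy_prime(y_true, y_pred, eps=1e-12):
    # Gradient of cross-entropy w.r.t. the predicted probabilities.
    p = np.clip(y_pred, eps, 1.0 - eps)
    return -(y_true / p) / y_true.shape[0]
```

The `_prime` variants are what the output layer's backward pass would consume to start backpropagation.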