Releases: MatthieuHernandez/StraightforwardNeuralNetwork

v1.8.0

05 Mar 17:46
55fc1e9

Neural Network Models and Architectures

  • Layers
    • Rework all filter layers (Convolution, LocallyConnected, MaxPooling)
    • Change the shape of the Input layer to (C, X, Y) instead of (X, Y, C) (see the indexing sketch below)
  • Neurons
    • Fix the learning for the neurons in the convolutional layers
  • Features
    • Update Boost to 1.80.0
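
The (C, X, Y) input shape stores each channel contiguously (channel-first) instead of interleaving channels per pixel (channel-last). Below is a minimal indexing sketch for a flattened buffer; the function names and parameters are illustrative only, not the library's API.

```cpp
#include <cstddef>

// New (C, X, Y) layout: all values of channel 0 first, then channel 1, ...
// Illustrative helper, not part of the library.
inline std::size_t indexCXY(std::size_t c, std::size_t x, std::size_t y,
                            std::size_t sizeX, std::size_t sizeY)
{
    return c * sizeX * sizeY + x * sizeY + y;
}

// Previous (X, Y, C) layout: channels interleaved for each (x, y) position.
inline std::size_t indexXYC(std::size_t x, std::size_t y, std::size_t c,
                            std::size_t sizeY, std::size_t channels)
{
    return x * sizeY * channels + y * channels + c;
}
```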

Learning Algorithms and Optimizations

  • Optimizers
    • Adapt SGD to new filter layers
    • Implement Softmax as a LayerOptimizer (see the sketch below)
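
Softmax itself is standard; the sketch below shows a numerically stable version of the function, independent of the library's actual LayerOptimizer interface.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Numerically stable softmax: subtract the maximum before exponentiating
// so large activations do not overflow. Assumes a non-empty input.
// Illustrative sketch only, not the library's interface.
std::vector<float> softmax(const std::vector<float>& outputs)
{
    const float maxValue = *std::max_element(outputs.begin(), outputs.end());
    std::vector<float> result(outputs.size());
    float sum = 0.0f;
    for (std::size_t i = 0; i < outputs.size(); ++i)
    {
        result[i] = std::exp(outputs[i] - maxValue);
        sum += result[i];
    }
    for (float& value : result)
        value /= sum;
    return result;
}
```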

Tests

  • Improve test stability
  • Add tests on filter layers
  • Datasets
    • Improve the best MNIST neural network
    • Improve the best Fashion-MNIST neural network
    • Improve the best CIFAR-10 neural network

v1.7.1

08 May 17:32
e0b99a8

Neural Network Models and Architectures

  • Layers
    • Improve 2D filter layers
  • Neurons
    • Bias is no longer constant but is always initialized to 1
  • Features
    • Add a function to display 2D filter layers as bitmaps
    • Add a function to display 2D input data as a bitmap

Learning Algorithms and Optimizations

  • Layer Optimizers
    • Add Error Multiplier

Data

  • Normalize data to [0, 1] instead of [-1, 1] (see the formula below)
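
The new scaling corresponds to ordinary min-max normalization:

$$
x' = \frac{x - x_{\min}}{x_{\max} - x_{\min}} \in [0, 1]
\qquad \text{instead of} \qquad
x' = 2\,\frac{x - x_{\min}}{x_{\max} - x_{\min}} - 1 \in [-1, 1]
$$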

Tests

  • Improve test stability
  • Datasets
    • Improve the best MNIST neural network
    • Improve the best Fashion-MNIST neural network
    • Improve the best CIFAR-10 neural network

v1.7.0

23 Mar 19:34
12f938b

Neural Network Models and Architectures

  • Layers
    • Fix Convolution backpropagation
    • Fix MaxPooling backpropagation
  • Activation Functions
    • Fix ReLU
    • Add GELU (formula below)
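
For reference, GELU and its common tanh approximation; whether the library uses the exact erf form or the approximation is not stated here:

$$
\mathrm{GELU}(x) = x\,\Phi(x) = \frac{x}{2}\left(1 + \operatorname{erf}\!\left(\frac{x}{\sqrt{2}}\right)\right)
\approx \frac{x}{2}\left(1 + \tanh\!\left(\sqrt{\tfrac{2}{\pi}}\left(x + 0.044715\,x^{3}\right)\right)\right)
$$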

Learning Algorithms and Optimizations

  • Learning
    • Fix MAE stop condition
    • Improve NaN detection
  • Optimizations
    • Implement SIMD optimization with OpenMP (see the sketch below)
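
A minimal sketch of an OpenMP SIMD loop, here applied to a neuron's weighted sum; where the library actually places the pragma is not specified in these notes, and all names below are illustrative.

```cpp
#include <vector>

// Vectorize the inner product of a neuron with OpenMP SIMD.
// Build with an OpenMP-enabled compiler, e.g. -fopenmp or -fopenmp-simd on GCC.
float weightedSum(const std::vector<float>& weights,
                  const std::vector<float>& inputs)
{
    const int size = static_cast<int>(weights.size());
    float sum = 0.0f;
    #pragma omp simd reduction(+ : sum)
    for (int i = 0; i < size; ++i)
        sum += weights[i] * inputs[i];
    return sum;
}
```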

Tests

  • Improve test stability
  • Datasets
    • Improve the best MNIST neural network
    • Improve the best Fashion-MNIST neural network
    • Improve the best CIFAR-10 neural network

v1.6.0

08 Mar 00:18
2f95093

Learning Algorithms and Optimizations

  • Layer Optimizers
    • Add L1 Regularization
    • Add L2 Regularization (L1 and L2 penalty terms shown below)
    • Fix a bug in Dropout
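
These are the usual penalty terms added to the error E over the weights w; the coefficient λ and the ½ factor on L2 are conventional choices, not confirmed by these notes:

$$
E_{L1} = E + \lambda \sum_i |w_i|, \qquad
\frac{\partial E_{L1}}{\partial w_i} = \frac{\partial E}{\partial w_i} + \lambda\,\operatorname{sign}(w_i)
$$

$$
E_{L2} = E + \frac{\lambda}{2} \sum_i w_i^{2}, \qquad
\frac{\partial E_{L2}}{\partial w_i} = \frac{\partial E}{\partial w_i} + \lambda\,w_i
$$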

Documentation

  • Update documentation

v1.5.0

18 Feb 17:41
38d4f3e

Neural Network Models and Architectures

  • Layers
    • Add MaxPooling layer (see the sketch below)
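
A generic sketch of 2x2 max pooling with stride 2 on a single channel; the library's actual kernel size and stride options are not described in these notes.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// 2x2 max pooling with stride 2 on one channel stored row by row.
// Assumes even width and height. Illustrative sketch only.
std::vector<float> maxPool2x2(const std::vector<float>& input,
                              std::size_t width, std::size_t height)
{
    const std::size_t outWidth = width / 2;
    const std::size_t outHeight = height / 2;
    std::vector<float> output(outWidth * outHeight);
    for (std::size_t y = 0; y < outHeight; ++y)
        for (std::size_t x = 0; x < outWidth; ++x)
        {
            const std::size_t i = 2 * y * width + 2 * x;
            output[y * outWidth + x] = std::max(
                std::max(input[i], input[i + 1]),
                std::max(input[i + width], input[i + width + 1]));
        }
    return output;
}
```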

Tests

  • Dataset tests
    • Add a pre-trained neural network for MNIST with the highest accuracy
    • Add a pre-trained neural network for Fashion-MNIST with the highest accuracy
    • Add a pre-trained neural network for CIFAR-10 with the highest accuracy

v1.4.0

12 Jan 17:50
13ca1bd

Learning Algorithms and Optimizations

  • Add Batch
  • Add synchronous training function
  • Avoid evaluating twice in a row

Code Improvement and Optimization

  • Manage NaN values during training
  • Use a C++ template for the neuron class

Documentation

  • Update documentation

v1.3.0

15 Nov 12:37
a0e0e91

Learning Algorithms and Optimizations

  • Layer Optimizers
    • Add Dropout (see the sketch below)
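
Dropout is standard, but these notes do not say whether the layer rescales at training time or at inference; the sketch below uses the inverted form, with all names illustrative.

```cpp
#include <random>
#include <vector>

// Inverted dropout: zero each activation with probability p during training
// and scale survivors by 1/(1-p), so no rescaling is needed at inference.
void applyDropout(std::vector<float>& activations, float p, std::mt19937& rng)
{
    std::bernoulli_distribution drop(p);
    for (float& a : activations)
        a = drop(rng) ? 0.0f : a / (1.0f - p);
}
```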

Tests

  • Improve GitHub Actions
    • Enable stricter compilation options
    • Correctly compile code with GCC on Linux and MSVC on Windows

v1.2.2

11 Sep 23:55
043df1e

Documentation

  • Add full GitHub Pages documentation
  • Update README

v1.2.1

28 Aug 13:06
71b81e8

Learning Algorithms and Optimizations

  • Back-propagation algorithms
    • Fix the error calculation: the value used to train the neural network is now correctly the derivative of the squared error (see the derivation below).
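
For reference, with network output ŷ and expected value y (any constant factor, such as a ½ in the loss, is a convention these notes do not specify):

$$
E = (y - \hat{y})^{2}, \qquad
\frac{\partial E}{\partial \hat{y}} = -2\,(y - \hat{y})
$$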

v1.2.0

14 Aug 10:08
1a739e5

Neural Network Models and Architectures

  • Layers
    • Add GRU layer
  • Neurons
    • Add Gated Recurrent Unit (gate equations below)
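
For reference, the standard Gated Recurrent Unit equations with update gate z, reset gate r, input x_t, and hidden state h_t; some implementations swap the roles of z_t and 1 − z_t in the final interpolation:

$$
\begin{aligned}
z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z) \\
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) \\
\tilde{h}_t &= \tanh\!\left(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\right) \\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
\end{aligned}
$$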