Releases · MatthieuHernandez/StraightforwardNeuralNetwork
v1.8.0
Neural Network Models and Architectures
- Layers
  - Rework all filter layers (Convolution, LocallyConnected, MaxPooling)
  - Change the shape of the Input layer to (C, X, Y) instead of (X, Y, C) (see the indexing sketch after this section)
- Neurons
  - Fix learning for the neurons in convolutional layers
- Features
  - Update Boost to 1.80.0
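With the new (C, X, Y) Input shape the channel index is outermost rather than innermost. A minimal sketch of what that means when indexing a flat row-major buffer, with a hypothetical helper (this is not the library's API):

```cpp
#include <cstddef>
#include <vector>

// Index into a flat buffer laid out channel-first, matching the new
// (C, X, Y) shape. Under the old (X, Y, C) layout the channel index
// was innermost instead.
inline std::size_t indexCXY(std::size_t c, std::size_t x, std::size_t y,
                            std::size_t sizeX, std::size_t sizeY)
{
    return c * sizeX * sizeY + x * sizeY + y;
}

int main()
{
    const std::size_t channels = 3, sizeX = 28, sizeY = 28;
    std::vector<float> input(channels * sizeX * sizeY, 0.0f);
    input[indexCXY(2, 5, 7, sizeX, sizeY)] = 1.0f; // position (5, 7) of channel 2
}
```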
Learning Algorithms and Optimizations
- Optimizers
  - Adapt SGD to the new filter layers
  - Implement Softmax as a LayerOptimizer (see the sketch below)
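For reference, the operation a Softmax layer optimizer applies is the standard softmax over a layer's outputs. A standalone, numerically stable sketch (not the library's implementation):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Numerically stable softmax: subtracting the maximum before
// exponentiating prevents overflow for large activations.
void softmax(std::vector<float>& outputs)
{
    const float maxValue = *std::max_element(outputs.begin(), outputs.end());
    float sum = 0.0f;
    for (float& o : outputs)
    {
        o = std::exp(o - maxValue);
        sum += o;
    }
    for (float& o : outputs)
        o /= sum;
}
```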
Tests
- Improve test stability
- Add tests on filter layers
- Datasets
  - Improve the best MNIST neural network
  - Improve the best Fashion-MNIST neural network
  - Improve the best CIFAR-10 neural network
v1.7.1
Neural Network Models and Architectures
- Layers
  - Improve 2D filter layers
- Neurons
  - The bias is no longer constant but is always initialized to 1
- Features
  - Add a function to display 2D filter layers as bitmaps (see the sketch below)
  - Add a function to display 2D input data as bitmaps
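As an illustration of what displaying a layer as a bitmap involves, here is a minimal, hypothetical sketch that rescales a 2D weight matrix to [0, 255] and writes it as a grayscale PGM file; the library's actual display function may work differently:

```cpp
#include <algorithm>
#include <fstream>
#include <vector>

// Write a 2D weight matrix as a grayscale PGM image, mapping the
// smallest weight to black and the largest to white.
void writePgm(const std::vector<std::vector<float>>& weights, const char* path)
{
    float lo = weights[0][0];
    float hi = weights[0][0];
    for (const auto& row : weights)
        for (float w : row)
        {
            lo = std::min(lo, w);
            hi = std::max(hi, w);
        }
    const float range = (hi > lo) ? (hi - lo) : 1.0f;
    std::ofstream file(path);
    file << "P2\n" << weights[0].size() << ' ' << weights.size() << "\n255\n";
    for (const auto& row : weights)
    {
        for (float w : row)
            file << static_cast<int>(255.0f * (w - lo) / range) << ' ';
        file << '\n';
    }
}
```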
Learning Algorithms and Optimizations
- Layer Optimizers
  - Add Error Multiplier
Data
- Normalize data to [0, 1] instead of [-1, 1] (see the sketch below)
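A small sketch of the change, assuming plain min-max scaling (the previous behavior mapped the same range onto [-1, 1]):

```cpp
#include <vector>

// Min-max normalization to [0, 1]. The old behavior was
// 2 * (value - min) / (max - min) - 1, which lands in [-1, 1].
void normalizeTo01(std::vector<float>& data, float min, float max)
{
    for (float& value : data)
        value = (value - min) / (max - min);
}
```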
Tests
- Improve test stability
- Datasets
  - Improve the best MNIST neural network
  - Improve the best Fashion-MNIST neural network
  - Improve the best CIFAR-10 neural network
v1.7.0
Neural Network Models and Architectures
- Layers
  - Fix Convolution backpropagation
  - Fix MaxPooling backpropagation
- Activation Functions
  - Fix ReLU
  - Add GELU (see the sketch below)
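For reference, the exact GELU is x · Φ(x), with Φ the standard normal CDF; its derivative, needed for backpropagation, is Φ(x) + x · φ(x). A standalone sketch, independent of the library's implementation:

```cpp
#include <cmath>

// Exact GELU: x * Phi(x), with Phi the standard normal CDF.
float gelu(float x)
{
    return 0.5f * x * (1.0f + std::erf(x / std::sqrt(2.0f)));
}

// Derivative for backpropagation: Phi(x) + x * phi(x),
// with phi the standard normal PDF.
float geluDerivative(float x)
{
    const float pdf = std::exp(-0.5f * x * x) / std::sqrt(2.0f * 3.14159265f);
    const float cdf = 0.5f * (1.0f + std::erf(x / std::sqrt(2.0f)));
    return cdf + x * pdf;
}
```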
Learning Algorithms and Optimizations
- Learning
  - Fix the MAE stop condition
  - Improve NaN detection
- Optimizations
  - Implement SIMD optimization with OpenMP (see the sketch below)
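A sketch of what OpenMP SIMD vectorization can look like on a neuron's weighted sum, combined with the kind of NaN check mentioned above; compile with -fopenmp or -fopenmp-simd. This is illustrative, not the library's code:

```cpp
#include <cmath>
#include <cstddef>
#include <stdexcept>
#include <vector>

// Weighted sum vectorized with OpenMP's SIMD directive, followed by a
// NaN check so a diverging training run fails fast instead of silently.
float weightedSum(const std::vector<float>& inputs,
                  const std::vector<float>& weights)
{
    float sum = 0.0f;
    #pragma omp simd reduction(+ : sum)
    for (std::size_t i = 0; i < inputs.size(); ++i)
        sum += inputs[i] * weights[i];
    if (std::isnan(sum))
        throw std::runtime_error("NaN detected during training");
    return sum;
}
```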
Tests
- Improve test stability
- Datasets
  - Improve the best MNIST neural network
  - Improve the best Fashion-MNIST neural network
  - Improve the best CIFAR-10 neural network
v1.6.0
Learning Algorithms and Optimizations
- Layer Optimizers
  - Add L1 Regularization
  - Add L2 Regularization (see the sketch below)
  - Fix a bug in Dropout
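For reference, this is how L1 and L2 penalties typically enter a weight update: L1 shrinks each weight toward zero by a constant amount (via its sign), L2 proportionally to its magnitude. Function and parameter names are illustrative:

```cpp
#include <vector>

// Apply the regularization part of a gradient step:
// L1 contributes lambda1 * sign(w), L2 contributes 2 * lambda2 * w.
void applyRegularization(std::vector<float>& weights, float learningRate,
                         float lambda1, float lambda2)
{
    for (float& w : weights)
    {
        const float sign = (w > 0.0f) ? 1.0f : (w < 0.0f ? -1.0f : 0.0f);
        w -= learningRate * (lambda1 * sign + 2.0f * lambda2 * w);
    }
}
```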
Documentation
- Update documentation
v1.5.0
Neural Network Models and Architectures
- Layers
  - Add MaxPooling layer (see the sketch below)
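A minimal sketch of the forward pass of 2×2 max pooling on a single channel, independent of the library's implementation:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Each output cell keeps the largest value of its 2x2 input window,
// halving both spatial dimensions.
std::vector<std::vector<float>>
maxPool2x2(const std::vector<std::vector<float>>& input)
{
    const std::size_t rows = input.size() / 2;
    const std::size_t cols = input[0].size() / 2;
    std::vector<std::vector<float>> output(rows, std::vector<float>(cols));
    for (std::size_t r = 0; r < rows; ++r)
        for (std::size_t c = 0; c < cols; ++c)
            output[r][c] = std::max({input[2 * r][2 * c],
                                     input[2 * r][2 * c + 1],
                                     input[2 * r + 1][2 * c],
                                     input[2 * r + 1][2 * c + 1]});
    return output;
}
```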
Tests
- Dataset tests
  - Add a pre-trained neural network for MNIST with the highest accuracy achieved so far
  - Add a pre-trained neural network for Fashion-MNIST with the highest accuracy achieved so far
  - Add a pre-trained neural network for CIFAR-10 with the highest accuracy achieved so far
v1.4.0
Learning Algorithms and Optimizations
- Add Batch (see the sketch below)
- Add a synchronous training function
- Avoid evaluating twice in a row
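A hypothetical sketch of batch learning: per-sample gradients are accumulated and the weights are updated once per batch instead of after every sample. The averaging and all names are assumptions, not the library's API:

```cpp
#include <cstddef>
#include <vector>

// Accumulate the gradients of a whole batch, then take one averaged step.
void trainOnBatch(std::vector<float>& weights,
                  const std::vector<std::vector<float>>& sampleGradients,
                  float learningRate)
{
    std::vector<float> accumulated(weights.size(), 0.0f);
    for (const auto& gradient : sampleGradients)
        for (std::size_t i = 0; i < weights.size(); ++i)
            accumulated[i] += gradient[i];
    for (std::size_t i = 0; i < weights.size(); ++i)
        weights[i] -= learningRate * accumulated[i] / sampleGradients.size();
}
```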
Code Improvement and Optimization
- Manage NaN values during training
- Use a C++ template for the neuron class
Documentation
- Update documentation
v1.3.0
Learning Algorithms and Optimizations
- Layer Optimizers
  - Add Dropout (see the sketch below)
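For reference, the common inverted-dropout scheme: during training each output is zeroed with probability `rate` and the survivors are scaled by 1 / (1 - rate), so inference needs no rescaling. Whether the library uses exactly this variant is not stated here, so treat it as a sketch:

```cpp
#include <random>
#include <vector>

// Inverted dropout for one training pass.
void applyDropout(std::vector<float>& outputs, float rate, std::mt19937& rng)
{
    std::bernoulli_distribution drop(rate);
    for (float& o : outputs)
        o = drop(rng) ? 0.0f : o / (1.0f - rate);
}
```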
Tests
- Improve GitHub Actions
- Enable stricter compilation options
- Correctly compile code with GCC on Linux and MSVC on Windows
v1.2.2
v1.2.1
Learning Algorithms and Optimizations
- Back-propagation algorithms
  - Fix the error calculation: the value used to train the neural networks is now correctly the derivative of the squared error (see the sketch below).
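In symbols: for E = (expected − output)², backpropagation must consume dE/d(output) = −2 · (expected − output), not the raw error. A minimal sketch:

```cpp
// Derivative of the squared error with respect to the neuron's output.
float squaredErrorDerivative(float expected, float output)
{
    return -2.0f * (expected - output);
}
```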
v1.2.0
Neural Network Models and Architectures
- Layers
  - Add GRU layer (see the sketch below)
- Neurons
  - Add Gated Recurrent Unit
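A single-unit sketch of the standard GRU equations; the scalar state and the weight names are illustrative simplifications, not the library's neuron class:

```cpp
#include <cmath>

// One GRU step:
//   z = sigmoid(Wz*x + Uz*h + bz)    (update gate)
//   r = sigmoid(Wr*x + Ur*h + br)    (reset gate)
//   c = tanh(Wc*x + Uc*(r*h) + bc)   (candidate state)
//   h' = (1 - z) * h + z * c
struct GruUnit
{
    float Wz, Uz, bz, Wr, Ur, br, Wc, Uc, bc;
    float h = 0.0f; // hidden state carried across time steps

    static float sigmoid(float v) { return 1.0f / (1.0f + std::exp(-v)); }

    float step(float x)
    {
        const float z = sigmoid(Wz * x + Uz * h + bz);
        const float r = sigmoid(Wr * x + Ur * h + br);
        const float c = std::tanh(Wc * x + Uc * (r * h) + bc);
        h = (1.0f - z) * h + z * c;
        return h;
    }
};
```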