ST-NILM

This repository hosts the official implementation of the method proposed in the paper "ST-NILM: A Wavelet Scattering-Based Architecture for Feature Extraction and Multi-Label Classification in NILM Signals", published in the IEEE Sensors Journal. The architecture is heavily inspired by DeepDFML-NILM and introduces Scattering Transform layers into the original architecture, aiming to reduce the amount of data needed to reach state-of-the-art results.

(Figure: dfml_ScatDFML4)


Dependencies

The model was implemented in Python 3.8 using TensorFlow/Keras. To install all dependencies for this project, run:

$ cd ScatNILM
$ pip install -r requirements.txt

Dict structure

This implementation uses a dictionary structure to define some of the execution parameters. The fields of this dictionary are listed below (a sketch of a complete configuration follows the list):

  • N_GRIDS: Total number of grid positions (default = 5).
  • N_CLASS: Total number of loads in the dataset (default = 26).
  • SIGNAL_BASE_LENGTH: Number of mapped samples in each signal cut (default = 12800, i.e., 50 electrical network cycles).
  • MARGIN_RATIO: Size of the unmapped margins, as a fraction of the signal (default = 0.15).
  • DATASET_PATH: Path to the .hdf5 file containing the samples.
  • TRAIN_SIZE: Fraction of the examples used for training (default = 0.8). Only used when k-fold cross-validation is not performed.
  • FOLDER_PATH: Path to the folder where the model will be stored.
  • FOLDER_DATA_PATH: Path to the *.p files with the already processed data. Usually the same as FOLDER_PATH.
  • N_EPOCHS_TRAINING: Total number of training epochs (default = 250).
  • INITIAL_EPOCH: Initial epoch when resuming a previous training run (default = 0).
  • TOTAL_MAX_EPOCHS: Maximum number of training epochs.
  • SNRdB: Noise level in dB.
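
For reference, a minimal configuration dict could look like the sketch below. The field names and default values come from the list above; the variable name configs, the dataset path, the folder paths, and the TOTAL_MAX_EPOCHS and SNRdB values are placeholders to adapt to your own setup.

# Minimal sketch of the configuration dict. Field names and defaults follow the
# list above; paths, TOTAL_MAX_EPOCHS and SNRdB are placeholder values.
configs = {
    "N_GRIDS": 5,                      # grid positions
    "N_CLASS": 26,                     # loads in the dataset
    "SIGNAL_BASE_LENGTH": 12800,       # samples per signal cut (50 network cycles)
    "MARGIN_RATIO": 0.15,              # unmapped margin as a fraction of the signal
    "DATASET_PATH": "Synthetic_Full_iHall.hdf5",
    "TRAIN_SIZE": 0.8,                 # only used when k-fold is not performed
    "FOLDER_PATH": "trained_model/",
    "FOLDER_DATA_PATH": "trained_model/",
    "N_EPOCHS_TRAINING": 250,
    "INITIAL_EPOCH": 0,
    "TOTAL_MAX_EPOCHS": 250,
    "SNRdB": 60,
}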

Dataset Waveforms

The LIT-Dataset is a public dataset and can be downloaded from this link. However, only MATLAB tools are provided. To use the dataset with this implementation, a *.hdf5 version can be downloaded from the following link. The dataset is stored in this file with the following hierarchical structure:

  • 1 -> Total number of loads in each waveform of this group
    • i -> Array containing all the samples of each waveform
    • events -> Array of event arrays. Each event array has the same length as its waveform: a 0 means no event at that sample, a 1 marks an ON event at the sample with the same index, and a -1 marks an OFF event (see the decoding sketch after this structure).
    • labels -> Array of connected loads. For each waveform, the connected loads are listed in the order of their events, so a waveform with a single appliance "A" (one ON and one OFF event) has the labels ["A", "A"].
  • 2
    • i
    • events
    • labels
  • 3
    • i
    • events
    • labels
  • 8
    • i
    • events
    • labels
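
As an illustration of this encoding, the following self-contained sketch decodes a made-up events array together with its labels array (the sample values are hypothetical, not taken from the dataset):

import numpy as np

# Hypothetical events array for one waveform: 0 = no event,
# 1 = ON event at that sample index, -1 = OFF event.
events = np.array([0, 0, 1, 0, 0, -1, 0])
labels = ["A", "A"]  # one appliance "A": one ON and one OFF event, in event order

event_indices = np.flatnonzero(events)  # sample indices where an event occurs
event_types = ["ON" if events[i] == 1 else "OFF" for i in event_indices]

for idx, ev_type, label in zip(event_indices, event_types, labels):
    print(f"sample {idx}: {ev_type} event of load {label}")
# sample 2: ON event of load A
# sample 5: OFF event of load A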

To use this file with this implementation, download it and place the Synthetic_Full_iHall.hdf5 file in the ScatNILM directory.
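
Once the file is in place, it can be inspected with h5py, for example. The sketch below assumes that each field ("i", "events", "labels") inside the numbered groups is stored as an HDF5 dataset that can be read directly into memory:

import h5py

# Walk the hierarchical structure described above. Group names are the
# number of loads ("1", "2", "3", "8"); each group holds "i", "events" and "labels".
with h5py.File("Synthetic_Full_iHall.hdf5", "r") as f:
    for n_loads in f.keys():
        group = f[n_loads]
        waveforms = group["i"][:]       # raw samples of each waveform
        events = group["events"][:]     # 0 = no event, 1 = ON, -1 = OFF
        labels = group["labels"][:]     # connected load labels, in event order
        print(n_loads, waveforms.shape, events.shape, labels.shape)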


How to run

To train, install all dependencies, configure the dictionary structure in the file src/main.py, and run:

$ cd src
$ python3 main.py

There are also a few notebooks in the notebooks folder for model evaluation and some visualization tests.

Tests on Jetson TX1

The tests on Jetson TX1 are detailed in this tutorial.


DeepDFML-NILM

This repository is based on the project DeepDFML-NILM: A New CNN-Based Architecture for Detection, Feature Extraction and Multi-Label Classification in NILM Signals.

Cite

If this work was helpful to you, you can cite it as follows:

@ARTICLE{10436052,
  author={De Aguiar, Everton Luiz and Nolasco, Lucas da Silva and Lazzaretti, André Eugenio and Pipa, Daniel Rodrigues and Lopes, Heitor Silvério},
  journal={IEEE Sensors Journal}, 
  title={ST-NILM: A Wavelet Scattering-Based Architecture for Feature Extraction and Multi-Label Classification in NILM Signals}, 
  year={2024},
  volume={},
  number={},
  pages={1-1},
  keywords={Feature extraction;Convolutional neural networks;Time-frequency analysis;Scattering;Sensors;Data mining;Convolution;Deep learning;Multi-label classification;NILM;Wavelet Scattering},
  doi={10.1109/JSEN.2024.3360188}}

@ARTICLE{Nolasco2022,
  author={Nolasco, Lucas da Silva and Lazzaretti, André Eugenio and Mulinari, Bruna Machado},
  journal={IEEE Sensors Journal}, 
  title={DeepDFML-NILM: A New CNN-Based Architecture for Detection, Feature Extraction and Multi-Label Classification in NILM Signals}, 
  year={2022},
  volume={22},
  number={1},
  pages={501-509},
  doi={10.1109/JSEN.2021.3127322}}

About

ST-NILM is a new integrated architecture based on the Scattering Transform. It combines a DCN (Deep Convolutional Network) with analytical, wavelet-based, non-trained weights, shared by fully connected output networks that perform event detection and multi-label classification of aggregate loads.
