Codes accompanying the paper "Deep learning with transfer functions: new applications in system identification"

forgi86/sysid-transfer-functions-pytorch

Deep learning with transfer functions: New applications in system identification

This repository contains the Python code to reproduce the results of the paper Deep learning with transfer functions: new applications in system identification by Dario Piga, Marco Forgione, and Manas Mejari.

We present a linear transfer function block, endowed with a well-defined and efficient back-propagation behavior for automatic derivative computation. In the dynoNet architecture (already introduced here), linear dynamical operators are combined with static (i.e., memoryless) non-linearities, which can be elementary activation functions applied channel-wise, fully connected feed-forward neural networks, or other differentiable operators.
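
As a minimal illustration of what such a block computes (this is not the repository's implementation, which additionally provides custom back-propagation through the filter coefficients), the forward pass of a transfer function G(q) = B(q)/A(q) followed by a channel-wise nonlinearity can be sketched with scipy. The coefficient values below are purely illustrative:

```python
import numpy as np
from scipy.signal import lfilter

def g_block(u, b, a):
    """Forward pass of a discrete-time transfer function G(q) = B(q)/A(q)."""
    return lfilter(b, a, u)

# Hypothetical second-order filter coefficients (illustrative values only)
b = [0.1, 0.05]   # numerator B(q)
a = [1.0, -0.9]   # denominator A(q), monic

u = np.ones(100)              # step input
y_lin = g_block(u, b, a)      # linear dynamical operator
y = np.tanh(y_lin)            # static (memoryless) nonlinearity
```

With these coefficients the DC gain is B(1)/A(1) = 0.15/0.1 = 1.5, so the step response of the linear block settles at 1.5 before the nonlinearity is applied.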

In this work, we use the differentiable transfer function operator to tackle other challenging problems in system identification. In particular, we consider the problems of:

  1. Learning of neural dynamical models in the presence of colored noise (prediction error minimization method)
  2. Learning of dynoNet models from quantized output observations (maximum likelihood estimation method)

Problem 1 is tackled by extending the prediction error minimization (PEM) method to deep learning models. A trainable linear transfer function block is used to describe the power spectrum of the noise:

(Figure: neural PEM scheme)
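
The core idea of PEM can be sketched as follows: if the output noise is modeled as v = H(q)w with w white, then filtering the simulation residual by the inverse noise model H⁻¹(q) whitens it, and training minimizes the resulting prediction error. A hedged numerical sketch (with made-up first-order coefficients, and the residual standing in for y - G(q)u):

```python
import numpy as np
from scipy.signal import lfilter

# Hypothetical noise model H(q) = C(q)/D(q) (illustrative coefficients)
c = [1.0, 0.5]
d = [1.0, -0.7]

rng = np.random.default_rng(0)
w = rng.standard_normal(10_000)   # white driving noise
v = lfilter(c, d, w)              # colored noise v = H(q) w

# Prediction error e = H^-1(q) (y - G(q) u); here the residual is just v.
# Inverting H amounts to swapping numerator and denominator:
e = lfilter(d, c, v)              # recovers the white sequence w
```

In training, the mean squared prediction error would be minimized jointly over the parameters of the system model G and the noise model H.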


Problem 2 is tackled by training a dynoNet model with a loss function corresponding to the log-likelihood of the quantized observations:

(Figure: maximum likelihood with quantized measurements)
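
For a uniform quantizer with bin width delta and additive Gaussian noise, the likelihood of an observed bin is the Gaussian probability mass falling inside that bin. A minimal sketch of such a log-likelihood (function name, quantizer, and values are illustrative assumptions, not the repository's code):

```python
import numpy as np
from scipy.stats import norm

def quantized_loglik(y_hat, y_q, delta, sigma):
    """Log-likelihood of quantized outputs y_q (bin centers, spacing delta),
    assuming Gaussian noise with std sigma on the unquantized model output y_hat."""
    upper = norm.cdf((y_q + delta / 2 - y_hat) / sigma)
    lower = norm.cdf((y_q - delta / 2 - y_hat) / sigma)
    return np.sum(np.log(upper - lower))

# Illustrative usage with made-up values
y_hat = np.array([0.02, 0.48, 1.01])   # model output (hypothetical)
y_q = np.array([0.0, 0.5, 1.0])        # quantized observations, delta = 0.5
ll = quantized_loglik(y_hat, y_q, delta=0.5, sigma=0.1)
```

Maximizing this quantity (equivalently, minimizing its negative as a training loss) pushes the model output toward the observed quantization bins.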

Folders:

  • torchid: PyTorch implementation of the linear dynamical operator (aka G-block in the paper) used in dynoNet
  • examples: examples using dynoNet for system identification
  • util: definition of the R-square, RMSE, and fit index metrics
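
The exact definitions in util may differ slightly; a common formulation of these three metrics in system identification is:

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination R^2."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

def rmse(y, y_hat):
    """Root mean squared error."""
    return np.sqrt(np.mean((y - y_hat) ** 2))

def fit_index(y, y_hat):
    """Fit index (%), i.e. 100 * (1 - ||y - y_hat|| / ||y - mean(y)||)."""
    num = np.linalg.norm(y - y_hat)
    den = np.linalg.norm(y - np.mean(y))
    return 100.0 * (1.0 - num / den)
```

Note that the fit index and R-square are related by fit = 100 * (1 - sqrt(1 - R^2)), so a perfect model scores R^2 = 1, RMSE = 0, and fit = 100.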

Two examples discussed in the paper are the WH2009 and the Parallel Wiener-Hammerstein benchmarks.

For the WH2009 example, the main scripts are:

  • WH2009_train_colored_noise_PEM.py: training of a dynoNet model with the prediction error method in the presence of colored noise
  • WH2009_test.py: evaluation of the dynoNet model on the original test dataset, computation of metrics, and plots

For the Parallel Wiener-Hammerstein example, the main scripts are:

  • parWH_train_quant_ML.py: training of a dynoNet model with maximum likelihood in the presence of quantized measurements
  • parWH_test.py: evaluation of the dynoNet model on the original test dataset, computation of metrics, and plots

NOTE: the original datasets are not included in this project. They have to be downloaded manually from http://www.nonlinearbenchmark.org and copied into the data sub-folder of the corresponding example.

Software requirements:

Simulations were performed in a Python 3.7 conda environment with

  • numpy
  • scipy
  • matplotlib
  • pandas
  • numba
  • pytorch (version 1.6)

These dependencies may be installed through the commands:

```
conda install numpy scipy pandas numba matplotlib
conda install pytorch torchvision cudatoolkit=10.2 -c pytorch
```

Citing

If you find this project useful, we encourage you to

  • Star this repository ⭐
  • Cite the paper
```
@inproceedings{piga2021a,
  title={Deep learning with transfer functions: new applications in system identification},
  author={Piga, D. and Forgione, M. and Mejari, M.},
  booktitle={Proc. of the 19th IFAC Symposium System Identification: learning models for decision and control},
  year={2021}
}
```
