DropIn-ESN

DropIn-ESN – An ESN implementation that is robust to missing inputs

Matlab code for the Echo State Network model implementing the DropIn method, which makes the neural network robust to missing inputs at test time.
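To illustrate the idea behind DropIn, below is a minimal, illustrative Python/NumPy sketch (not the Matlab code distributed here): during training, each input feature is independently zeroed with some probability, so the trained readout behaves like a committee of sub-models, each of which has learned to predict from a subset of the inputs. The toy task, the `run_reservoir` helper, and all parameter values are my own assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropin_mask(n_inputs, p_drop, rng):
    """DropIn: independently zero each input feature with probability p_drop."""
    return (rng.random(n_inputs) >= p_drop).astype(float)

# Toy ESN setup (all sizes and scalings are illustrative choices).
n_in, n_res, T = 3, 50, 500
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))        # input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))          # reservoir weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # spectral radius < 1

U = rng.standard_normal((T, n_in))                  # input sequence
y = U.sum(axis=1)                                   # toy target: sum of inputs

def run_reservoir(U, mask_fn=None):
    """Drive the reservoir with inputs U, optionally masking inputs per step."""
    x = np.zeros(n_res)
    states = np.empty((len(U), n_res))
    for t, u in enumerate(U):
        if mask_fn is not None:
            u = u * mask_fn()                       # suppress some inputs
        x = np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states

# Train the linear readout with DropIn (inputs randomly suppressed),
# while the targets are still those of the fully observed signal.
X = run_reservoir(U, mask_fn=lambda: dropin_mask(n_in, 0.2, rng))
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)

# At test time one feature is missing (zeroed); the readout still predicts.
U_test = U.copy()
U_test[:, 0] = 0.0
pred = run_reservoir(U_test) @ W_out
```

Here ridge regression plays the role of the ESN readout training; the essential point is only that the state sequence used to fit the readout is generated under random input suppression.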

The code is maintained on the GitHub page of my student Francesco Crecchi, who is to be credited for the implementation. To download the code and the scripts needed to replicate the experiments in the DropIn paper, please go here.

The code is provided as-is, with no warranty or technical support. Please inform the author (Davide Bacciu) if you intend to redistribute the code.

Citation

If you find this code useful, please remember to cite:

Bacciu Davide, Crecchi Francesco, Morelli Davide: DropIn: Making Neural Networks Robust to Missing Inputs by Dropout. Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN 2017), IEEE, 2017, ISBN: 978-1-5090-6182-2.

BibTeX

@conference{ijcnn2017,
title = {DropIn: Making Neural Networks Robust to Missing Inputs by Dropout},
author = {Bacciu Davide and Crecchi Francesco and Morelli Davide},
url = {https://arxiv.org/abs/1705.02643},
doi = {10.1109/IJCNN.2017.7966106},
isbn = {978-1-5090-6182-2},
year = {2017},
date = {2017-05-19},
booktitle = {Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN 2017)},
pages = {2080-2087},
publisher = {IEEE},
abstract = {The paper presents a novel, principled approach to train recurrent neural networks from the Reservoir Computing family that are robust to missing part of the input features at prediction time. By building on the ensembling properties of Dropout regularization, we propose a methodology, named DropIn, which efficiently trains a neural model as a committee machine of subnetworks, each capable of predicting with a subset of the original input features. We discuss the application of the DropIn methodology in the context of Reservoir Computing models and targeting applications characterized by input sources that are unreliable or prone to be disconnected, such as in pervasive wireless sensor networks and ambient intelligence. We provide an experimental assessment using real-world data from such application domains, showing how the DropIn methodology allows to maintain predictive performances comparable to those of a model without missing features, even when 20%–50% of the inputs are not available.},
keywords = {ambient assisted living, deep learning, Echo state networks, recurrent neural network, reservoir computing},
pubstate = {published},
tppubtype = {conference}
}