DropInESN paper presentation @ IJCNN 2017

Abstract

The paper presents a novel, principled approach to training recurrent neural networks from the Reservoir Computing family so that they are robust to missing input features at prediction time, exploiting the Dropout regularization technique to simulate missing inputs during training.
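The idea described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a standard leaky-free Echo State Network with a ridge-regression readout, and applies a Bernoulli dropout mask to the input features at each time step during training so the readout learns from states produced under simulated missing inputs. All names and hyperparameter values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    """Random ESN weights; recurrent matrix rescaled to the target spectral radius."""
    W_in = rng.uniform(-input_scale, input_scale, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(U, W_in, W, drop_prob=0.0):
    """Collect reservoir states; with drop_prob > 0, zero out input
    features at random each step to simulate missing inputs."""
    n_res = W.shape[0]
    x = np.zeros(n_res)
    states = []
    for u in U:
        if drop_prob > 0.0:
            mask = rng.random(u.shape) >= drop_prob  # keep each feature w.p. 1 - p
            u = u * mask
        x = np.tanh(W_in @ u + W @ x)
        states.append(x)
    return np.array(states)

# Toy data (illustrative only)
T, n_in, n_res = 200, 4, 100
U = rng.standard_normal((T, n_in))
y = np.sin(np.arange(T) / 10.0)

W_in, W = init_reservoir(n_in, n_res)

# Train the readout on states produced under input dropout
X = run_reservoir(U, W_in, W, drop_prob=0.3)
ridge = 1e-4
W_out = y @ X @ np.linalg.inv(X.T @ X + ridge * np.eye(n_res))

# At prediction time, genuinely missing features would simply arrive as zeros
X_test = run_reservoir(U, W_in, W, drop_prob=0.0)
y_pred = X_test @ W_out
```

Because the readout was fitted on states generated with randomly zeroed inputs, its predictions degrade more gracefully when some input features are absent at test time.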

Date
Location
Anchorage, AK, USA