The paper presents a novel, principled approach to training recurrent neural networks from the Reservoir Computing family so that they are robust to missing input features at prediction time, exploiting the Dropout regularization technique to simulate missing inputs during training.
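To make the idea concrete, here is a minimal sketch of the general technique the summary describes: training an Echo State Network readout on inputs whose features are randomly zeroed out, so that absent features at prediction time resemble conditions seen during training. All hyperparameters, the per-sequence masking scheme, and names such as `drop_features` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy leaky Echo State Network (hyperparameters are illustrative)
n_in, n_res = 4, 200
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # rescale to spectral radius 0.9
leak = 0.3

def run_reservoir(U):
    """Collect reservoir states for an input sequence U of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u in U:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

def drop_features(U, p_drop=0.25):
    """Dropout over input features: zero a random subset of features for
    the whole sequence, simulating sensors missing at prediction time.
    (The paper's exact masking scheme may differ.)"""
    mask = (rng.random(U.shape[1]) > p_drop).astype(float)
    return U * mask

# Train the linear readout by ridge regression on feature-dropped inputs
T = 500
U = rng.normal(size=(T, n_in))
y = U[:, :1].cumsum(axis=0)          # toy target
X = run_reservoir(drop_features(U))
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

# At prediction time a missing feature simply arrives as zeros
U_test = U.copy(); U_test[:, 2] = 0.0
y_pred = run_reservoir(U_test) @ W_out
```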