A long short-term memory (LSTM) network can process both single data points and entire sequences of data. An LSTM unit consists of a cell, an input gate, an output gate, and a forget gate. LSTMs were introduced to address the vanishing gradient problem that arises when training conventional RNNs, and they are explicitly designed to avoid the long-term dependency problem. Applications of LSTMs include robot control, time series prediction, speech recognition, rhythm learning, music composition, grammar learning, and handwriting recognition.
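To make the gate structure concrete, the following is a minimal sketch of one forward step of a single LSTM cell in NumPy. The function name, gate ordering, and weight layout are illustrative assumptions, not a reference implementation; real libraries (e.g. PyTorch, TensorFlow) handle this internally with optimized kernels.

```python
import numpy as np

def sigmoid(x):
    # Squashes values into (0, 1); gates use this to decide how much
    # information to pass through.
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One forward step of a single LSTM cell (illustrative layout).

    x       : input vector at the current time step
    h_prev  : previous hidden state
    c_prev  : previous cell state
    W       : weights of shape (4*hidden, input_dim + hidden)
    b       : biases of shape (4*hidden,)
    Assumed gate order in W and b: input, forget, output, candidate.
    """
    hidden = h_prev.shape[0]
    # All four gates share one matrix multiply over [x; h_prev].
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0 * hidden:1 * hidden])  # input gate: admit new info
    f = sigmoid(z[1 * hidden:2 * hidden])  # forget gate: keep old state
    o = sigmoid(z[2 * hidden:3 * hidden])  # output gate: expose state
    g = np.tanh(z[3 * hidden:4 * hidden])  # candidate cell update
    c = f * c_prev + i * g                 # blend old state with update
    h = o * np.tanh(c)                     # gated hidden state / output
    return h, c

# Usage: run a short sequence through the cell step by step.
rng = np.random.default_rng(0)
input_dim, hidden = 3, 2
W = rng.standard_normal((4 * hidden, input_dim + hidden)) * 0.1
b = np.zeros(4 * hidden)
h = np.zeros(hidden)
c = np.zeros(hidden)
for _ in range(5):
    x = rng.standard_normal(input_dim)
    h, c = lstm_step(x, h, c, W, b)
```

Because the forget gate is a multiplicative factor in the cell-state update rather than a repeated matrix multiplication, gradients flowing through `c` decay far more slowly, which is what lets the cell retain information across long sequences.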