A Theory of AI
A Long Short-Term Memory (LSTM) network is a type of recurrent neural network (RNN) designed to model long-term dependencies. LSTMs incorporate special memory cells whose flow of information between inputs, previous states, current states, and outputs is controlled by a set of multiplicative units called gates, each implemented with an activation function. Because the cell state is updated additively, these memory cells preserve the error signal as it is back-propagated through time and layers during training. LSTM networks thus mitigate the vanishing-gradient problem and allow for more uniform credit assignment across time steps.
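The gating mechanism described above can be sketched as a single forward step of an LSTM cell. This is a minimal NumPy illustration, not any particular library's implementation; the weight layout (four stacked gate blocks in one matrix `W`) and all variable names are assumptions made for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h_prev, c_prev, W, b):
    """One LSTM time step (illustrative sketch).

    W has shape (4*hidden, input+hidden): the four gate blocks
    (input, forget, output, candidate) stacked row-wise.
    """
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0:hidden])               # input gate: how much new info enters
    f = sigmoid(z[hidden:2 * hidden])      # forget gate: how much old cell state survives
    o = sigmoid(z[2 * hidden:3 * hidden])  # output gate: how much cell state is exposed
    g = np.tanh(z[3 * hidden:4 * hidden])  # candidate cell update
    c = f * c_prev + i * g                 # additive cell update preserves error flow over time
    h = o * np.tanh(c)                     # hidden state passed to the next step
    return h, c

# Usage: run a few steps on random inputs (sizes are arbitrary).
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in + n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for _ in range(5):
    h, c = lstm_cell(rng.normal(size=n_in), h, c, W, b)
```

The key line is the cell update `c = f * c_prev + i * g`: because the previous cell state is carried forward by multiplication with the forget gate rather than repeatedly squashed through a nonlinearity, gradients can flow across many time steps largely unattenuated.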