Long Short-Term Memory (LSTM) specimen of #neuralnetwork
These are inspired mostly by circuitry, not so much by biology. Each neuron has a memory cell & 3 gates: input, output & forget. Sometimes it’s good to forget: if the network is learning a book & a new chapter begins, it may need to forget some characters from the previous chapter. LSTMs are able to learn complex sequences, such as writing like Shakespeare or composing primitive music.
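A minimal sketch of one LSTM step in plain numpy, just to show how the memory cell & the 3 gates interact (the `lstm_step` name, weight layout & toy dimensions are illustrative, not from the zoo article):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step: input, forget & output gates plus a candidate cell update.

    x: input vector, h_prev: previous hidden state, c_prev: previous memory cell.
    W: weights of shape (4*hidden, input+hidden), b: bias of shape (4*hidden,).
    """
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0*hidden:1*hidden])   # input gate: how much new info to write
    f = sigmoid(z[1*hidden:2*hidden])   # forget gate: how much old memory to erase
    o = sigmoid(z[2*hidden:3*hidden])   # output gate: how much memory to expose
    g = np.tanh(z[3*hidden:4*hidden])   # candidate values for the memory cell
    c = f * c_prev + i * g              # forget part of the old cell, add the new part
    h = o * np.tanh(c)                  # hidden state read out through the output gate
    return h, c

# toy usage: random weights, 3-dim input, 4-dim hidden state, short sequence
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(size=(4 * n_hid, n_in + n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):
    h, c = lstm_step(x, h, c, W, b)
print(h)
```

The forget gate is what lets the cell drop old context (like those previous-chapter characters) while the input gate writes in the new one.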
http://www.asimovinstitute.org/neural-network-zoo/ https://octodon.social/media/DOlkzyab-qfh6QI4ed8