@zzj0402_gitlab if you take a look at the more classic RNNs, they just have long-term dependencies among the data. In other words, just "long memory".
So an LSTM (long short-term memory) controls how much of this memory is kept or discarded at each step, effectively maintaining short-term memory over long stretches, hence the name. Hope that helps.
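To make that concrete, here's a minimal sketch of a single LSTM cell step in plain numpy (names like `lstm_step` and the weight layout are my own for illustration, not from any particular library): the forget gate decides how much of the long-lived cell state survives, while the hidden state is the short-term output at each step.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step. W has shape (4*H, D+H), b has shape (4*H,).

    The four gate blocks are stacked in order: input, forget, output, candidate.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[:H])        # input gate: how much new info to write
    f = sigmoid(z[H:2*H])     # forget gate: how much old cell state to keep
    o = sigmoid(z[2*H:3*H])   # output gate: how much state to expose
    g = np.tanh(z[3*H:])      # candidate values to write
    c = f * c_prev + i * g    # cell state: the long-lived memory
    h = o * np.tanh(c)        # hidden state: the short-term output
    return h, c

# toy usage: run 5 steps over random inputs
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(D), h, c, W, b)
```

Because `c` is updated additively (`f * c_prev + i * g`) rather than being squashed through an activation every step, gradients along it decay far less, which is what lets the short-term memory persist over long sequences.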
I'm trying to find out information about ML implementations in order to improve on them. The reason I ask about encodings is that I want to apply one of my generable data encodings to these algorithms. The main goal of the generable data encoding is to lower memory usage, which has the positive side effect of also lowering CPU usage.