Stabilize Sequential Data Representation via Attraction Module
Abstract
Artificial intelligence systems operating in the sequential decision-making paradigm inevitably require effective spatio-temporal processing. Memory models for such systems must not only memorize the observed data stream but also encode it so that dissimilar sequences can be separated and similar ones consolidated. Moreover, for solving complex problems, it is advantageous to treat sequences as unit abstractions, which imposes restrictions on the topology of the representation space and on the information contained in the representations themselves. In this paper, we propose a method for encoding sequences that allows efficient memorization while retaining the degree of similarity between sequences. Our approach is based on the combination of a biologically inspired temporal memory and a spatial attractor that stabilizes temporal coding. Experiments on synthetic data confirm the coding efficiency and identify promising directions for further development of the method.
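To make the described pipeline concrete, the sketch below illustrates one possible reading of the architecture: a temporal memory produces a sparse code for each sequence step, and an attractor module iteratively pulls noisy codes back toward stored patterns, stabilizing the representation. This is a minimal illustration under assumed choices (a toy top-k temporal encoder and a Hopfield-style Hebbian attractor); all names, dimensions, and parameters are hypothetical and are not taken from the paper.

```python
import numpy as np

# Illustrative sketch, not the paper's implementation: a toy temporal memory
# emits sparse binary codes, and a Hopfield-style attractor stabilizes them.

rng = np.random.default_rng(0)
DIM = 256  # code dimensionality (assumed)
K = 20     # number of active units per code (assumed)


def temporal_memory_encode(step_input: np.ndarray, prev_code: np.ndarray) -> np.ndarray:
    """Toy stand-in for a temporal memory: mix the current input with the
    previous code and keep the K most active units (sparse binary code)."""
    drive = step_input + 0.5 * prev_code + 0.05 * rng.standard_normal(DIM)
    code = np.zeros(DIM)
    code[np.argsort(drive)[-K:]] = 1.0
    return code


class AttractorModule:
    """Hebbian (Hopfield-like) attractor: stored codes become fixed points,
    so corrupted versions of a learned code converge back to it."""

    def __init__(self, dim: int):
        self.W = np.zeros((dim, dim))

    def store(self, code: np.ndarray) -> None:
        c = 2 * code - 1  # map {0,1} -> {-1,+1}
        self.W += np.outer(c, c) / len(c)
        np.fill_diagonal(self.W, 0.0)

    def stabilize(self, code: np.ndarray, steps: int = 5) -> np.ndarray:
        s = 2 * code - 1
        for _ in range(steps):
            s = np.sign(self.W @ s)
            s[s == 0] = 1  # break ties toward +1
        return (s + 1) / 2  # back to {0,1}


# Usage: encode a short random sequence, store each step's code,
# then recover the last code from a corrupted copy.
attractor = AttractorModule(DIM)
prev = np.zeros(DIM)
for _ in range(5):
    prev = temporal_memory_encode(rng.standard_normal(DIM), prev)
    attractor.store(prev)

flip = rng.choice(DIM, size=10, replace=False)
noisy = prev.copy()
noisy[flip] = 1 - noisy[flip]
recovered = attractor.stabilize(noisy)
print("overlap with stored code:", int(recovered @ prev))
```

The design choice illustrated here is the division of labor: the temporal component carries sequence context forward, while the attractor enforces a stable, discrete-like representation space in which similar sequences map to nearby fixed points.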