Source
SenSys
DATE OF PUBLICATION
05/06/2025
Authors
Petr Ivanov, Maria Shtark, Alexander Kozhevnikov, Ilya Makarov
Poster Abstract: Exploring the Autoencoder Sequence Pooling

Abstract

Sequence embeddings are essential for tasks like time series analysis and natural language processing, yet pooling techniques to create compact sequence representations remain underexplored. Poor pooling methods can lead to significant information loss, diminishing the effectiveness of strong feature extractors. In this early results paper, we propose an autoencoder-based sequence pooling approach that leverages autoencoders' ability to compress information and supports pretraining during self-supervised learning. Evaluated on time series data with a transformer-based encoder, our method outperforms traditional pooling techniques, such as mathematical functions and learnable weighted sums.
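To make the idea concrete, here is a minimal NumPy sketch of the contrast the abstract draws: standard poolings (mean, max) collapse a sequence of embeddings by a fixed mathematical function, while an autoencoder pooling learns a compact code trained with a reconstruction objective. This is a hypothetical linear variant over a fixed-length flattened sequence, not the authors' architecture; all names (`ae_pool`, `recon_loss`, the dimensions) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

T, d, k = 16, 32, 8          # sequence length, embedding dim, pooled dim
X = rng.normal(size=(T, d))  # token embeddings from some feature extractor

# Baseline poolings: fixed mathematical functions over the time axis
mean_pooled = X.mean(axis=0)  # shape (d,)
max_pooled = X.max(axis=0)    # shape (d,)

# Hypothetical minimal autoencoder pooling: a linear encoder compresses
# the flattened sequence into a k-dim code; a linear decoder reconstructs it.
W_enc = rng.normal(scale=0.01, size=(T * d, k))
W_dec = rng.normal(scale=0.01, size=(k, T * d))

def ae_pool(X):
    # The code z serves as the pooled sequence representation
    return X.reshape(-1) @ W_enc  # shape (k,)

def recon_loss(X):
    # Self-supervised objective: reconstruct the sequence from the code
    z = ae_pool(X)
    X_hat = (z @ W_dec).reshape(T, d)
    return float(np.mean((X - X_hat) ** 2))

z = ae_pool(X)
```

In the self-supervised pretraining phase described in the abstract, `W_enc` and `W_dec` would be trained to minimize `recon_loss`, after which `z` replaces mean/max pooling as the sequence representation fed to downstream tasks.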