Source
SenSys
Publication date
06.05.2025
Authors
Петр Иванов, Мария Штарк, Александр Кожевников, Илья Макаров

Poster Abstract: Exploring the Autoencoder Sequence Pooling

Abstract

Sequence embeddings are essential for tasks like time series analysis and natural language processing, yet pooling techniques to create compact sequence representations remain underexplored. Poor pooling methods can lead to significant information loss, diminishing the effectiveness of strong feature extractors. In this early results paper, we propose an autoencoder-based sequence pooling approach that leverages autoencoders’ ability to compress information and supports pretraining during self-supervised learning. Evaluated on time series data with a transformer-based encoder, our method outperforms traditional pooling techniques, such as mathematical functions and learnable weighted sums.
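The abstract does not spell out the architecture, but the idea can be sketched as follows: instead of mean/max pooling or a learnable weighted sum, an encoder compresses the whole sequence of token embeddings into one pooled vector, and a decoder reconstructs the sequence from it, giving a self-supervised reconstruction objective. This is a minimal illustrative sketch, assuming a fixed sequence length and simple linear encoder/decoder; the module name, dimensions, and layer choices are hypothetical, not the authors' actual design.

```python
import torch
import torch.nn as nn

class AutoencoderPooling(nn.Module):
    """Hypothetical sketch: compress a (batch, seq_len, d_model) sequence
    of embeddings into a single pooled vector via an autoencoder bottleneck."""

    def __init__(self, d_model: int, seq_len: int, pooled_dim: int):
        super().__init__()
        # Encoder flattens the sequence and maps it to one pooled vector.
        self.encoder = nn.Sequential(
            nn.Flatten(start_dim=1),                      # (B, L*D)
            nn.Linear(seq_len * d_model, pooled_dim),
            nn.Tanh(),
        )
        # Decoder reconstructs the full sequence from the pooled vector,
        # providing the self-supervised reconstruction signal.
        self.decoder = nn.Linear(pooled_dim, seq_len * d_model)
        self.seq_len, self.d_model = seq_len, d_model

    def forward(self, x: torch.Tensor):
        pooled = self.encoder(x)                          # (B, pooled_dim)
        recon = self.decoder(pooled).view(-1, self.seq_len, self.d_model)
        return pooled, recon

# Toy usage: pool a batch of sequence embeddings (e.g. a transformer
# encoder's output) and compute the reconstruction loss.
x = torch.randn(4, 16, 32)                                # (batch, seq_len, d_model)
pool = AutoencoderPooling(d_model=32, seq_len=16, pooled_dim=64)
pooled, recon = pool(x)
loss = nn.functional.mse_loss(recon, x)
print(pooled.shape)                                       # torch.Size([4, 64])
```

The reconstruction loss can be minimized jointly with (or before) the downstream objective, which is what allows this pooling layer to be pretrained in a self-supervised fashion.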
