Source
IEEE Journal of Selected Topics in Signal Processing
DATE OF PUBLICATION
09/26/2024
Authors
Anh-Huy Phan
Andrzej Cichocki
Dmitri Ermilov
Nikolay Kozyrskiy
Igor Vorona
Konstantin Sobolev
How to Train Your Unstable Looped Tensor Network
Deep Neural Networks, Convolutional Neural Networks, Tensor Decomposition, Tensor Chain, Tensor Train, Tensor Networks, Stability, Sensitivity, NN Compression, Numerical Stability
Abstract
This paper addresses the substantial question of how to compress Deep Neural Networks whose convolutional kernels are modeled as looped tensor networks, also known as Tensor Chains (TC), given that such a tensor network (TN) is known to suffer from severe numerical instability. We study the perturbation of this TN, provide an interpretation of the instability in TC, propose novel methods to stabilize the decomposition and keep the tensor network robust, and attain better approximation. Experimental results confirm the superiority of the proposed methods in the compression of well-known convolutional neural networks and in TC decomposition under challenging scenarios.
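To illustrate the kind of representation the abstract refers to, here is a minimal sketch (not the paper's algorithm) of a Tensor Chain, also called a tensor ring: a 4-way convolutional kernel is represented as a loop of four cores, and the parameter counts of the full kernel and its TC form are compared. The kernel shape and rank `r` below are arbitrary choices for demonstration.

```python
import numpy as np

def tc_to_full(cores):
    """Contract Tensor Chain cores into the full tensor, closing the loop with a trace."""
    out = cores[0]  # shape (r, d_0, r)
    for G in cores[1:]:
        # contract the right bond of `out` with the left bond of the next core
        out = np.tensordot(out, G, axes=([-1], [0]))
    # close the loop: trace over the first and last bond dimensions
    return np.trace(out, axis1=0, axis2=-1)

# Example: a conv kernel of shape (C_in, k, k, C_out) = (64, 3, 3, 64), TC rank 8
dims, r = (64, 3, 3, 64), 8
cores = [np.random.randn(r, d, r) / np.sqrt(r * d) for d in dims]
full = tc_to_full(cores)

n_full = int(np.prod(dims))              # parameters in the dense kernel
n_tc = sum(G.size for G in cores)        # parameters in the TC cores
print(full.shape, n_full, n_tc)          # the TC form uses far fewer parameters
```

With these (hypothetical) sizes the dense kernel has 36,864 parameters while the four TC cores hold 8,576, roughly a 4x reduction; the paper's contribution concerns how to compute such decompositions stably, which this sketch does not address.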