A second-order method for fitting the canonical polyadic decomposition with non-least-squares cost

Michiel Vandecappelle, Nico Vervliet, Lieven De Lathauwer


The canonical polyadic decomposition (CPD) can be used to extract meaningful components from a tensor. Most existing optimization methods for fitting the CPD use as cost function the least-squares distance between the tensor and its CPD. While the minimum of this cost function coincides with the maximum likelihood estimator for data with additive i.i.d. Gaussian distributed noise, for other noise distributions, better-suited cost functions exist. For such cost functions, first-order, gradient-based optimization methods have been proposed. However, (approximate) second-order methods, which additionally use information from the Hessian of the cost function to achieve faster convergence, are still largely unexplored. In this paper, we generalize the Gauss–Newton nonlinear least-squares algorithm to twice differentiable entry-wise cost functions. The low-rank structure of the problem is exploited to keep the computational cost low. As a special case, beta-divergence cost functions are examined. We show that quadratic convergence can be obtained close to the solution with a reasonable extra cost in memory and computation time, making the proposed method particularly useful when high accuracy of the decomposition is desired.
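To make the beta-divergence special case mentioned above concrete, the sketch below evaluates the entry-wise beta-divergence between a tensor and its CPD reconstruction with NumPy. This is an illustrative sketch only, not the package's implementation; the function names `cpd_reconstruct` and `beta_divergence` are ours, and the limiting cases beta = 1 (Kullback–Leibler) and beta = 0 (Itakura–Saito) are handled explicitly, while beta = 2 recovers half the squared Frobenius distance, i.e., the usual least-squares cost.

```python
import numpy as np

def cpd_reconstruct(factors):
    # Build the full third-order tensor from CP factor matrices
    # A (I x R), B (J x R), C (K x R): sum over the R rank-1 terms.
    A, B, C = factors
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def beta_divergence(T, M, beta):
    # Entry-wise beta-divergence d_beta(T, M), summed over all entries.
    # beta = 2: half squared error; beta = 1: Kullback-Leibler;
    # beta = 0: Itakura-Saito. T and M must be positive for beta <= 1.
    if beta == 1:
        return np.sum(T * np.log(T / M) - T + M)
    if beta == 0:
        return np.sum(T / M - np.log(T / M) - 1)
    return np.sum((T**beta + (beta - 1) * M**beta
                   - beta * T * M**(beta - 1)) / (beta * (beta - 1)))
```

A typical use would be to monitor `beta_divergence(T, cpd_reconstruct(factors), beta)` during optimization; the paper's method additionally exploits the low-rank structure to form Gauss–Newton-type steps for such costs cheaply.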

Code description

This package provides an implementation of the canonical polyadic decomposition methods for general cost functions discussed in the paper, as well as a tutorial on how to use these methods and the files used to generate the experiments from the paper.


M. Vandecappelle, N. Vervliet, and L. De Lathauwer, "A second-order method for fitting the canonical polyadic decomposition with non-least-squares cost," IEEE Transactions on Signal Processing, vol. 68, pp. 4454–4465, Aug. 2020.

Download code

This repository can be cited as:
S. Hendrikx, M. Boussé, N. Vervliet, M. Vandecappelle, R. Kenis, and L. De Lathauwer, Tensorlab⁺, available online, version of Dec. 2022 downloaded from https://www.tensorlabplus.net.