Displaying 41 – 60 of 92


Gradient descent and fast artificial time integration

Uri M. Ascher, Kees van den Doel, Hui Huang, Benar F. Svaiter (2009)

ESAIM: Mathematical Modelling and Numerical Analysis

The integration to steady state of many initial value ODEs and PDEs using the forward Euler method can alternatively be considered as gradient descent for an associated minimization problem. Greedy algorithms such as steepest descent for determining the step size are as slow to reach steady state as is forward Euler integration with the best uniform step size. But other, much faster methods using bolder step size selection exist. Various alternatives are investigated from both theoretical and practical...
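As a concrete illustration of the idea in the abstract (a sketch, not the authors' code): for a quadratic objective, forward Euler on the gradient flow x' = -∇f(x) is exactly gradient descent, and a "bolder" step-size rule such as the Barzilai–Borwein choice (one well-known faster alternative; the specific method is an assumption here) reaches steady state far more quickly than a uniform step.

```python
import numpy as np

# Sketch: integrating x' = -grad f(x) to steady state with forward Euler
# is gradient descent x_{k+1} = x_k - h_k * grad f(x_k).
# For f(x) = 0.5 x^T A x - b^T x (A SPD), the steady state solves A x = b.

def grad(A, b, x):
    return A @ x - b

def fixed_step_descent(A, b, h, iters):
    # Forward Euler with a uniform step size h; the best uniform choice
    # is h = 2 / (lambda_min + lambda_max).
    x = np.zeros_like(b)
    for _ in range(iters):
        x = x - h * grad(A, b, x)
    return x

def bb_descent(A, b, iters):
    # Barzilai-Borwein: a nonmonotone, "bolder" step-size selection
    # (an example of the faster alternatives alluded to; assumption).
    x = np.zeros_like(b)
    g = grad(A, b, x)
    h = 1.0 / np.linalg.norm(A, 2)  # conservative first step
    for _ in range(iters):
        x_new = x - h * g
        g_new = grad(A, b, x_new)
        if np.linalg.norm(g_new) < 1e-14:
            return x_new
        s, y = x_new - x, g_new - g
        h = (s @ s) / (s @ y)  # BB1 step; s^T y = s^T A s > 0 for SPD A
        x, g = x_new, g_new
    return x
```

On an ill-conditioned system, the BB iterate typically reaches a given residual in far fewer steps than the best uniform step size allows, which is the phenomenon the abstract describes.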

Low rank Tucker-type tensor approximation to classical potentials

B. Khoromskij, V. Khoromskaia (2007)

Open Mathematics

This paper investigates best rank-(r_1, ..., r_d) Tucker tensor approximation of higher-order tensors arising from the discretization of linear operators and functions in ℝ^d. Super-convergence of the best rank-(r_1, ..., r_d) Tucker-type decomposition with respect to the relative Frobenius norm is proven. Dimensionality reduction by the two-level Tucker-to-canonical approximation is discussed. Tensor-product representation of basic multi-linear algebra operations is considered, including inner, outer...
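To make the rank-(r_1, ..., r_d) Tucker format concrete (a sketch under assumptions, not the paper's algorithm): the truncated higher-order SVD (HOSVD) is a standard quasi-optimal way to compute a Tucker approximation, obtained by truncating the SVD of each mode unfolding.

```python
import numpy as np

# Sketch: rank-(r_1, ..., r_d) Tucker approximation via truncated HOSVD.
# T is approximated by a small core G multiplied by one orthonormal
# factor matrix U_k per mode.

def unfold(T, mode):
    # Mode-k unfolding: move axis `mode` to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, M, mode):
    # Mode-k product: contract M with axis `mode` of T.
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1),
                       0, mode)

def hosvd(T, ranks):
    # Factor matrices: leading r_k left singular vectors of each unfolding.
    U = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
         for k, r in enumerate(ranks)]
    # Core tensor: project T onto the factor subspaces.
    G = T
    for k, Uk in enumerate(U):
        G = mode_mult(G, Uk.T, k)
    return G, U

def reconstruct(G, U):
    T = G
    for k, Uk in enumerate(U):
        T = mode_mult(T, Uk, k)
    return T
```

The truncated HOSVD is not the *best* rank-(r_1, ..., r_d) approximation the paper analyzes, but it is within a factor of √d of it in the Frobenius norm and is the usual starting point for iterative refinement.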
