### A characterization of the approximation order for multivariate spline spaces


The paper presents an iterative method for special Chebyshev approximations whose order of convergence is $\ge 2$. Somewhat comparable results, based on a different idea, are found in [1] and [2].

Shifting a numerically given function $b_1 e^{a_1 t}+\cdots+b_n e^{a_n t}$, we obtain a fundamental matrix of the linear differential system $\dot{y}=Ay$ with a constant matrix $A$. From the fundamental matrix we compute $A$; the eigenvalues of $A$ yield $a_1,\dots,a_n$, and the least-squares method then determines $b_1,\dots,b_n$.
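A closely related matrix-pencil (Prony-type) scheme in the same spirit can be sketched as follows: shifted sample matrices play the role of the fundamental matrix, the eigenvalues of the resulting transition matrix give $e^{a_i h}$, and the amplitudes $b_i$ follow from a least-squares fit. The function name `fit_exponentials`, the uniform step `h`, and the shifted-matrix construction are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def fit_exponentials(samples, n, h):
    """Recover a_i, b_i in f(t) = sum_i b_i exp(a_i t) from equispaced
    samples f_k = f(k*h), k = 0, ..., m-1 (Prony-type sketch)."""
    samples = np.asarray(samples, dtype=float)
    m = len(samples)
    rows = m - n
    # Shifted sample matrices: H1 is H0 advanced by one time step.
    H0 = np.array([samples[k:k + n] for k in range(rows)])
    H1 = np.array([samples[k + 1:k + 1 + n] for k in range(rows)])
    # Transition matrix C with H0 @ C ~ H1 (least squares);
    # its eigenvalues are exp(a_i * h).
    C, *_ = np.linalg.lstsq(H0, H1, rcond=None)
    lam = np.linalg.eigvals(C).astype(complex)
    a = np.log(lam) / h
    # Amplitudes b_i by linear least squares against the recovered modes.
    t = h * np.arange(m)
    V = np.exp(np.outer(t, a))
    b, *_ = np.linalg.lstsq(V, samples.astype(complex), rcond=None)
    return a, b
```

The transition matrix exists whenever the $a_i$ are distinct and all $b_i \neq 0$, since the shifted sample matrices then factor through an invertible Vandermonde matrix.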

The necessity of computing large sparse Hessian matrices has given rise to many methods for approximating them effectively by differences of gradients. We adopt the so-called direct methods for this problem, which we faced when developing programs for nonlinear optimization. A new approach based on symmetric sequential coloring is described. Numerical results illustrate the differences between this method and the popular Powell-Toint method.
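The underlying direct-method idea can be illustrated with a simple column-grouping scheme: columns whose sparsity patterns share no row are structurally orthogonal, so one gradient difference reveals all of them at once. This sketch uses a plain greedy grouping, not the paper's symmetric sequential coloring or the Powell-Toint method; the function names and the forward-difference step are assumptions for illustration.

```python
import numpy as np

def group_columns(pattern):
    """Greedy grouping of structurally orthogonal columns of a boolean
    sparsity pattern: columns in one group share no nonzero row."""
    n = pattern.shape[1]
    rows = [set(np.flatnonzero(pattern[:, j])) for j in range(n)]
    groups = []                      # list of (columns, covered rows)
    for j in range(n):
        for cols, covered in groups:
            if rows[j].isdisjoint(covered):
                cols.append(j)
                covered |= rows[j]
                break
        else:
            groups.append(([j], set(rows[j])))
    return [cols for cols, _ in groups]

def sparse_hessian(grad, x, pattern, eps=1e-6):
    """Estimate a sparse Hessian by forward differences of the gradient,
    one gradient evaluation per column group (direct method)."""
    n = len(x)
    H = np.zeros((n, n))
    g0 = grad(x)
    for cols in group_columns(pattern):
        d = np.zeros(n)
        d[cols] = eps
        dg = (grad(x + d) - g0) / eps
        for j in cols:
            idx = np.flatnonzero(pattern[:, j])
            H[idx, j] = dg[idx]
    return 0.5 * (H + H.T)   # average the two one-sided estimates
```

For a tridiagonal Hessian of dimension $n$ this needs only a few gradient differences instead of $n$, which is the point of the coloring.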

In this paper, we propose a method for approximating the solution of high-dimensional weakly coercive problems formulated in tensor spaces using low-rank approximation formats. The method can be seen as a perturbation of a minimal residual method with a measure of the residual corresponding to the error in a specified solution norm. The residual norm can be designed such that the resulting low-rank approximations are optimal with respect to particular norms of interest, thus allowing one to take...
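A toy sketch of the general idea, not the paper's method: a Richardson (residual-descent) step on $\|\mathcal{A}(X)-B\|$ followed by projection onto a rank-$r$ format via truncated SVD. The Lyapunov-type operator, the step size, and all dimensions below are illustrative assumptions standing in for the weakly coercive tensor-space problem.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 2

# Symmetric positive definite operator X -> M X + X M (Lyapunov type);
# its eigenvalues are mu_i + mu_j in [2, 4] for this choice of M.
M = np.diag(np.linspace(1.0, 2.0, n))
def A(X):
    return M @ X + X @ M

def truncate(X, r):
    # Best rank-r approximation via SVD: the "low-rank format" step.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

# Exact rank-r solution and the corresponding right-hand side.
X_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
B = A(X_true)

# Truncated Richardson iteration: a residual-minimization step,
# then projection onto the rank-r manifold.
omega = 1.0 / 3.0            # optimal 2/(lambda_min + lambda_max)
X = np.zeros((n, n))
for _ in range(200):
    X = truncate(X + omega * (B - A(X)), r)
```

Because the exact solution here has rank $r$, the truncated iteration converges to it; in general the truncation only perturbs the minimal residual iterate, which is the viewpoint the abstract describes.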