Hybrid LSQR is a powerful method for regularizing large-scale discrete inverse problems, where ill-conditioning of the model matrix and ill-posedness of the problem make the solutions highly sensitive to the unknown noise in the data. Hybrid LSQR combines iterative Golub–Kahan bidiagonalization with Tikhonov regularization of the projected problem. While the behavior of the residual norm for pure LSQR is well understood and can be used to construct a stopping criterion,...
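The two ingredients named in the abstract can be sketched in a few lines: run k steps of Golub–Kahan bidiagonalization of A started from b, then apply Tikhonov regularization to the small (k+1) × k projected least-squares problem. This is an illustrative sketch, not the authors' code; the fixed choices of k and the regularization parameter lam are assumptions (hybrid methods normally choose them adaptively).

```python
import numpy as np

def hybrid_lsqr(A, b, k, lam):
    """Sketch of hybrid LSQR: k steps of Golub-Kahan bidiagonalization,
    then Tikhonov regularization of the projected problem.
    Full reorthogonalization is used for numerical stability."""
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    B = np.zeros((k + 1, k))          # lower bidiagonal projected matrix
    beta1 = np.linalg.norm(b)
    U[:, 0] = b / beta1
    for j in range(k):
        # alpha_j v_j = A^T u_j - beta_j v_{j-1}
        w = A.T @ U[:, j]
        if j > 0:
            w -= B[j, j - 1] * V[:, j - 1]
        w -= V[:, :j] @ (V[:, :j].T @ w)          # reorthogonalize
        alpha = np.linalg.norm(w)
        V[:, j] = w / alpha
        B[j, j] = alpha
        # beta_{j+1} u_{j+1} = A v_j - alpha_j u_j
        u = A @ V[:, j] - alpha * U[:, j]
        u -= U[:, :j + 1] @ (U[:, :j + 1].T @ u)  # reorthogonalize
        beta = np.linalg.norm(u)
        B[j + 1, j] = beta
        if beta > 0:
            U[:, j + 1] = u / beta
    # Tikhonov on the projected problem:
    # min_y ||B y - beta1 e1||^2 + lam^2 ||y||^2, solved in augmented form
    rhs = np.zeros(2 * k + 1)
    rhs[0] = beta1
    Baug = np.vstack([B, lam * np.eye(k)])
    y, *_ = np.linalg.lstsq(Baug, rhs, rcond=None)
    return V @ y
```

For k equal to the number of columns, the Krylov subspace generically fills the whole space and the result coincides with the full Tikhonov solution; smaller k is where the projection pays off.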
Linear matrix approximation problems are often solved by total least squares (TLS) minimization. Unfortunately, the TLS solution may not exist in general. The so-called core problem theory brought insight into this effect. Moreover, it simplified the solvability analysis in the case of column rank one by extracting a core problem that always has a unique TLS solution. However, if the rank is larger, the core problem may remain unsolvable in the TLS sense, as shown for the first time by Hnětynková,...
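For a single right-hand side, the classical TLS solution (when it exists) comes from the SVD of the augmented matrix [A, b]; the nonexistence the abstract refers to shows up when the last component of the relevant right singular vector vanishes. A minimal sketch, assuming the generic case of a simple smallest singular value:

```python
import numpy as np

def tls_solve(A, b):
    """Classical single-RHS TLS via the SVD of [A, b].
    Returns x minimizing ||[dA, db]||_F subject to (A + dA) x = b + db,
    assuming the generic solvable case; raises otherwise."""
    n = A.shape[1]
    M = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(M)
    v = Vt[-1]                 # right singular vector of the smallest sigma
    if abs(v[n]) < 1e-14:
        # last component zero: the TLS solution does not exist
        raise ValueError("TLS solution does not exist (nongeneric case)")
    return -v[:n] / v[n]
```

The core-problem reduction discussed in the abstract strips away the parts of the data irrelevant to this construction before the SVD step.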
The total least squares (TLS) and truncated TLS (T-TLS) methods are widely known linear data fitting approaches, often used also in the context of very ill-conditioned, rank-deficient, or ill-posed problems. Regularization properties of T-TLS applied to linear approximation problems were analyzed by Fierro, Golub, Hansen, and O'Leary (1997) through so-called filter factors, which allow the solution to be represented in terms of a filtered pseudoinverse applied to the right-hand side. This paper focuses on the situation...
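The T-TLS solution at truncation level k can be written directly from a block partition of the right singular vectors of [A, b]; the sketch below uses the standard minimum-norm formula x_k = -V12 V22^T / ||V22||^2 in the spirit of the cited 1997 analysis (an illustration, not the paper's code):

```python
import numpy as np

def ttls_solve(A, b, k):
    """Truncated TLS (T-TLS) at truncation level k via the SVD of [A, b].
    V12 is the top n x (n+1-k) block and V22 the bottom row of the
    discarded right singular vectors."""
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    V = Vt.T
    V12 = V[:n, k:]            # top block of discarded singular vectors
    V22 = V[n, k:]             # bottom row, length n + 1 - k
    return -V12 @ V22 / (V22 @ V22)
```

Choosing k below the full column count discards the smallest singular components, which is where the regularizing (filtering) effect analyzed via filter factors comes from.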