# Solving Ill-Conditioned and Singular Linear Systems: A Tutorial on Regularization

SIAM Review, no. 3 (1998): 636–666

It is shown that the basic regularization procedures for finding meaningful approximate solutions of ill-conditioned or singular linear systems can be phrased and analyzed in terms of classical linear algebra that can be taught in any numerical analysis course. Apart from rewriting many known results in a more elementary form, we also der…

We summarize the functional analytic approach in section 2, mainly to give those familiar with the tradition a guide for recognizing what happens in the rest of the paper.

• In many applications of linear algebra, the need arises to find a good approximation x̂ to a vector x ∈ Rn satisfying an approximate equation Ax ≈ y with ill-conditioned or singular A ∈ Rm×n, given y ∈ Rm. Usually, y is the result of measurements contaminated by small errors.
• Section 11 extends the stochastic approach to the situation where the smoothness condition x = Sw is replaced by the condition that some vector Jx, usually composed of suitably weighted finite differences of function values, is reasonably bounded.
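The role of Jx can be made concrete with a minimal sketch (grid size, test functions, and the uniform spacing are illustrative choices, not from the paper; numpy assumed): for samples of a smooth function, a weighted second-difference matrix J gives a moderate ‖Jx‖, while even 1% noise inflates it sharply, which is why a bound on ‖Jx‖ acts as a smoothness condition.

```python
import numpy as np

def second_difference_matrix(n, h):
    """(n-2) x n matrix J with (Jx)[k] = (x[k] - 2x[k+1] + x[k+2]) / h^2."""
    J = np.zeros((n - 2, n))
    for k in range(n - 2):
        J[k, k:k + 3] = np.array([1.0, -2.0, 1.0]) / h**2
    return J

n = 50
grid = np.linspace(0.0, 1.0, n)
J = second_difference_matrix(n, h=grid[1] - grid[0])

x_smooth = np.sin(np.pi * grid)                                    # smooth samples
x_noisy = x_smooth + 0.01 * np.random.default_rng(0).standard_normal(n)

# ||Jx|| stays moderate for the smooth vector but is inflated by even 1% noise
print(np.linalg.norm(J @ x_smooth), np.linalg.norm(J @ x_noisy))
```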

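How destructive small measurement errors can be is easy to see in a tiny experiment (the Hilbert matrix is a standard illustrative test case, not an example from this paper; numpy/scipy assumed):

```python
import numpy as np
from scipy.linalg import hilbert

n = 10
A = hilbert(n)              # classic ill-conditioned test matrix
x_true = np.ones(n)
y = A @ x_true

rng = np.random.default_rng(0)
x_naive = np.linalg.solve(A, y + 1e-8 * rng.standard_normal(n))

# a data perturbation of size ~1e-8 throws the naive solution off by orders of magnitude
print(np.linalg.norm(x_naive - x_true))
```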
• The importance of the problem can be seen from a glance at the following probably incomplete list of applications: numerical differentiation of noisy data, nonparametric smoothing of curves and surfaces defined by scattered data, multivariate approximation by radial basis functions, training of neural networks, image reconstruction, deconvolution of sequences and images (Wiener filtering), shape from shading, computer-assisted tomography (CAT, PET), indirect measurements and nondestructive testing, inverse scattering, seismic analysis, parameter identification in dynamical systems, analytic continuation, inverse Laplace transforms, calculation of relaxation spectra, air pollution source detection, solution of partial differential equations with nonstandard data, and so on.
• The construction of an approximation satisfying this error bound depends on knowledge of p and δ, or of another constant involving δ such as δ/ω; by a theorem of Bakushinskii, any technique for choosing regularization parameters in the absence of information about the error level can be defeated by suitably constructed counterexamples whenever the pseudoinverse of T is unbounded. Accordingly, the techniques in use all fail on a small proportion of problems in simulations where the right-hand side is perturbed by random noise.
• If r > 0 or s > 0, we find the (r, s)-generalized cross-validation (GCV) merit function f_rs(t) = log γ_rs(t) − (r + s) log β_rs(t), where (58)–(59) define β_rs(t) = Σ_k λ_k^r μ_k^s / (λ_k + tμ_k) and γ_rs(t) = Σ_k λ_k^r μ_k^s c_k² / (λ_k + tμ_k)². (In an important special case discussed in section 11 below, the global minimizer of the (0,1)-GCV merit function is the familiar GCV estimate for the regularization parameter.)
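For the familiar case, the GCV score can be evaluated cheaply from the SVD of A. The sketch below (test problem, parameter grid, and function names are illustrative, not the paper's notation) scans a logarithmic grid of regularization parameters and keeps the minimizer:

```python
import numpy as np

def gcv(A, y, t):
    """Ordinary cross-validation score for Tikhonov parameter t,
    evaluated via the SVD of A; smaller is better."""
    U, s, _ = np.linalg.svd(A, full_matrices=False)
    c = U.T @ y                    # components of y along the left singular vectors
    f = s**2 / (s**2 + t)          # filter factors of the influence matrix
    m = A.shape[0]
    rss = np.sum(((1 - f) * c)**2) + (y @ y - c @ c)   # squared residual norm
    trace = m - np.sum(f)          # trace of (I - influence matrix)
    return (rss / m) / (trace / m)**2

rng = np.random.default_rng(1)
A = np.vander(np.linspace(0.0, 1.0, 40), 8, increasing=True)  # mildly ill-conditioned
x_true = rng.standard_normal(8)
y = A @ x_true + 1e-3 * rng.standard_normal(40)

ts = np.logspace(-12, 2, 60)                       # logarithmic parameter grid
t_best = ts[np.argmin([gcv(A, y, t) for t in ts])]
x_reg = np.linalg.solve(A.T @ A + t_best * np.eye(8), A.T @ y)
```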

• For a well-posed data fitting problem, i.e., one with a well-conditioned normal equation matrix A∗A, the least squares estimate has an error of the order of ∆.
• J is usually a matrix of suitably weighted first- or second-order differences that applies to some part of x.
• If V = I, the authors may use the stochastic setting and obtain from Theorem 8.1 the optimal estimator x = Cy = (SS∗A∗A + ∆2I)−1SS∗A∗y = (A∗A + ∆2J ∗J )−1A∗y, and this formula agrees with (66).
• For large-scale problems, (66) can be solved using one Cholesky factorization for each value of t, and the authors show below how these factorizations can be used to find an appropriate regularization parameter.
• Once a good regularization parameter t is determined, the solution x̂ of the least squares problem (66) is found by completing (85) with a back substitution, solving a triangular linear system.
• If one formulates each such constraint as the condition that some linear expression Jνx is assumed to be well scaled and not too large, one may again take account of these constraints as penalty terms in the least squares problem.
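The Cholesky route can be sketched as follows (matrix sizes and names are illustrative, and scipy is assumed; this is a generic Tikhonov solve, not the paper's exact formulation of (85)): one factorization of B_t = A∗A + tJ∗J per trial t, then forward and back substitution.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

def regularized_solve(A, J, y, t):
    """Minimize ||Ax - y||^2 + t ||Jx||^2 via one Cholesky factorization of
    B_t = A^T A + t J^T J, followed by forward and back substitution."""
    B = A.T @ A + t * (J.T @ J)
    L = cholesky(B, lower=True)                   # B_t = L L^T
    z = solve_triangular(L, A.T @ y, lower=True)  # forward substitution: L z = A^T y
    return solve_triangular(L.T, z, lower=False)  # back substitution: L^T x = z

rng = np.random.default_rng(3)
A = rng.standard_normal((30, 10))
J = np.eye(10)
y = rng.standard_normal(30)
x = regularized_solve(A, J, y, 0.5)
```

Changing t only changes B_t, so scanning candidate parameters costs one factorization each.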

• One may assume that the tν are proportional to some known constants, thereby reducing the problem to one with a single regularization parameter.
• The GML criterion generalizes in a natural way; (84)–(86) remain valid, but Lt is a Cholesky factor of Bt, and the vector t of regularization parameters may be found by a multivariate minimization of f00(t).
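The reduction to a single regularization parameter can be sketched like this (the matrices, the ratio c, and the residual-based search are illustrative stand-ins, not the paper's GML criterion): with the tν proportional to known constants, only one scalar remains to be chosen.

```python
import numpy as np

rng = np.random.default_rng(5)
m, n = 40, 20
A = rng.standard_normal((m, n)) / np.sqrt(m)
J1 = np.eye(n)                       # first penalty: the size of x
J2 = np.diff(np.eye(n), axis=0)      # second penalty: first differences of x
x_true = np.sin(np.linspace(0.0, np.pi, n))
delta = 1e-2
y = A @ x_true + delta * rng.standard_normal(m)

c = 2.0  # assumed known ratio t2/t1; with (t1, t2) = t1 * (1, c), one parameter remains

def solve(t1):
    B = A.T @ A + t1 * (J1.T @ J1) + c * t1 * (J2.T @ J2)
    return np.linalg.solve(B, A.T @ y)

# one-dimensional search: pick t1 whose residual best matches the noise level
ts = np.logspace(-8, 1, 80)
t_star = min(ts, key=lambda t1: abs(np.linalg.norm(A @ solve(t1) - y) - delta * np.sqrt(m)))
```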


• Table 1: Failure rates in percent
• Table 2: Number of first, second, and third places