We establish a new perturbation theory for orthogonal polynomials using a
Riemann–Hilbert approach and consider applications in numerical linear algebra
and random matrix theory. This approach shows that the orthogonal polynomials
with respect to two different measures can be compared effectively using the
difference of their Stieltjes transforms on a suitably chosen contour.
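Concretely, for a measure $\mu$ on the real line the Stieltjes transform in
question is the standard object
\[
  s_{\mu}(z) = \int_{\mathbb{R}} \frac{\mathrm{d}\mu(x)}{x - z},
  \qquad z \in \mathbb{C} \setminus \operatorname{supp} \mu,
\]
(written here with one common sign convention; the paper fixes its own
normalization), so two measures $\mu$ and $\nu$ are compared through
$s_{\mu} - s_{\nu}$ along the contour.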
Moreover, when the two measures are close and satisfy suitable regularity conditions,
we use the theta functions of a hyperelliptic Riemann surface to derive
explicit and accurate expansion formulae for the perturbed orthogonal
polynomials.
In contrast to other approaches, a key strength of the methodology is that
estimates can remain valid as the degree of the polynomial grows. The results
are applied to analyze several numerical algorithms from linear algebra,
including the Lanczos tridiagonalization procedure, the Cholesky factorization,
and the conjugate gradient algorithm. As a case study, we investigate these
algorithms applied to a general spiked sample covariance matrix model by
considering the eigenvector empirical spectral distribution and its limits. For
the first time, we give precise estimates on the output of the algorithms,
applied to this wide class of random matrices, as the number of iterations
diverges. In this setting, beyond the first-order expansion, we also derive a
new mesoscopic central limit theorem for the associated orthogonal polynomials
and other quantities relevant to numerical algorithms.
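To fix ideas, here is a minimal, self-contained sketch (in Python; not the
paper's code) of the case study's basic experiment: Lanczos tridiagonalization
applied to a rank-one spiked sample covariance matrix. The function name
\texttt{lanczos} and all parameter values (dimension, sample count, spike
strength, iteration count) are illustrative assumptions. The output $T$ is the
Jacobi matrix whose entries are the three-term recurrence coefficients of the
orthogonal polynomials for the weighted spectral measure of the pair $(W, b)$,
i.e.\ the eigenvector empirical spectral distribution studied in the text.

\begin{verbatim}
# A minimal sketch, assuming a rank-one spiked covariance
# Sigma = I + theta v v^T; not the paper's implementation,
# and all parameter values are illustrative.
import numpy as np

def lanczos(W, b, k):
    """k steps of Lanczos on symmetric W with start vector b.
    Returns the k x k Jacobi (tridiagonal) matrix; textbook
    version without reorthogonalization."""
    n = W.shape[0]
    Q = np.zeros((n, k + 1))
    alpha, beta = np.zeros(k), np.zeros(k)
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = W @ Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        alpha[j] = Q[:, j] @ w            # diagonal recurrence coefficient
        w -= alpha[j] * Q[:, j]
        beta[j] = np.linalg.norm(w)       # off-diagonal recurrence coefficient
        if beta[j] < 1e-12:               # breakdown: invariant subspace found
            k = j + 1
            break
        Q[:, j + 1] = w / beta[j]
    return (np.diag(alpha[:k]) + np.diag(beta[:k - 1], 1)
            + np.diag(beta[:k - 1], -1))

rng = np.random.default_rng(0)
n, m, theta, k = 500, 1000, 4.0, 30       # dimension, samples, spike, iterations
v = np.zeros(n); v[0] = 1.0               # spike direction
Sigma_half = np.eye(n) + (np.sqrt(1 + theta) - 1) * np.outer(v, v)
Y = Sigma_half @ rng.standard_normal((n, m))
W = (Y @ Y.T) / m                         # spiked sample covariance matrix
b = rng.standard_normal(n)                # random starting vector
T = lanczos(W, b, k)
# Ritz values (eigenvalues of T) approximate the extreme eigenvalues of W,
# including the outlier created by the spike when theta is large enough.
print(np.sort(np.linalg.eigvalsh(T))[-3:])
\end{verbatim}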