Large-Scale Optimization of Eigenvalues with Applications to Control Theory
This talk introduces a subspace approach for optimizing the jth largest eigenvalue, for a prescribed j, of a large, Hermitian, analytic matrix-valued function that depends on several parameters. The range of the large matrix-valued function is projected orthogonally onto a small subspace, and its domain is restricted to the same subspace. This yields eigenvalue optimization problems involving small matrix-valued functions. The subspace is then expanded with the eigenvectors of the full problem at the optimal parameter values of the small problem. We prove that this subspace approach converges globally for the minimization of the jth largest eigenvalue in the infinite-dimensional setting. Furthermore, the rate of convergence is at least superlinear for both the minimization and the maximization problems.
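As a rough illustration only (not the implementation presented in the talk), the project–optimize–expand iteration can be sketched in Python for an affine Hermitian family A(ω) = A0 + ω1 A1 + ω2 A2; the dimension, parameter count, optimizer, and helper names below are all illustrative assumptions:

```python
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 60  # "large" dimension (kept modest for illustration)

def herm(n):
    # random Hermitian matrix
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

A0, A1, A2 = herm(n), herm(n), herm(n)

def A(w):
    # Hermitian, analytic matrix-valued function of two parameters
    return A0 + w[0] * A1 + w[1] * A2

def jth_largest(M, j):
    vals = eigh(M, eigvals_only=True)  # ascending order
    return vals[-j]

def subspace_minimize(j, iters=8, tol=1e-8):
    # initial orthonormal subspace basis
    V = np.linalg.qr(rng.standard_normal((n, 2 * j)))[0]
    prev = np.inf
    for _ in range(iters):
        # project range and restrict domain: small matrix-valued function
        reduced = lambda w: jth_largest(V.conj().T @ A(w) @ V, j)
        # optimize the small eigenvalue problem (derivative-free here)
        w = minimize(reduced, np.zeros(2), method="Nelder-Mead").x
        # expand the subspace with an eigenvector of the full problem
        vals, vecs = eigh(A(w))
        V = np.linalg.qr(np.hstack([V, vecs[:, [-j]]]))[0]
        cur = jth_largest(A(w), j)
        if abs(prev - cur) < tol * max(1.0, abs(cur)):
            break
        prev = cur
    return w, cur
```

By Cauchy interlacing, the jth largest eigenvalue of the projected function lies below that of the full function, which is what makes the small problem a useful surrogate for minimization.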
We conclude the talk with a presentation of how the subspace approach can be adapted to the stability radius, as well as the H-infinity norm, of large-scale linear and nonlinear systems, such as delay systems. The H-infinity norm poses an additional challenge, as the subspace projections and restrictions must be performed on the state space.