Abstract
Linear eigenvalue analysis has provided a fundamental framework for many scientific and engineering disciplines, and a vast body of research has consequently been devoted to numerical schemes for computing eigenfunctions. In recent years, research in image processing and machine learning has demonstrated the applicability of nonlinear eigenvalue analysis, specifically based on operators induced by convex functionals. This has provided new insights, a better theoretical understanding, and improved algorithms for image processing, clustering, and classification. The theory of nonlinear eigenvalue problems, however, is still in its early stages. We present a new class of nonlinear flows that can generate nonlinear eigenfunctions, i.e., solutions of T(u) = λu, where T(u) is a nonlinear operator and λ ∈ ℝ is the eigenvalue. We develop the theory for the case in which T(u) is a subgradient element of a regularizing one-homogeneous functional, such as total variation (TV) or total generalized variation (TGV). We focus on a forward flow that simultaneously smooths the solution (with respect to the regularizer) while increasing its 2-norm. An analogous discrete flow and its normalized version are formulated and analyzed; the flows translate into a series of convex minimization steps. In addition, we suggest an indicator that measures the affinity of a function to an eigenfunction, and we relate it to pseudo-eigenfunctions in the linear case.
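For a one-homogeneous functional J, any subgradient p ∈ ∂J(u) satisfies ⟨p, u⟩ = J(u), so an eigenfunction p = λu has eigenvalue λ = J(u)/‖u‖²; the angle between p and u is therefore a natural affinity indicator, analogous to residual measures for pseudo-eigenvectors in the linear case. The sketch below illustrates the two computable ingredients the abstract points to, a convex minimization (proximal) step and an affinity indicator, in the simplest setting of 1-D total variation with a smoothed subgradient. It is a minimal sketch under our own assumptions: the names `tv_grad`, `prox_tv`, and `affinity`, the smoothing parameter `eps`, the step sizes, and the renormalized iteration are illustrative choices, not the specific flow analyzed in the paper.

```python
import numpy as np

# Illustrative 1-D sketch only: all functions and parameter values here
# are our assumptions, not the paper's actual scheme.

def tv_grad(u, eps=1e-3):
    """Gradient of a smoothed 1-D total variation,
    J_eps(u) = sum_i sqrt((u[i+1]-u[i])**2 + eps),
    approximating a subgradient element p in dJ(u)."""
    du = np.diff(u)
    s = du / np.sqrt(du**2 + eps)   # smoothed sign of the jumps
    p = np.zeros_like(u)
    p[:-1] -= s                     # p = D^T s, a discrete negative divergence
    p[1:] += s
    return p

def prox_tv(u, dt, eps=1e-3, iters=500):
    """One convex minimization step of a discrete flow: approximately solve
    v = argmin_v 0.5*||v - u||^2 + dt * J_eps(v)
    by gradient descent with a step size matched to the smoothing."""
    tau = 1.0 / (1.0 + 4.0 * dt / np.sqrt(eps))  # ~ 1 / Lipschitz bound
    v = u.copy()
    for _ in range(iters):
        v -= tau * ((v - u) + dt * tv_grad(v, eps))
    return v

def affinity(u, eps=1e-3):
    """Angle-based affinity indicator in [0, 1]: equals 1 exactly when
    p = lambda * u with lambda > 0, i.e. u solves the eigenvalue problem."""
    p = tv_grad(u, eps)
    return p.dot(u) / (np.linalg.norm(p) * np.linalg.norm(u))

# A normalized discrete flow: smooth with respect to TV, then restore the
# 2-norm. The affinity typically climbs as u approaches an eigenfunction.
rng = np.random.default_rng(0)
u = rng.standard_normal(128)
u -= u.mean()                       # project out the constant null direction
u /= np.linalg.norm(u)
for k in range(50):
    u = prox_tv(u, dt=0.2)
    u /= np.linalg.norm(u)          # normalization step
    if k % 10 == 0:
        print(f"step {k:2d}  affinity = {affinity(u):.4f}")
```

The renormalization after each minimization step loosely mirrors the "normalized version" of the discrete flow mentioned in the abstract (the forward flow itself instead lets the 2-norm grow); how the paper's actual update differs from this generic proximal step is not recoverable from the abstract alone.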
| Original language | English |
| --- | --- |
| Pages (from-to) | 859-888 |
| Number of pages | 30 |
| Journal | Journal of Scientific Computing |
| Volume | 75 |
| Issue number | 2 |
| DOIs | |
| State | Published - 1 May 2018 |
Keywords
- Nonlinear eigenfunctions
- Nonlinear flows
- Nonlinear spectral theory
- One-homogeneous functionals
- Total-variation
- Variational methods
All Science Journal Classification (ASJC) codes
- Software
- General Engineering
- Computational Mathematics
- Theoretical Computer Science
- Applied Mathematics
- Numerical Analysis
- Computational Theory and Mathematics