arXiv:2203.12303

Koopman-based Neural Lyapunov functions for general attractors

Shankar A. Deka, Alonso M. Valle, Claire J. Tomlin

Correct (medium confidence)
Category
Not specified
Journal tier
Specialist/Solid
Processed
Sep 28, 2025, 12:56 AM

Audit review

The paper’s Theorem 1 asserts that if each learned Koopman–Lyapunov basis function V_i satisfies V̇_i = λ_i V_i + ε_i with a uniform error bound |ε_i(x)| ≤ α_i V_i(x)^2 + β_i, then for sufficiently negative eigenvalues there exist a weighted sum V = ∑ a_i V_i and a level c > 0 such that the sublevel set {V ≤ c} is forward invariant.

The candidate solution reconstructs a concrete proof: aggregate the V_i with positive weights, derive the scalar bound V̇ ≤ −γ V + R V^2 + B, optimize the weights via a shape–scale decomposition so that the derivative is negative on the boundary {V = c}, and conclude invariance via a barrier (first-exit) argument. This matches the structure of the theorem and is a standard route to such an invariance result. The paper defers the detailed proof to Appendix A, but its assumptions and statement (equations (6), (8), Theorem 1) align exactly with the model’s steps; moreover, the model supplies an explicit quantitative threshold γ > 2√(∑ α_i β_i) that instantiates the paper’s qualitative phrase “sufficiently negative eigenvalues.”
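The route from the aggregate bound to invariance can be made explicit with a short scalar comparison. The following is a minimal sketch reconstructed from the quantities quoted above, not the paper’s Appendix A argument; the constants R and B are assumed to be the aggregate coefficients produced by the weighted sum, and the reduction of the product RB to ∑ α_i β_i is assumed from the model’s shape–scale optimization step.

% Sketch of the barrier (first-exit) step, assuming the aggregate bound
%   \dot V \le -\gamma V + R V^2 + B   with R, B > 0,
% obtained from the weighted sum V = \sum_i a_i V_i.
\[
  q(v) \;:=\; R v^2 - \gamma v + B, \qquad
  \gamma^2 > 4RB \;\Longrightarrow\; q(v) < 0 \ \text{on } (v_-, v_+),
  \quad v_\pm = \frac{\gamma \pm \sqrt{\gamma^2 - 4RB}}{2R}.
\]
% For any level c in (v_-, v_+), the boundary {V = c} gives
% \dot V \le q(c) < 0, so no trajectory can cross {V = c} outward and the
% sublevel set {V \le c} is forward invariant. If the weights a_i can be
% chosen so that RB \le \sum_i \alpha_i \beta_i (assumed from the model's
% shape--scale step), then \gamma > 2\sqrt{\sum_i \alpha_i \beta_i} suffices.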

Referee report (LaTeX)

\textbf{Recommendation:} minor revisions

\textbf{Journal Tier:} specialist/solid

\textbf{Justification:}

The theorem audited is correct and well-aligned with standard Lyapunov-based invariance reasoning, and it is useful in the paper’s data-driven Koopman setting. However, the detailed proof is deferred to Appendix A, and the hypothesis “sufficiently negative eigenvalues” is stated only qualitatively. Providing an explicit sufficient threshold would improve clarity and utility without altering the substance.
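\textbf{Suggested explicit condition:} one possible quantitative replacement for the phrase “sufficiently negative eigenvalues,” transcribing the threshold reconstructed in the audit above (this assumes real eigenvalues $\lambda_i < 0$ with decay rate $\gamma := \min_i(-\lambda_i)$; the constant in the paper’s Appendix A may differ):

\[
  \gamma \;=\; \min_i\,(-\lambda_i) \;>\; 2\sqrt{\textstyle\sum_i \alpha_i \beta_i}.
\]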