arXiv:2402.15839

Fast-Slow Neural Networks for Learning Singularly Perturbed Dynamical Systems

Daniel A. Serino, Allen Alvarez Loya, J. W. Burby, Ioannis G. Kevrekidis, Qi Tang

incomplete (medium confidence)
Category
Not specified
Journal tier
Strong Field
Processed
Sep 28, 2025, 12:56 AM

Audit review

The paper states Theorem 3.2 (the FSNN is a universal approximator for solutions near the slow manifold) and argues informally that it follows from component-level universality: it cites Theorem 2.6 (the modified Fenichel normal form) and then appeals to universality of each component h, T, B, C, g. It does not, however, provide a rigorous composition-to-flow argument, nor a precise statement of the uniform-in-time, uniform-in-initial-data error metric required for flow approximation. The text asserts that universality is inherited from the components without proving that small component-wise errors yield small vector-field errors, and hence small flow errors via Grönwall; it also never specifies the C¹ closeness of the diffeomorphism needed to control (Dh)⁻¹. The paper's proof of Theorem 3.2 is therefore incomplete.

The candidate model solution supplies the missing stability/continuity argument: it assumes C¹ approximation of the diffeomorphism h, uniform Hurwitz control for T, and uniform approximation of the remaining blocks of v, then bounds V̂ − V and invokes Grönwall to obtain uniform flow closeness on [0, T] × U. However, it also assumes (without citation) that the invertible coupling-flow network can approximate diffeomorphisms in C¹ on compact sets; the paper cites only universality "for invertible maps" and parameterization results, not explicit C¹ density, so this key step remains an unproven hypothesis in the model's proof.

Both are therefore incomplete: the paper for lacking the composition-to-flow proof and metrics, and the model for relying on an unstated, uncited C¹ universality property of the invertible network. Relevant paper loci: Def. 3.1 and Thm. 3.2 (statement and informal justification); Thm. 2.6 (normal form); slow-manifold invariance/attraction (Thms. 2.8–2.9); component claims for h, T, B (Thms. 3.7, 3.9, 3.11); discrete IMEX flow-map discussion (3.21–3.22).
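For reference, the Grönwall step the model solution relies on can be sketched as follows (an editorial sketch under the stated assumptions, not taken from the paper): suppose the learned vector field $\hat{V}$ satisfies $\sup_K |\hat{V} - V| \le \varepsilon$ on a compact, forward-invariant set $K$ containing the trajectories of interest, and $V$ is $L$-Lipschitz on $K$. Then for solutions with common initial data $x(0) = \hat{x}(0) \in U$,
\[
|\hat{x}(t) - x(t)|
\le \int_0^t \bigl|\hat{V}(\hat{x}(s)) - V(\hat{x}(s))\bigr|\,ds
 + \int_0^t \bigl|V(\hat{x}(s)) - V(x(s))\bigr|\,ds
\le \varepsilon t + L \int_0^t |\hat{x}(s) - x(s)|\,ds,
\]
and Grönwall's inequality yields
\[
\sup_{t \in [0,T]} |\hat{x}(t) - x(t)| \le \frac{\varepsilon}{L}\bigl(e^{LT} - 1\bigr),
\]
uniformly over initial conditions in $U$. This is the estimate that converts component-wise (hence vector-field) closeness into the uniform flow closeness Theorem 3.2 requires; the paper would need to verify its hypotheses for the composed FSNN vector field.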

Referee report (LaTeX)

\textbf{Recommendation:} major revisions

\textbf{Journal Tier:} strong field

\textbf{Justification:}

The core idea—hard-wiring Fenichel's normal form into a neural ODE and enforcing an attracting slow manifold—is compelling and well executed empirically. The architectural components (negative Schur parameterization, low-rank bilinear network, invertible coupling flow with bi-Lipschitz control) are thoughtfully designed. However, the flagship theoretical claim (Theorem 3.2) lacks a complete proof: the passage from component-wise universal approximation to uniform flow approximation is not supplied, and necessary hypotheses (e.g., $C^1$ approximation of $h$ so that $(Dh)^{-1}$ is controlled) are unstated. These issues are resolvable within the present framework and, once addressed, will significantly strengthen the paper's correctness and impact.
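For concreteness, the inverse-Jacobian control that a revision should make explicit follows from a standard Neumann-series estimate (an editorial sketch, not part of the paper): if $\|D\hat{h} - Dh\| \le \varepsilon$ on a compact set $K$ with $\varepsilon \, \sup_K \|(Dh)^{-1}\| \le 1/2$, then writing
\[
(D\hat{h})^{-1} - (Dh)^{-1} = (D\hat{h})^{-1}\bigl(Dh - D\hat{h}\bigr)(Dh)^{-1},
\]
the Neumann series gives $\|(D\hat{h})^{-1}\| \le 2\,\|(Dh)^{-1}\|$ on $K$, hence
\[
\sup_K \bigl\|(D\hat{h})^{-1} - (Dh)^{-1}\bigr\| \le 2\,\varepsilon\,\Bigl(\sup_K \|(Dh)^{-1}\|\Bigr)^{2},
\]
so $C^1$ closeness of the approximating diffeomorphism suffices to control the $(Dh)^{-1}$ factor in the transformed vector field.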