2507.18220

Sparse identification of nonlinear dynamics with library optimization mechanism: Recursive long-term prediction perspective

Ansei Yonezawa, Heisei Yonezawa, Shuichi Yahagi, Itsuro Kajiwara, Shinya Kijimoto, Hikaru Taniuchi, Kentaro Murakami

incomplete
medium confidence
Category
math.DS
Journal tier
Specialist/Solid
Processed
Sep 28, 2025, 12:57 AM

Audit review

The paper precisely defines the bilevel SINDy-LOM objective J_ms, the inner sparse-regression map Ξ, the RLT prediction recursion, and the final model (its Eqs. (6)–(11)), but gives no existence/continuity guarantees and explicitly treats the outer problem as nonconvex, to be handled by heuristics such as GA/PSO, with R(v) = γ‖v‖₀ in practice. The candidate solution supplies a correct existence/continuity and sparsity argument under explicit additional assumptions (compact Φ-domain, continuity of the θ_i, nonzero normalizers, deterministic tie-breaking, and optionally a small ℓ2 ridge). These hypotheses are not stated (nor used) in the paper, but under them the candidate's reasoning is sound by standard arguments (finite-support decomposition and Weierstrass) and aligns with the paper's definitions of J_ms and the model recursion.
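A minimal illustrative sketch (not the authors' code) of the bilevel structure described above: an inner map Ξ(v) fitted by sequential thresholded least squares on the sub-library selected by v, and an outer objective J_ms that adds a recursive long-term (RLT) rollout error to the sparsity penalty R(v) = γ‖v‖₀. The one-dimensional cubic library, the thresholds, the explicit-Euler rollout, and the exhaustive outer search (standing in for the paper's GA/PSO) are all assumptions made for this sketch.

import numpy as np

def library(x):
    # Candidate library Phi(x) for a scalar state: [1, x, x^2, x^3] (illustrative choice).
    return np.array([1.0, x, x**2, x**3])

def inner_fit(v, X, dX, threshold=0.1, iters=10):
    # Inner map Xi(v): sequential thresholded least squares on the columns selected by v.
    Phi = np.array([library(x) for x in X])[:, v.astype(bool)]
    xi = np.linalg.lstsq(Phi, dX, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        if (~small).any():
            xi[~small] = np.linalg.lstsq(Phi[:, ~small], dX, rcond=None)[0]
    return xi

def rlt_error(v, xi, X, dt):
    # Recursive long-term prediction: roll the identified model forward from X[0]
    # (explicit Euler here, an assumption) and accumulate squared trajectory error.
    x_hat, err = X[0], 0.0
    for x_true in X[1:]:
        x_hat = x_hat + dt * library(x_hat)[v.astype(bool)] @ xi
        if not np.isfinite(x_hat):
            return np.inf  # diverging rollout: penalize heavily
        err += (x_hat - x_true) ** 2
    return err / (len(X) - 1)

def J_ms(v, X, dX, dt, gamma=0.05):
    # Outer objective: RLT error of the inner solution plus gamma * ||v||_0.
    if not v.any():
        return np.inf
    xi = inner_fit(v, X, dX)
    return rlt_error(v, xi, X, dt) + gamma * np.count_nonzero(v)

# Toy trajectory from dx/dt = -x + 0.5*x^3; exhaustive search over the 15 nonempty
# library subsets stands in for the heuristic (GA/PSO) outer optimizer.
dt, X = 0.01, [1.0]
for _ in range(500):
    X.append(X[-1] + dt * (-X[-1] + 0.5 * X[-1] ** 3))
X = np.array(X)
dX = np.gradient(X, dt)
best = min((np.array([(k >> i) & 1 for i in range(4)]) for k in range(1, 16)),
           key=lambda v: J_ms(v, X, dX, dt))
print("selected columns:", best, " J_ms:", J_ms(best, X, dX, dt))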

Referee report (LaTeX)

\textbf{Recommendation:} minor revisions

\textbf{Journal Tier:} specialist/solid

\textbf{Justification:}

The paper offers a clear, useful bilevel formulation (SINDy-LOM) that addresses a practical gap in SINDy by optimizing the library for long-horizon predictive reliability. The presentation is strong and the empirical validation is compelling. However, it omits minimal theoretical conditions ensuring that the outer objective is well-posed and that inner solutions are selected consistently. Adding a concise set of assumptions and a short remark on existence would strengthen the work without altering its core algorithmic message.
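A possible shape for the requested assumptions and existence remark, sketched from the hypotheses listed in the audit review (the labels (A1)–(A3), the selection set $\mathcal{V}$, and the library size $p$ are illustrative, not the paper's notation):

\begin{itemize}
  \item[(A1)] The data lie in a compact set on which every candidate library function $\theta_i$ is continuous, and the admissible selection set $\mathcal{V} \subseteq \{0,1\}^p$ is finite.
  \item[(A2)] All normalizers entering $J_{\mathrm{ms}}$ are bounded away from zero.
  \item[(A3)] The inner sparse-regression map $\Xi(v)$ is made single-valued by a deterministic tie-breaking rule, or by adding a small $\ell_2$ ridge.
\end{itemize}

\emph{Remark.} Under (A1)--(A3), for each fixed $v$ the inner problem attains a minimizer, and the recursive long-term prediction error is finite and continuous in the data; since $\mathcal{V}$ is finite, $J_{\mathrm{ms}}(v) = \mathrm{RLT}(v) + \gamma \lVert v \rVert_0$ attains its minimum (Weierstrass on each fixed support).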