2205.11483

Learning differential equations from data

K. D. Olumoyin

incomplete · medium confidence
Category
Not specified
Journal tier
Note/Short/Other
Processed
Sep 28, 2025, 12:56 AM

Audit review

The paper introduces a forward-Euler residual loss MSE(θ) = ∑_i |−û_{i+1} + û_i + Δt·N[û_i; θ]|² (its Eq. (6)) and reports that integrating the learned N with an ODE solver reproduces trajectories on clean data but fails under 5% noise. However, it provides no formal identifiability analysis or noise-induced label-perturbation bounds. The model's solution correctly establishes: (i) the characterization of zero-loss minimizers on the training grid for noise-free data (conditional on realizability by the network class); (ii) the uniqueness argument that exact agreement of N with the true right-hand side along the trajectory implies exact reproduction under integration; and (iii) the noise-difference label perturbation δ_i = (ε_{i+1} − ε_i)/Δt, with magnitude at most 2ε/Δt, which explains the empirical failure at 5% noise reported in the paper. The only caveat is that the model's claim MSE_min = 0 tacitly assumes sufficient expressivity/realizability of the network class; this assumption is unstated in the paper and should be made explicit.
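The two quantitative points above, the forward-Euler residual loss and the Δt-amplified noise perturbation, can be illustrated with a short numerical sketch. This is not the paper's code: the exponential-decay trajectory, the stand-in right-hand side, and all numeric values are hypothetical, chosen only to make the bound 2ε/Δt concrete.

```python
import numpy as np

def residual_loss(u_hat, dt, N):
    """Forward-Euler residual MSE(θ) = mean_i |−û_{i+1} + û_i + Δt·N[û_i]|²."""
    r = -u_hat[1:] + u_hat[:-1] + dt * N(u_hat[:-1])
    return np.mean(r ** 2)

# Toy linear dynamics u' = -u with exact clean trajectory u(t) = e^{-t}.
dt, n = 0.01, 1000
t = dt * np.arange(n + 1)
u = np.exp(-t)

# With the true RHS on clean data, the residual is only the O(Δt²)
# Euler discretization error per step, so the loss is near zero.
true_rhs = lambda x: -x
loss_clean = residual_loss(u, dt, true_rhs)

# Bounded noise |ε_i| ≤ ε perturbs the finite-difference labels by
# δ_i = (ε_{i+1} − ε_i)/Δt, whose magnitude is at most 2ε/Δt.
rng = np.random.default_rng(0)
eps = 0.05  # 5% noise amplitude, matching the paper's failing experiment
noise = rng.uniform(-eps, eps, size=n + 1)
delta = np.diff(noise) / dt

print(loss_clean)                 # tiny: discretization error only
print(np.max(np.abs(delta)))     # large: up to 2·0.05/0.01 = 10
```

Note how the perturbation bound 2ε/Δt grows as Δt shrinks: with ε = 0.05 and Δt = 0.01 the labels can shift by up to 10 in magnitude, which makes the reported failure at 5% noise unsurprising.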

Referee report (LaTeX)

\textbf{Recommendation:} major revisions

\textbf{Journal Tier:} note/short/other

\textbf{Justification:}

This short note proposes a simple forward–Euler residual loss for learning ODE dynamics and demonstrates it on the FitzHugh–Nagumo (FHN) system. The empirical results are plausible and align with expectations, but the paper lacks theoretical analysis of identifiability, realizability, and noise sensitivity. Adding precise assumptions and short proofs (as in the model's solution) would significantly improve its rigor and value.