2406.06707
Discovery of differential equations using sparse state and parameter regression
Teddy Meissner, Karl Glasner
correct (high confidence)
- Category: Not specified
- Journal tier: Specialist/Solid
- Processed: Sep 28, 2025, 12:56 AM
- arXiv Links: Abstract ↗ · PDF ↗
Audit review
The paper defines the mixed-regression losses $L_{\lambda}$ and $L_{\lambda,R}$, introduces the smooth $\ell_0$ surrogate $\|\theta\|_{\varepsilon} = \sum_i \bigl(1 - \exp(-\theta_i^2/(2\varepsilon^2))\bigr)$ and notes that $\lim_{\varepsilon\to 0}\|\cdot\|_{\varepsilon} = \|\cdot\|_0$, then uses BIC for model selection and an acceptance rule that updates bestIC only upon strict improvement (eqs. (4), (5), (7), (9) and Algorithm 1). The candidate solution supplies the missing mathematical details: a pointwise proof of the $\varepsilon\to 0^{+}$ limit with monotonicity and an explicit exponential-in-$1/\varepsilon^2$ error bound; explicit first-order gradients via the chain rule (with a selection matrix $S$ formalizing $u|_D$); and a formal statement that the recorded best BIC is nonincreasing and convergent. These are consistent with the paper's setup and claims. One caveat: the candidate's further claim that the sequence becomes constant after finitely many iterations requires an additional assumption (e.g., that each mask's subproblem is solved to a unique global minimum and that accepted masks are never revisited). The paper does not make this claim, so there is no conflict; with this mild caveat, both are correct, with the model providing a more explicit proof of properties the paper states informally.
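For reference, the limiting behavior and error bound mentioned above can be made explicit. The display below is a sketch consistent with the surrogate as quoted in this review; the quantities $m$ and $n_{\neq 0}$ are introduced here for illustration and are not notation from the paper:
\[
\|\theta\|_{\varepsilon} = \sum_{i}\Bigl(1 - e^{-\theta_i^{2}/(2\varepsilon^{2})}\Bigr),
\qquad
0 \;\le\; \|\theta\|_{0} - \|\theta\|_{\varepsilon}
\;=\; \sum_{i:\,\theta_i \neq 0} e^{-\theta_i^{2}/(2\varepsilon^{2})}
\;\le\; n_{\neq 0}\, e^{-m^{2}/(2\varepsilon^{2})},
\]
where $m = \min_{\theta_i \neq 0}|\theta_i|$ and $n_{\neq 0}$ is the number of nonzero entries of $\theta$. Each summand with $\theta_i \neq 0$ increases monotonically to $1$ as $\varepsilon \to 0^{+}$, while the terms with $\theta_i = 0$ vanish, giving pointwise convergence $\|\theta\|_{\varepsilon} \to \|\theta\|_{0}$ at an exponential-in-$1/\varepsilon^{2}$ rate.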
Referee report (LaTeX)
\textbf{Recommendation:} Minor revisions.

\textbf{Journal Tier:} Specialist/solid.

\textbf{Justification:} The manuscript presents a coherent, practical framework for discovering governing equations under noise and incomplete data by combining mixed regression, a smooth $\ell_0$ surrogate, and BIC-based model comparison. The methodology is well motivated and computationally thoughtful (e.g., exploitation of Hessian sparsity). To bolster rigor and clarity, the authors should add short formal statements on the surrogate's limiting behavior and clarify the acceptance and termination conditions of the selection algorithm.
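For illustration, the strict-improvement acceptance rule discussed in the audit could be exercised with a small sketch like the one below; `fit_subproblem`, `bic_score`, and `candidate_masks` are hypothetical stand-ins introduced here, not the paper's Algorithm 1 or its notation.

```python
import math

def greedy_ic_selection(candidate_masks, fit_subproblem, bic_score):
    """Sketch of a strict-improvement acceptance rule over candidate sparsity masks.

    fit_subproblem(mask) returns fitted parameters for that mask;
    bic_score(params, mask) returns the information criterion of the fit.
    The recorded best score is overwritten only on strict improvement,
    so the sequence stored in `history` is nonincreasing.
    """
    best_ic = math.inf
    best_model = None
    history = []  # recorded best-IC value after each candidate is considered
    for mask in candidate_masks:
        params = fit_subproblem(mask)
        ic = bic_score(params, mask)
        if ic < best_ic:         # accept only on strict improvement
            best_ic = ic
            best_model = (mask, params)
        history.append(best_ic)  # nonincreasing by construction
    return best_model, history
```

Because `best_ic` is overwritten only on strict improvement, the recorded sequence in `history` is nonincreasing by construction; a typical `bic_score` would take a form such as n·log(RSS/n) + k·log(n) with k active terms, though the paper's exact criterion should be consulted.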