2506.12498
SINDybrid: automatic generation of hybrid models for dynamic systems
Ulderico Di Caprio, M. Enis Leblebici
correct (medium confidence)
- Category
- Not specified
- Journal tier
- Specialist/Solid
- Processed
- Sep 28, 2025, 12:56 AM
- arXiv Links
- Abstract ↗ · PDF ↗
Audit review
The paper formulates the identification problem with an L1 loss and a per-column sparsity regularizer (P1, their Eq. (8)), and then gives a Mixed-Integer Linear Programming reformulation (P2, their Eq. (9)) using slack variables Y, Z for the absolute values and an auxiliary integer s with constraint Σ_j δ_j − s ≤ 0 and a positive penalty on s; this is the standard linearization and is correct in intent, though no proof is provided in the paper text. The candidate solution supplies the missing optimality argument that at an optimum y = |X_L Ξ − h_exp|, z = |Ξ|, and s = Σ_j δ_j, so the P2 objective reduces exactly to P1. The candidate further proves an exact-support-recovery statement under noiseless realizability and a full-column-rank condition on X_L (an additional assumption not claimed by the paper), which is coherent and mathematically sound as an extension. Overall: the paper's reformulation is correct but informal; the model gives a correct, explicit proof and adds a reasonable uniqueness result under extra hypotheses.
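The linearization discussed above (slacks y, z for the two absolute values, big-M indicator variables δ for column activity, and an integer s bounded below by Σ_j δ_j) can be sketched on a toy instance. This is an illustrative reconstruction using SciPy's HiGHS-backed `milp`, not the paper's code; the big-M bound `M`, the penalty weights `lam1`, `lam2`, and the tiny random library `X` are all assumptions made for the sketch.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Toy instance: n=4 samples, p=2 candidate library columns (assumed data).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))
xi_true = np.array([2.0, 0.0])   # sparse ground truth
h = X @ xi_true                  # noiseless target, so exact fit is feasible

n, p = X.shape
M = 10.0                 # big-M bound on |xi_j| (assumption)
lam1, lam2 = 1e-3, 1e-3  # L1 and cardinality penalty weights (assumption)

# Variable layout: [xi (p), y (n), z (p), delta (p), s (1)]
nv = p + n + p + p + 1
ix = slice(0, p)
iy = slice(p, p + n)
iz = slice(p + n, p + n + p)
id_ = slice(p + n + p, p + n + 2 * p)
is_ = nv - 1

# Objective: sum(y) + lam1 * sum(z) + lam2 * s
c = np.zeros(nv)
c[iy] = 1.0
c[iz] = lam1
c[is_] = lam2

cons = []
# y >= X xi - h  and  y >= -(X xi - h)  (linearized |X xi - h|)
A = np.zeros((n, nv)); A[:, ix] = X;  A[:, iy] = -np.eye(n)
cons.append(LinearConstraint(A, -np.inf, h))
A = np.zeros((n, nv)); A[:, ix] = -X; A[:, iy] = -np.eye(n)
cons.append(LinearConstraint(A, -np.inf, -h))
# z >= xi  and  z >= -xi  (linearized |xi|)
A = np.zeros((p, nv)); A[:, ix] = np.eye(p);  A[:, iz] = -np.eye(p)
cons.append(LinearConstraint(A, -np.inf, 0))
A = np.zeros((p, nv)); A[:, ix] = -np.eye(p); A[:, iz] = -np.eye(p)
cons.append(LinearConstraint(A, -np.inf, 0))
# |xi_j| <= M delta_j  (delta_j = 0 forces column j off)
A = np.zeros((p, nv)); A[:, ix] = np.eye(p);  A[:, id_] = -M * np.eye(p)
cons.append(LinearConstraint(A, -np.inf, 0))
A = np.zeros((p, nv)); A[:, ix] = -np.eye(p); A[:, id_] = -M * np.eye(p)
cons.append(LinearConstraint(A, -np.inf, 0))
# sum_j delta_j - s <= 0, with a positive penalty on s in the objective
A = np.zeros((1, nv)); A[0, id_] = 1.0; A[0, is_] = -1.0
cons.append(LinearConstraint(A, -np.inf, 0))

integrality = np.zeros(nv)
integrality[id_] = 1   # delta binary
integrality[is_] = 1   # s integer

lb = np.full(nv, -np.inf); ub = np.full(nv, np.inf)
lb[ix] = -M; ub[ix] = M
lb[iy] = 0.0
lb[iz] = 0.0
lb[id_] = 0.0; ub[id_] = 1.0
lb[is_] = 0.0; ub[is_] = float(p)

res = milp(c=c, constraints=cons, integrality=integrality,
           bounds=Bounds(lb, ub))
xi_hat = res.x[ix]
print(np.round(xi_hat, 3))  # close to [2, 0]: exact support recovered
```

At the optimum the slacks saturate (y = |X ξ − h|, z = |ξ|, s = Σ_j δ_j), since each carries a strictly positive cost, which is exactly the equivalence argument the audit describes.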
Referee report (LaTeX)
\textbf{Recommendation:} minor revisions \\
\textbf{Journal Tier:} specialist/solid \\
\textbf{Justification:} The mathematical core, namely the $L_1$ formulation (their Eq.~(8)) and its MILP reformulation (their Eq.~(9)), is sound. The paper explains the modeling choices and cites prior uses of similar MILP techniques, but omits a brief optimality argument clarifying that $y$ and $z$ saturate to the absolute values and that $s$ equals the number of active columns at the optimum. Adding these two sentences would complete the argument and remove any doubt about exact equivalence. The experiments are well designed and support the method's utility; tightening the notation and explicitly stating small assumptions (e.g., $\lambda_{1,\xi}>0$) would further strengthen clarity.
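The two-sentence optimality argument the report asks for can be sketched as follows; notation follows the audit's description of P1/P2, and the exact symbols are assumptions since the paper's equations are not reproduced here.

```latex
% Sketch: at any optimum of P2 the slack variables saturate,
% because each carries a strictly positive objective cost and can be
% decreased to its lower bound without violating any constraint:
\begin{align*}
y^\star = \lvert X_L \Xi^\star - h_{\mathrm{exp}} \rvert, \qquad
z^\star = \lvert \Xi^\star \rvert, \qquad
s^\star = \textstyle\sum_j \delta_j^\star .
\end{align*}
% Substituting these values into the P2 objective recovers the P1
% objective term by term, so P1 and P2 share the same optimal \Xi.
```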