arXiv:2404.00199

An Efficient Sparse Identification Algorithm For Stochastic Systems With General Observation Sequences

Ziming Wang, Xinghua Zhu

correct (medium confidence)
Category
Not specified
Journal tier
Strong Field
Processed
Sep 28, 2025, 12:56 AM

Audit review

The paper's Algorithm 1 selects thresholds {α_n} with sqrt(log R_n / λ_min^n) = o(α_n) (Step 0), runs recursive least squares (Step 1), then hard-thresholds the resulting estimate to produce β_{n+1} (Step 2). Under Assumptions 1–2, it proves (i) β_{n+1}(l) → θ(l) a.s. (Theorem 1) and (ii) finite-time correct identification of the zero set, H_{N+1} = H* for all N ≥ N_0(ω) (Theorem 2), relying on the standard LS error bound ∥θ_{n+1} − θ∥² ≤ C_0 · (log R_n)/λ_min^n (Lemma 1) and on the chosen α_n scaling.

The candidate solution assumes the same LS bound and the same weak-excitation condition, chooses α_n with ρ_n := sqrt(log R_n / λ_min^n) = o(α_n), and then proves (a) parameter convergence and (b) finite-time support recovery by showing that zero coordinates are eventually thresholded to 0 while nonzero coordinates stay above α_n. This mirrors the paper's logic and uses the same key ingredients; the argument is slightly more explicit (e.g., it introduces c_min := min{|θ(i)| : θ(i) ≠ 0} > 0) but imposes no extra substantive assumptions. Hence both are correct, and the proofs are substantially the same.
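
Spelled out, the support-recovery dichotomy the candidate uses runs as follows. This is a reconstruction from the quantities named above (it additionally assumes α_n → 0, which is implicit in the step where nonzeros stay above α_n), not a quotation from the paper:

% reconstruction of the two-sided thresholding argument
\[
  \rho_n := \sqrt{\frac{\log R_n}{\lambda_{\min}^n}}, \qquad
  |\theta_{n+1}(l) - \theta(l)| \le \|\theta_{n+1} - \theta\| \le \sqrt{C_0}\,\rho_n \quad \text{a.s.}
\]
\[
  l \in H^{*}:\quad |\theta_{n+1}(l)| \le \sqrt{C_0}\,\rho_n = o(\alpha_n)
  \;\Longrightarrow\; \beta_{n+1}(l) = 0 \text{ for all large } n;
\]
\[
  l \notin H^{*}:\quad |\theta_{n+1}(l)| \ge c_{\min} - \sqrt{C_0}\,\rho_n > \alpha_n \text{ eventually}
  \;\Longrightarrow\; \beta_{n+1}(l) = \theta_{n+1}(l) \to \theta(l).
\]

Together the two cases give H_{N+1} = H* for all N ≥ N_0(ω).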
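
For implementation guidance, below is a minimal runnable Python sketch of the threshold-then-estimate scheme summarized above: recursive LS (Step 1) followed by hard thresholding at α_n (Step 2), with α_n chosen so that ρ_n = o(α_n) (Step 0). The function name, the diffuse prior P_0 = 10^6·I, and the particular admissible choice α_n = c·sqrt(ρ_n) are illustrative assumptions, not the paper's prescriptions.

import numpy as np

def sparse_rls_identify(Phi, y, c_alpha=1.0):
    """Recursive LS followed by hard thresholding; a sketch under the
    assumptions stated above, not the paper's Algorithm 1 verbatim."""
    n, d = Phi.shape
    P = 1e6 * np.eye(d)          # inverse information matrix (diffuse prior)
    theta = np.zeros(d)          # recursive LS estimate theta_{n+1}
    R = 1.0                      # R_n: 1 + accumulated regressor energy
    for t in range(n):
        phi = Phi[t]
        Pphi = P @ phi
        gain = Pphi / (1.0 + phi @ Pphi)             # RLS gain vector
        theta = theta + gain * (y[t] - phi @ theta)  # innovation update (Step 1)
        P = P - np.outer(gain, Pphi)
        R += phi @ phi
    # lambda_min^n: smallest eigenvalue of the information matrix sum_t phi_t phi_t^T
    lam_min = np.linalg.eigvalsh(np.linalg.inv(P)).min()
    rho = np.sqrt(np.log(R) / lam_min)  # LS error scale from Lemma 1
    alpha = c_alpha * np.sqrt(rho)      # one admissible choice with rho = o(alpha) (Step 0)
    beta = np.where(np.abs(theta) > alpha, theta, 0.0)  # hard threshold (Step 2)
    return theta, beta, alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    theta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])  # sparse truth
    Phi = rng.normal(size=(5000, theta_true.size))
    y = Phi @ theta_true + rng.normal(size=5000)
    _, beta, alpha = sparse_rls_identify(Phi, y)
    print("alpha_n =", alpha)
    print("recovered support:", np.nonzero(beta)[0])   # expect indices 0 and 2

On this synthetic example the LS error falls well below α_n on the zero coordinates, which are pruned, while the two nonzero coordinates survive, matching the dichotomy sketched above.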

Referee report (LaTeX)

\textbf{Recommendation:} minor revisions

\textbf{Journal Tier:} strong field

\textbf{Justification:}

The manuscript presents a simple yet theoretically solid sparse identification algorithm that avoids penalty terms, and it proves finite-time support recovery under very weak excitation together with almost-sure parameter convergence. The results are relevant to stochastic feedback systems, where standard LASSO assumptions are hard to meet. The methodology and proofs are correct; small clarifications would improve readability and implementation guidance.