arXiv:2505.00460

Subspace-Distance-Enabled Active Learning for Efficient Data-Driven Model Reduction of Parametric Dynamical Systems

Harshit Kapadia, Peter Benner, Lihong Feng

Status
Incomplete
Confidence
Medium
Category
Not specified
Journal tier
Specialist/Solid
Processed
Sep 28, 2025, 12:56 AM

Audit review

The paper’s Theorem 2.1 claims that D2 is a metric, but its triangle-inequality proof contains a crucial algebraic error: it identifies tr((P_X − P_Y)^2) with the Frobenius norm ‖P_X − P_Y‖_F rather than with the squared Frobenius norm ‖P_X − P_Y‖_F^2 (see their (2.9) and its subsequent use in (2.11)–(2.13)). This breaks the chain of inequalities used to deduce the triangle inequality for D2. By contrast, the model’s proof supplies a correct Minkowski-style argument in projector form that establishes the triangle inequality without that error.
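As a quick illustration of the algebraic point, the following is a hypothetical numerical check (not code from the paper). It assumes the distance is built from the Frobenius norm of the difference of orthogonal projectors; any normalization in the paper's exact definition of D2 is left out.

# Sanity check: tr((P_X - P_Y)^2) equals the *squared* Frobenius norm
# ||P_X - P_Y||_F^2 (not ||P_X - P_Y||_F), and the projector-Frobenius
# distance satisfies the triangle inequality even for subspaces of
# unequal dimensions.
import numpy as np

rng = np.random.default_rng(0)

def projector(dim_ambient, dim_subspace, rng):
    """Orthogonal projector onto a random dim_subspace-dimensional subspace."""
    A = rng.standard_normal((dim_ambient, dim_subspace))
    Q, _ = np.linalg.qr(A)   # orthonormal basis of the column span
    return Q @ Q.T           # P = Q Q^T

n = 20
PX = projector(n, 4, rng)    # subspaces of unequal dimensions
PY = projector(n, 7, rng)
PZ = projector(n, 5, rng)

D = PX - PY
print(np.trace(D @ D))                # tr((P_X - P_Y)^2)
print(np.linalg.norm(D, "fro") ** 2)  # ||P_X - P_Y||_F^2  -> same value
print(np.linalg.norm(D, "fro"))       # ||P_X - P_Y||_F    -> different value

# Triangle inequality for d(X, Y) = ||P_X - P_Y||_F, i.e. Minkowski's
# inequality applied to the projector differences:
dXZ = np.linalg.norm(PX - PZ, "fro")
dXY = np.linalg.norm(PX - PY, "fro")
dYZ = np.linalg.norm(PY - PZ, "fro")
assert dXZ <= dXY + dYZ + 1e-12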

Referee report (LaTeX)

\textbf{Recommendation:} major revisions

\textbf{Journal Tier:} specialist/solid

\textbf{Justification:}

The manuscript proposes a practical subspace distance $D_2$ that accommodates subspaces of unequal dimensions and integrates it into an active-learning ROM pipeline. However, the proof that $D_2$ is a metric contains a pivotal algebraic error in the transition from $\operatorname{tr}\big((P_X - P_Y)^2\big)$ to a Frobenius-norm term, which undermines the triangle-inequality argument. With a corrected proof (e.g., the provided projector-based derivation), the contribution becomes both correct and valuable.
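For concreteness, one corrected route to the triangle inequality is sketched below. It assumes that, up to the paper's normalization, $D_2$ is built from the Frobenius norm of the difference of the orthogonal projectors $P_X$ and $P_Y$ onto the subspaces being compared, with $P_Z$ denoting the projector onto a third subspace; the authors should adapt it to their exact definition. Since $P_X - P_Y$ is symmetric,
\[
\operatorname{tr}\!\big((P_X - P_Y)^2\big)
= \operatorname{tr}\!\big((P_X - P_Y)^{\mathsf{T}}(P_X - P_Y)\big)
= \|P_X - P_Y\|_F^2,
\]
i.e., the trace equals the \emph{squared} Frobenius norm, not the norm itself. The triangle inequality then follows from Minkowski's inequality for the Frobenius norm applied to the projector differences,
\[
\|P_X - P_Z\|_F = \|(P_X - P_Y) + (P_Y - P_Z)\|_F
\le \|P_X - P_Y\|_F + \|P_Y - P_Z\|_F,
\]
after which any scaling appearing in the definition of $D_2$ must be carried through this estimate.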