2209.03804

Kernel Methods for Regression in Continuous Time over Subsets and Manifolds

Nathan Powell, Jia Guo, Sai Tej Paruchuri, John Burns, Boone Estes, Andrew Kurdila

correct (medium confidence)
Category: Not specified
Journal tier: Specialist/Solid
Processed: Sep 28, 2025, 12:56 AM

Audit review

The paper derives the same two-term PE-based error bound for the projected/Galerkin regressor on a subspace (H_S in the paper) that the candidate solution obtains for a general closed subspace V. Both follow the same steps: (i) the projected normal equations, (ii) subtracting the identity for Π_V G to obtain an error equation, (iii) a PE-induced lower spectral bound yielding a bounded-inverse estimate, and (iv) the operator-norm bound ‖T_φ(0,t)‖ ≤ K̄² ν([0,t]), culminating in the identical inequality when ν is Lebesgue and t = mΔ. The paper additionally contrasts the unprojected estimator in H (which produces a third “drift” term) with the projected one, matching the candidate’s explanation of how the projection removes the drift. The only minor differences are notational, together with the paper’s explicit discussion of when PE on H_S forces H_S to be finite dimensional.
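
To make the chain of estimates concrete, a hedged LaTeX sketch in the review’s notation follows; the estimator error $\hat{f}_t - \Pi_V f$, the residual $r_t$, and the PE constant $\beta$ are illustrative stand-ins introduced here (only $T_\varphi(0,t)$, $\bar{K}$, $\nu$, and $t = m\Delta$ appear in the review itself), so the displays indicate the shape of the argument rather than the paper’s exact statements.

% Steps (i)-(ii): projected normal equations rearranged into an error equation
\[
\Pi_V T_\varphi(0,t)\,\bigl(\hat{f}_t - \Pi_V f\bigr) \;=\; r_t .
\]
% Step (iii): persistency of excitation on V gives a lower spectral bound,
% hence a bounded inverse on V (beta > 0 is determined by the PE condition)
\[
\bigl\langle \Pi_V T_\varphi(0,t)\, g,\, g \bigr\rangle \;\ge\; \beta \,\|g\|^2
\quad \text{for all } g \in V
\;\;\Longrightarrow\;\;
\bigl\|\bigl(\Pi_V T_\varphi(0,t)\big|_V\bigr)^{-1}\bigr\| \;\le\; \tfrac{1}{\beta} .
\]
% Step (iv): operator-norm bound on the integral operator, specialized to
% Lebesgue measure and t = m*Delta
\[
\|T_\varphi(0,t)\| \;\le\; \bar{K}^2\, \nu([0,t]) \;=\; \bar{K}^2\, m\Delta .
\]

Applying the bounded inverse from (iii) to the error equation and estimating $r_t$ with the norm bound from (iv) yields the two-term inequality both derivations reach; for the unprojected estimator in H the error equation retains a component orthogonal to H_S, which is the third “drift” term the paper isolates.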

Referee report (LaTeX)

\textbf{Recommendation:} no revision

\textbf{Journal Tier:} specialist/solid

\textbf{Justification:}

The manuscript’s PE-based analysis for continuous-time RKHS regression is rigorous, well-scoped, and clearly demonstrates the role of projection in controlling drift. The derivations are standard but carefully executed, and the results should be useful to researchers working at the intersection of adaptive estimation, control, and kernel methods. The candidate solution independently reproduces a central bound with essentially the same proof strategy, corroborating the paper’s correctness.