2507.08738
Adaptive Nonlinear Vector Autoregression: Robust Forecasting for Noisy Chaotic Time Series
Azimov Sherkhon, Susana López-Moreno, Eric Dolores-Cuenca, Sieun Lee, Sangil Kim
correct (medium confidence)
- Category: math.DS
- Journal tier: Specialist/Solid
- Processed: Sep 28, 2025, 12:56 AM
- arXiv Links: Abstract ↗ · PDF ↗
Audit review
Both the paper and the candidate solution argue that the adaptive NVAR (with a shallow tanh MLP replacing the fixed polynomial map) can match the expressivity of a standard NVAR. The paper appeals to a density result (Leshno et al. 1993) in L^p(μ) to claim that the MLP can approximate the polynomial feature vector, hence any NVAR mapping, and thus any system approximable by NVAR (stated succinctly in Section 5.1). The candidate solution gives a more explicit, uniform-on-compacts construction: it stacks scalar approximants into a vector approximation of the quadratic feature map and then propagates the error through a fixed linear readout, thereby supplying the missing details and slightly stronger norms. Both are therefore correct; the candidate's proof is more complete and precise, but substantively aligned with the paper's claim.
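For concreteness, here is a minimal sketch of the uniform-on-compacts argument described above; the symbols (φ, f_θ, W, K, m) are introduced for illustration and need not match the paper's notation.

```latex
% Illustrative sketch: NVAR forecast \hat{y}_{t+1} = W \phi(x_t), where \phi is the
% fixed quadratic feature map over delayed states and W is the trained linear readout.
\begin{align*}
  &\text{Fix a compact } K \subset \mathbb{R}^{d} \text{ and } \varepsilon > 0.
    \text{ By Leshno et al.\ (1993), applied coordinate-wise and stacked,} \\
  &\text{there is a shallow tanh MLP } f_{\theta}\colon \mathbb{R}^{d}\to\mathbb{R}^{m}
    \text{ with } \sup_{x \in K}\lVert f_{\theta}(x) - \phi(x)\rVert_{\infty} \le \varepsilon, \\
  &\text{and hence }
    \sup_{x \in K}\lVert W f_{\theta}(x) - W \phi(x)\rVert_{2}
    \;\le\; \lVert W \rVert_{\mathrm{op}}\,\sqrt{m}\,\varepsilon .
\end{align*}
```

Letting ε → 0 shows the adaptive NVAR can reproduce any standard NVAR forecast map to arbitrary uniform accuracy on compacts, which is the expressivity claim both texts rely on.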
Referee report (LaTeX)
\textbf{Recommendation:} minor revisions

\textbf{Journal Tier:} specialist/solid

\textbf{Justification:} The work proposes a practical and empirically strong modification of NVAR, replacing the fixed polynomial features with a learnable shallow MLP trained end-to-end. The theoretical claim of expressivity equivalence is valid in spirit but presented tersely; adding explicit statements, norms, and a short proof sketch would solidify correctness. With these clarifications, the paper would be a solid contribution for practitioners and researchers working on forecasting chaotic and noisy dynamical systems.
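As a complement to the review, the following is a minimal PyTorch sketch of the adaptive-NVAR idea as summarized above: a shallow tanh feature map plus a linear readout, trained end-to-end on one-step-ahead prediction. Class names, hyperparameters, the delay embedding, and the training loop are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class AdaptiveNVAR(nn.Module):
    """Illustrative adaptive NVAR: a learnable shallow tanh MLP replaces the
    fixed polynomial feature map of standard NVAR; a linear readout maps the
    learned features to the one-step-ahead forecast."""

    def __init__(self, state_dim: int, n_delays: int, hidden: int):
        super().__init__()
        in_dim = state_dim * n_delays          # concatenated delayed states
        self.features = nn.Sequential(         # learnable feature map f_theta
            nn.Linear(in_dim, hidden),
            nn.Tanh(),
        )
        self.readout = nn.Linear(hidden, state_dim)  # linear readout W

    def forward(self, delayed: torch.Tensor) -> torch.Tensor:
        # delayed: (batch, state_dim * n_delays) -> forecast: (batch, state_dim)
        return self.readout(self.features(delayed))


if __name__ == "__main__":
    torch.manual_seed(0)
    model = AdaptiveNVAR(state_dim=3, n_delays=2, hidden=64)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Placeholder data standing in for delay-embedded states and their successors.
    x_delayed = torch.randn(512, 6)   # (N, state_dim * n_delays)
    y_next = torch.randn(512, 3)      # (N, state_dim)

    # End-to-end training on one-step-ahead prediction.
    for _ in range(200):
        opt.zero_grad()
        loss = loss_fn(model(x_delayed), y_next)
        loss.backward()
        opt.step()
```

In contrast, a standard NVAR would fix the feature map (e.g., quadratic monomials of the delayed states) and fit only the readout; here both stages are optimized jointly, which is the modification the report evaluates.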