arXiv:2205.00099

A New Least Squares Parameter Estimator for Nonlinear Regression Equations with Relaxed Excitation Conditions and Forgetting Factor

Romeo Ortega, Jose Guadalupe Romero, Stanislav Aranovskiy

Verdict: correct (medium confidence)
Category: Not specified
Journal tier: Strong Field
Processed: Sep 28, 2025, 12:56 AM

Audit review

Both the paper and the candidate solution derive the key interlaced identity Y(t) = Δ(t) G(θ) from the least-squares (LS) dynamics and use strong monotonicity (Assumption A1) plus interval excitation (IE, Assumption A2) to obtain exponential decay of the parameter error. The candidate gives a slightly sharper, more explicit argument: a closed form for F^{-1} establishing its invertibility, the explicit factorization H(t) = αA(t)[f_0 I + αA(t)]^{-1}, monotonicity of Δ(t), and a post-transient lower bound Δ(t) ≥ δ > 0 that yields an explicit convergence rate. The paper's proof follows the same mechanism, passing from the LS identity (10) to the extended nonlinearly parameterized regression equation (NLPRE) (11), then to the scalar regressor (12) and a Lyapunov decay argument; it shows that Δ(t) is persistently exciting (PE) under IE, which suffices for exponential convergence, although it is less explicit about δ and about the monotonicity of Δ. Hence both are correct and rest on the same core argument, with the candidate supplying additional details and constants; Proposition 1 and its proof (equations (8)–(12)) and the PE discussion around (9)–(11) in the paper confirm this structure.
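For concreteness, the mechanism both proofs rely on can be summarized in display form. This is only a sketch: $\tilde{\theta}$ denotes the parameter estimation error, and the symbols $\kappa$, $\gamma$, and the window endpoint $t_c$ are illustrative placeholders rather than quantities quoted from the paper.

\[
  Y(t) = \Delta(t)\, G(\theta),
  \qquad
  H(t) = \alpha A(t)\,\bigl[f_0 I + \alpha A(t)\bigr]^{-1},
\]
so that, under A1 (strong monotonicity) and A2 (interval excitation on $[0, t_c]$, giving $\Delta(t) \ge \delta > 0$ for $t \ge t_c$), the scalar-regressor Lyapunov argument produces a bound of the form
\[
  |\tilde{\theta}(t)| \le \kappa\, e^{-\gamma (t - t_c)}\, |\tilde{\theta}(t_c)|,
  \qquad t \ge t_c,
\]
with $\kappa, \gamma > 0$ determined by $\delta$, the monotonicity constant of A1, and the estimator gains.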

Referee report (LaTeX)

\textbf{Recommendation:} minor revisions

\textbf{Journal Tier:} strong field

\textbf{Justification:}

The estimator and its analysis are correct and relevant. The proof path, which reduces the least-squares identity to an extended scalar regressor and then applies a Lyapunov argument under interval excitation, is sound. Clarity would be further improved by stating an explicit factorization for H(t), giving a clear lower bound on Δ(t) after the excitation window, and disambiguating the notation; a sketch of such a statement is given below. The work has solid novelty and notable practical implications for adaptive estimation without persistent excitation.
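As a concrete form of the requested clarification, the following LaTeX fragment sketches the kind of statement the revision could make explicit. The excitation-window endpoint $t_c$ and the constant $\delta$ are placeholders drawn from the candidate's argument, and the monotonicity claim follows that argument rather than the paper's text.

\textbf{Suggested remark (illustrative only):} Under A2 with excitation window $[0, t_c]$, the scalar gain $\Delta(t)$ is monotone and satisfies $\Delta(t) \ge \Delta(t_c) =: \delta > 0$ for all $t \ge t_c$. Consequently, the extended regression $Y(t) = \Delta(t)\, G(\theta)$ is uniformly non-degenerate after $t_c$, and the Lyapunov analysis yields an exponential convergence rate expressible in terms of $\delta$ and the strong-monotonicity constant of A1.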