2507.11095
Performance Enhancement of the Recursive Least Squares Algorithms with Rank Two Updates
Alexander Stotsky
correct (medium confidence)
- Category: Not specified
- Journal tier: Specialist/Solid
- Processed: Sep 28, 2025, 12:56 AM
- arXiv links: Abstract ↗ · PDF ↗
Audit review
The paper states the RLSR2 updates ($A_k = \lambda A_{k-1} + Q_k D Q_k^T$ together with the $\Gamma_k$ and $\theta_k$ recursions) and then asserts the error model $E_k = (I - \Gamma_{k-1} Q_k S^{-1} Q_k^T) E_{k-1}$ and $\tilde\theta_k = (I - \Gamma_{k-1} Q_k S^{-1} Q_k^T) \tilde\theta_{k-1}$, followed by the one-step Lyapunov identity $V_k - V_{k-1} = -\lambda \tilde\theta_{k-1}^T Q_k S^{-1} Q_k^T \tilde\theta_{k-1} - (1-\lambda) V_{k-1}$ and the bound $V_k \le \lambda^k V_0$ under a mild nonnegativity condition. These are exactly the statements the model proves, using the same update equations and the same assumption $E_k = 0$ for the Lyapunov step; the model simply fills in the missing algebra via identities implied by $S = \lambda D + Q_k^T \Gamma_{k-1} Q_k$. See the paper's equations (3)–(5) and (10)–(11), the stated error model (12)–(13), and the one-step identity (14) for confirmation.
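For concreteness, a minimal numerical check of the two claims audited above is sketched below. Everything paper-specific in it is an assumption rather than a quotation: a sliding-window instantiation with $D = \mathrm{diag}(1,-1)$ (newest regressor added, oldest deleted, with the window weight $\lambda^w$ folded into the second column of $Q_k$), the inversion-lemma recursion $\Gamma_k = \lambda^{-1}(I - \Gamma_{k-1} Q_k S^{-1} Q_k^T)\Gamma_{k-1}$, the inversion error $E_k = I - \Gamma_k A_k$, and the Lyapunov function $V_k = \tilde\theta_k^T A_k \tilde\theta_k$. The algebraic crux is the identity $Q_k^T (I - \Gamma_{k-1} Q_k S^{-1} Q_k^T) = \lambda D S^{-1} Q_k^T$, which follows directly from $S = \lambda D + Q_k^T \Gamma_{k-1} Q_k$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, w, lam = 4, 8, 0.95              # dimension, window length, forgetting factor (all assumed)
D = np.diag([1.0, -1.0])            # assumed rank-two weights: add newest, delete oldest regressor

# A_{k-1} starts as an exponentially weighted window plus a regularizer, so it stays invertible
buf = [rng.standard_normal(n) for _ in range(w)]
A = np.eye(n) + sum(lam ** (w - 1 - j) * np.outer(q, q) for j, q in enumerate(buf))
theta = rng.standard_normal(n)      # parameter error ~theta_{k-1}
Gi = np.linalg.inv(A) + 1e-4 * rng.standard_normal((n, n))   # deliberately inexact Gamma_{k-1}
Gi = 0.5 * (Gi + Gi.T)

for k in range(100):
    q_new = rng.standard_normal(n)
    q_old = buf.pop(0)
    buf.append(q_new)
    # fold lam^w into the deleted column: Q D Q^T = q_new q_new^T - lam^w q_old q_old^T
    Q = np.column_stack([q_new, lam ** (w / 2) * q_old])
    A_new = lam * A + Q @ D @ Q.T   # A_k = lam * A_{k-1} + Q_k D Q_k^T

    # --- one-step Lyapunov identity, with the exact inverse (the E_{k-1} = 0 case) ---
    G = np.linalg.inv(A)
    S = lam * D + Q.T @ G @ Q                                 # S = lam * D + Q^T Gamma_{k-1} Q
    F = np.eye(n) - G @ Q @ np.linalg.solve(S, Q.T)           # I - Gamma_{k-1} Q S^{-1} Q^T
    theta_new = F @ theta                                     # stated error model for ~theta_k
    V0, V1 = theta @ A @ theta, theta_new @ A_new @ theta_new
    rhs = -lam * (theta @ Q) @ np.linalg.solve(S, Q.T @ theta) - (1 - lam) * V0
    assert abs((V1 - V0) - rhs) <= 1e-8 * max(1.0, abs(V0), abs(V1))

    # --- error model, propagating an inexact Gamma by the assumed recursion ---
    Si = lam * D + Q.T @ Gi @ Q
    Fi = np.eye(n) - Gi @ Q @ np.linalg.solve(Si, Q.T)
    Gi_new = Fi @ Gi / lam                                    # Gamma_k = (1/lam) Fi Gamma_{k-1}
    E_old = np.eye(n) - Gi @ A                                # assumed error definition E = I - Gamma A
    E_new = np.eye(n) - Gi_new @ A_new
    assert np.allclose(E_new, Fi @ E_old, rtol=1e-8, atol=1e-10)

    A, theta, Gi = A_new, theta_new, Gi_new

print("error model and one-step Lyapunov identity verified over 100 steps")
```

Under this instantiation both asserts hold to machine precision, and the bound $V_k \le \lambda^k V_0$ then follows from the one-step identity whenever $\tilde\theta_{k-1}^T Q_k S^{-1} Q_k^T \tilde\theta_{k-1} \ge 0$, which is the nonnegativity condition the paper invokes.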
Referee report (LaTeX)
\textbf{Recommendation:} minor revisions

\textbf{Journal tier:} specialist/solid

\textbf{Justification:} The manuscript presents a compact, well-motivated RLSR2 framework that blends exponential and instantaneous forgetting and connects it to standard RLS, Richardson, and Newton–Schulz methods. The main algorithmic statements and claimed properties (the error recursions and the Lyapunov identity) check out, and the numerical motivation is sound. The chief limitation is that some derivations are only sketched or deferred to an extended version; modest expansion to include the key algebra and to state the technical conditions (invertibility of $S$ and the nonnegativity condition for the decay bound) explicitly would significantly strengthen the paper.