2209.15190

Neural Integral Equations

Emanuele Zappala, Antonio Henrique de Oliveira Fonseca, Josue Ortega Caro, David van Dijk

incomplete
medium confidence
Category
math.DS
Journal tier
Strong Field
Processed
Sep 28, 2025, 12:56 AM

Audit review

The paper clearly states the NIE/ANIE integral-equation form y(t)=f(t)+∫_{α(t)}^{β(t)}G(y,t,s)ds and its specialization y(t)=f(t)+∫ K(t,s)F(y(s))ds, and notes that the choice of α,β recovers the Fredholm (α=a, β=b) and Volterra (α=a, β=t) cases. It also describes an iterative solver, points to standard existence/uniqueness results via fixed-point theorems, and interprets self-attention as a quadrature approximation of the integral. However, it does not supply a complete, self-contained proof of a contraction regime, of mapping properties on L^∞, or of a rigorous mask-to-integral-limits convergence result; these are only referenced or discussed at a high level. By contrast, the candidate solution provides a correct Banach fixed-point argument on X=L^∞ with explicit hypotheses (L_f, L_F, M) yielding a contraction when L_f+M·L_F<1, and a clean Riemann/consistent-quadrature demonstration that masked attention sums converge to Fredholm/Volterra integrals as the grid densifies. Hence, for the posed tasks, the paper is incomplete while the model's solution is correct. See the paper's formulation of the operator and its Fredholm/Volterra limits, and its reliance on external theorems and the attention–Nyström connection for integration via self-attention (e.g., definitions and α,β choices; existence/uniqueness via Schauder/Tychonoff; solver iteration; and self-attention ≈ Nyström/kernel quadrature).
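
For reference, a minimal sketch of the contraction bound credited to the candidate solution, written out under the hypotheses named above (the operator form, and the assumption that f is Lipschitz in y with constant L_f, are illustrative; if f does not depend on y, take L_f = 0):

\[
(Ty)(t) = f\big(y(t), t\big) + \int_{\alpha(t)}^{\beta(t)} K(t,s)\, F\big(y(s)\big)\, ds, \qquad y \in X = L^\infty,
\]
with $|f(u,t)-f(v,t)| \le L_f |u-v|$, $|F(u)-F(v)| \le L_F |u-v|$, and $\sup_t \int_{\alpha(t)}^{\beta(t)} |K(t,s)|\, ds \le M$. Then
\[
\|Ty - Tz\|_\infty \le \big(L_f + M L_F\big)\, \|y - z\|_\infty,
\]
so $T$ is a contraction on $L^\infty$ whenever $L_f + M L_F < 1$, and the Banach fixed-point theorem gives a unique solution reachable by Picard iteration $y_{k+1} = T y_k$.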

Referee report (LaTeX)

\textbf{Recommendation:} major revisions

\textbf{Journal Tier:} strong field

\textbf{Justification:}

The manuscript proposes a novel and practically valuable framework that learns dynamics via integral equations and leverages attention as an integration mechanism. The empirical section is strong. However, the theoretical aspects central to the solver's correctness and stability are only referenced or asserted at a high level. Providing in-paper theorems (with hypotheses and proofs) for a contraction regime and for attention-as-quadrature convergence would substantially strengthen the paper's rigor and readability.
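
For concreteness, an illustrative (not prescriptive) form of the quadrature statement the authors could include: on a grid $a = s_1 < \dots < s_n = b$ with mesh tending to zero, weights $w_j$ of a consistent quadrature rule, and a causal mask $m(t, s_j) = \mathbf{1}\{s_j \le t\}$,
\[
\sum_{j=1}^{n} m(t, s_j)\, K(t, s_j)\, F\big(y(s_j)\big)\, w_j \;\longrightarrow\; \int_{a}^{t} K(t,s)\, F\big(y(s)\big)\, ds,
\]
with the unmasked sum ($m \equiv 1$) converging to the Fredholm integral $\int_a^b K(t,s)\,F(y(s))\,ds$, provided the integrand is Riemann integrable in $s$. Stating and proving such a result, together with an explicit contraction regime for the solver, would address the main rigor gap.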