2408.12540
Neural Fields and Noise-Induced Patterns in Neurons on Large Disordered Networks
Daniele Avitabile, James MacLaurin
correct (medium confidence)
- Category
- Not specified
- Journal tier
- Strong Field
- Processed
- Sep 28, 2025, 12:56 AM
- arXiv Links
- Abstract ↗ · PDF ↗
Audit review
The paper proves almost-sure convergence of empirical measures and identifies a Gaussian mean-field limit via a large-deviation and contraction-mapping route; the candidate solution proves the same statements under the paper’s standing assumptions by a coupling-plus-concentration approach. The core statements (a.s. convergence in Y_T and the Gaussian limiting marginal with (m, V) solving the closed system) match the paper’s Theorem 3.9 and Lemma 3.10. The two arguments differ in proof technique and in that the candidate solution supplies an explicit finite-n coupling bound, whereas the paper claims no rates but establishes a stronger LDP. No fatal gaps were found in either argument; some parts of the candidate proof would benefit from expanded sensitivity/Lipschitz estimates, though these are standard under the hypotheses.
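For orientation, here is a minimal sketch of the kind of closed mean-variance system referred to above. The firing-rate nonlinearity f, connectivity kernel w, noise intensity σ, and the exact form of the equations are illustrative assumptions on my part, not the paper's precise system (for that, see its Theorem 3.9 and Lemma 3.10).

```latex
% Illustrative closed (m, V) system for a Gaussian mean-field limit.
% Hedged sketch: f, w, sigma and the exact form are assumptions,
% not the paper's Theorem 3.9 / Lemma 3.10 verbatim.
\begin{align}
  \partial_t m(x,t) &= -m(x,t)
    + \int_{\Omega} w(x,y)\,
      \mathbb{E}\!\left[ f\big(U(y,t)\big) \right] \mathrm{d}y,
    \qquad U(y,t) \sim \mathcal{N}\big(m(y,t),\, V(y,t)\big), \\
  \partial_t V(x,t) &= -2\,V(x,t) + \sigma^{2}.
\end{align}
```

The point of the Gaussian characterization is that it closes the hierarchy: the expectation $\mathbb{E}[f(U(y,t))]$ depends on the law of $U(y,t)$ only through the pair $(m(y,t), V(y,t))$, so $(m, V)$ evolves autonomously.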
Referee report (LaTeX)
\textbf{Recommendation:} minor revisions

\textbf{Journal Tier:} strong field

\textbf{Justification:} The paper gives a rigorous and conceptually clean derivation of an a.s. mean-field limit and Gaussian characterization for spatially structured, disordered neural networks with additive noise, by combining exponential equivalence, the Sanov LDP, and a Lipschitz transformation of empirical measures. The assumptions are reasonable within the stated scope, and the arguments are correct. Minor edits would improve readability and clarify the connectivity hypothesis and the mapping construction.
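To make the proof route named in the justification concrete, the following are the two generic ingredients (Sanov's theorem and the contraction principle) in their standard form; how they specialize to the paper's disordered-network setting is an assumption on my part, not a quotation from the paper.

```latex
% Generic LDP ingredients behind the "Sanov + Lipschitz transformation" route.
% Sanov: for i.i.d. samples X_1, ..., X_n ~ mu, the empirical measures
%   L_n = (1/n) \sum_i \delta_{X_i}
% satisfy an LDP with good rate function the relative entropy H(. || mu).
% Contraction principle: if Phi is continuous (here: Lipschitz), then
% Phi(L_n) satisfies an LDP with the pushed-forward rate function J.
\begin{align}
  \limsup_{n\to\infty} \tfrac{1}{n}\log
    \mathbb{P}\big( L_n \in C \big)
    &\le -\inf_{\pi \in C} H(\pi \,\|\, \mu)
    \quad \text{for closed } C, \\
  J(\nu) &= \inf\big\{\, H(\pi \,\|\, \mu) : \Phi(\pi) = \nu \,\big\}.
\end{align}
```

An LDP with a good rate function yields, via Borel–Cantelli, almost-sure convergence of $\Phi(L_n)$ to the zero set of $J$; when that zero set is a singleton, this is exactly the a.s. mean-field limit.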