2407.12293
Multi evolutional deep neural networks (Multi-EDNN)
Hadden Kim, Tamer A. Zaki
correct (medium confidence)
- Category: Not specified
- Journal tier: Strong field
- Processed: Sep 28, 2025, 12:56 AM
- arXiv links: Abstract ↗ | PDF ↗
Audit review
The paper’s methodology defines solution and flux corrections that (i) make the assembled solution continuous across interfaces and (ii) enforce common interface-normal fluxes, with the auxiliary variable equal to the gradient of the corrected solution; it also introduces 1D monomial correction functions and states the C-EDNN cost reduction. The candidate solution re-derives these results with explicit proofs (e.g., differentiating under the boundary integral and showing interface-flux matching and global conservation), and verifies the properties of the monomial family and the complexity scaling. Aside from a minor boundary-condition typo in the paper’s 1D cardinality statement, the two are consistent, with the model offering a more formal treatment. Key steps, definitions, and formulas align with the paper’s equations for the corrections and the flux splitting, as well as with its complexity claims.
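For concreteness, a minimal sketch of a flux-reconstruction-style solution correction of the kind described above, written in assumed 1D notation; the element solution $u_e$, common interface values $u^I_L$, $u^I_R$, and correction functions $g_L$, $g_R$ are illustrative symbols rather than the paper’s own:
\[
\tilde{u}_e(x) = u_e(x) + \bigl[u^I_L - u_e(x_L)\bigr]\, g_L(x) + \bigl[u^I_R - u_e(x_R)\bigr]\, g_R(x),
\qquad
g_L(x_L) = g_R(x_R) = 1, \quad g_L(x_R) = g_R(x_L) = 0,
\]
so that $\tilde{u}_e(x_L) = u^I_L$ and $\tilde{u}_e(x_R) = u^I_R$, and neighboring elements sharing the same interface values assemble into a continuous global solution. On a reference interval $\xi \in [0,1]$, monomials such as $g_R(\xi) = \xi^p$ and $g_L(\xi) = (1-\xi)^p$ satisfy these endpoint conditions; they illustrate, without claiming to reproduce, the 1D monomial family mentioned above.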
Referee report (LaTeX)
\textbf{Recommendation:} minor revisions \\
\textbf{Journal Tier:} strong field \\
\textbf{Justification:} The paper develops a coherent Multi-EDNN approach that coordinates multiple networks through interface corrections inspired by flux reconstruction. Its constructions are sound and demonstrated across canonical PDEs. Minor revisions (clarifying the boundary conditions for the flux kernel, adding a concise conservation proof, and fixing a small endpoint typo) would strengthen rigor and clarity without changing the core contributions.
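For reference, a sketch of the kind of concise conservation argument recommended above, under the assumption (illustrative, not the paper’s notation) that each corrected element equation takes divergence form $\partial_t \tilde{u}_e = -\nabla \cdot \tilde{f}_e$ and that the corrected normal flux on every shared face equals a single common interface flux:
\[
\frac{d}{dt} \sum_e \int_{\Omega_e} \tilde{u}_e \, dV
= -\sum_e \oint_{\partial \Omega_e} \tilde{f}_e \cdot n \, dS
= -\oint_{\partial \Omega} \tilde{f} \cdot n \, dS,
\]
because the two contributions from each interior face carry the same interface-normal flux with opposite outward normals and cancel pairwise, leaving only the physical boundary term; the assembled scheme therefore conserves $\int_\Omega \tilde{u} \, dV$ up to boundary fluxes.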