2206.12402
Predicting the Stability of Hierarchical Triple Systems with Convolutional Neural Networks
Florian Lalande, Alessandro Alberto Trani
correct (medium confidence)
- Category: Not specified
- Journal tier: Specialist/Solid
- Processed: Sep 28, 2025, 12:56 AM
- arXiv Links: Abstract ↗ · PDF ↗
Audit review
The paper (arXiv:2206.12402) explicitly demonstrates a CNN trained on only the first 0.5% of each simulation (up to 5×10^5 inner orbital periods) that predicts the stability of hierarchical triples, reaching held-out test AUCs of 0.958 (architecture A) and 0.956 (architecture B). Labels are defined by a >15% change in the semi-major axis ratio a1/a2 before t_f = 10^8 P_1. This matches the target setup and performance, so the claim was not open at the cutoff. The candidate solution asserts that the result was likely open as of 2022-06-24, which is contradicted by the paper's documented methods and results.
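To make the labeling criterion and metric cited above concrete, here is a minimal sketch, assuming a simulated time series of the semi-major axis ratio a1/a2 sampled up to t_f = 10^8 P_1. The function names (`stability_label`, `evaluate_auc`), the array layout, and the use of NumPy/scikit-learn are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.metrics import roc_auc_score


def stability_label(a1_over_a2: np.ndarray, threshold: float = 0.15) -> int:
    """Assumed form of the labeling rule: label a triple unstable (1) if
    a1/a2 deviates from its initial value by more than 15% at any time
    before t_f = 1e8 inner orbital periods, otherwise stable (0)."""
    initial = a1_over_a2[0]
    relative_change = np.abs(a1_over_a2 - initial) / initial
    return int(np.any(relative_change > threshold))


def evaluate_auc(y_true: np.ndarray, y_prob: np.ndarray) -> float:
    """Held-out ROC AUC, the performance metric quoted in the audit
    (0.958 for architecture A, 0.956 for architecture B)."""
    return roc_auc_score(y_true, y_prob)
```

In this reading, the CNN only ever sees the first 0.5% of each time series, while the label is computed from the full integration; the AUC then measures how well those early-time inputs separate the two label classes.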
Referee report (LaTeX)
\textbf{Recommendation:} Minor revisions

\textbf{Journal Tier:} Specialist/solid

\textbf{Justification:} This large-scale, carefully executed study demonstrates that CNNs trained on the early-time evolution of hierarchical triples can predict their long-term stability with strong performance, providing a valuable surrogate for expensive N-body integrations. The approach is well motivated, the dataset is substantial, and the ablations are informative. Clarifications on selection effects, label sensitivity, probability calibration, and generalization beyond equal-mass Newtonian triples would strengthen the work.
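As a concrete illustration of the calibration check the report requests, the hedged Python sketch below shows one standard reliability analysis on held-out probabilities. The synthetic `y_true`/`y_prob` arrays and the use of scikit-learn's `calibration_curve` are assumptions for demonstration only, not part of the paper or the report.

```python
import numpy as np
from sklearn.calibration import calibration_curve

# Hypothetical held-out labels and predicted instability probabilities.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=1000)
y_prob = np.clip(y_true * 0.7 + rng.normal(0.2, 0.2, size=1000), 0.0, 1.0)

# Reliability diagram data: mean predicted probability vs. observed
# fraction of unstable systems in each probability bin. A well-calibrated
# classifier lies close to the diagonal.
frac_positive, mean_predicted = calibration_curve(y_true, y_prob, n_bins=10)
for p, f in zip(mean_predicted, frac_positive):
    print(f"mean predicted {p:.2f} -> observed frequency {f:.2f}")
```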