Recent fusion breakeven [Abu-Shawareb et al., Phys. Rev. Lett. 132, 065102 (2024)] at the National Ignition Facility (NIF) motivates an integrated approach to the analysis of data from multiple diagnostics. Deep neural networks provide a seamless framework for multi-modal data fusion, automated data analysis, optimization, and uncertainty quantification [Wang et al., arXiv:2401.08390 (2024)]. Here, we summarize several neural network methods for x-ray and neutron imaging data from NIF. To compensate for the small experimental datasets, both model-based, physics-informed synthetic data generation and deep neural network methods, such as generative adversarial networks, have been implemented successfully to enable a variety of automated workflows in x-ray and neutron image processing. We highlight results in noise emulation, contour analysis for low-mode asymmetry, denoising, and super-resolution. Further advances in integrated multi-modal imaging, together with experimental validation and uncertainty quantification, will support ongoing experimental optimization at NIF, as well as the maturation of alternate inertial confinement fusion (ICF) platforms such as double shells.
© 2024 Author(s). Published under an exclusive license by AIP Publishing.
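
The physics-informed synthetic data generation and noise emulation mentioned above can be illustrated with a minimal sketch: a hypothetical elliptical hotspot emission model with a low-mode perturbation, followed by Poisson photon statistics and Gaussian readout noise as a stand-in for detector noise emulation. The emission profile, noise model, and parameter values below are illustrative assumptions, not the pipeline used at NIF.

```python
import numpy as np

# Illustrative sketch of physics-informed synthetic data generation with
# noise emulation for ICF-like x-ray self-emission images. The hotspot
# model and noise parameters are assumptions for demonstration only.

def synthetic_hotspot(size=128, amplitude=1.0e4, sigma=18.0, p2=0.15):
    """Generate an idealized hotspot image: a radial Gaussian profile
    perturbed by an m=2 (low-mode) asymmetry of relative amplitude `p2`."""
    y, x = np.mgrid[0:size, 0:size]
    cx = cy = (size - 1) / 2.0
    r = np.hypot(x - cx, y - cy)
    theta = np.arctan2(y - cy, x - cx)
    # Low-mode distortion of the effective hotspot radius
    sigma_eff = sigma * (1.0 + p2 * np.cos(2.0 * theta))
    return amplitude * np.exp(-0.5 * (r / sigma_eff) ** 2)

def emulate_detector_noise(image, gain=1.0, read_noise=20.0, seed=0):
    """Apply Poisson photon statistics plus Gaussian readout noise."""
    rng = np.random.default_rng(seed)
    photon_counts = rng.poisson(np.clip(image / gain, 0, None)) * gain
    return photon_counts + rng.normal(0.0, read_noise, size=image.shape)

if __name__ == "__main__":
    clean = synthetic_hotspot(p2=0.15)       # ground-truth synthetic frame
    noisy = emulate_detector_noise(clean)    # training input for denoising / GAN models
    background = noisy[clean < 0.01 * clean.max()]
    print("approximate peak SNR:", clean.max() / background.std())
```

Pairs of such clean and noise-emulated frames are the kind of training data that denoising or super-resolution networks would consume; the specific architectures used in the paper are not reproduced here.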