Journal of Radioanalytical and Nuclear Chemistry, vol. 334, no. 10, pp. 7231-7253, 2025 (SCI-Expanded, Scopus)
This study introduces a novel, physics-informed, and calibration-friendly hybrid machine learning framework for the rapid and accurate prediction of the Full Energy Peak (FEP) efficiency in High-Purity Germanium (HPGe) detectors. To overcome the limitations of conventional “black-box” models, our two-stage approach first represents the FEP efficiency curve with a physically interpretable logarithmic polynomial; machine learning models are then trained to predict the polynomial coefficients directly from the detector's geometric parameters, using a comprehensive dataset generated via Monte Carlo simulations. Among the algorithms tested, the Generalized Linear Model performed best, achieving R² values of 0.975–0.992 for the coefficients. While raw model predictions showed the expected variability, a key feature of our framework, a single-point calibration protocol using one known efficiency value, dramatically improved accuracy, reducing the mean absolute percentage error by an average of 80% on independent test data. The calibrated model was further validated with an experimental detector. The entire framework was deployed as a user-friendly online platform, enabling researchers to generate and calibrate efficiency curves instantly. Our work harmonizes interpretability, speed, and accuracy, offering a powerful tool for the design, optimization, and routine calibration of HPGe detectors.
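For context, the logarithmic polynomial parametrization commonly used for HPGe FEP efficiency curves has the form below; the abstract does not state the polynomial degree or logarithm base used in this work, so the degree n is left general:

\ln \varepsilon(E) = \sum_{i=0}^{n} a_i \left[ \ln E \right]^{i}

where \varepsilon(E) is the FEP efficiency at gamma-ray energy E and the a_i are the coefficients predicted by the machine learning stage.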
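A minimal sketch of how such a single-point calibration could be applied, assuming a multiplicative rescaling of the predicted curve at one reference energy; the function names, the rescaling rule, and the numerical values are illustrative assumptions, not the published protocol:

```python
import numpy as np

def predicted_efficiency(energy_keV, coeffs):
    """Evaluate a log-polynomial efficiency curve:
    ln(eff) = sum_i a_i * (ln E)^i  (assumed form)."""
    log_e = np.log(energy_keV)
    return np.exp(sum(a * log_e**i for i, a in enumerate(coeffs)))

def single_point_calibrate(coeffs, cal_energy_keV, measured_eff):
    """Scale factor forcing the predicted curve through one measured
    efficiency point (assumed multiplicative correction)."""
    return measured_eff / predicted_efficiency(cal_energy_keV, coeffs)

# Illustrative use with made-up coefficients and a made-up reference point.
coeffs = [-5.0, 1.2, -0.15]                      # hypothetical a0, a1, a2
k = single_point_calibrate(coeffs, 661.7, 3.2e-3)
calibrated_eff = k * predicted_efficiency(1332.5, coeffs)
```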