Nested sampling is a numerical tool widely used in cosmology for Bayesian data analysis: combining astrophysical data with models of the universe to infer parameters such as its age and size, and to determine numerically which model the data prefer.

This paper [2105.13923] shows that the fundamental nested sampling technique is applicable in a far wider set of physical and statistical contexts, by applying it to a frequentist analysis common in particle physics: computing the p-value for the detection of a new particle.

The plot shows that while the current state-of-the-art Monte Carlo approach performs well for significances of 1 and 2 sigma, it becomes exponentially more expensive at higher significances. To reach the gold standard of ‘five sigma’ (the threshold met, for example, in the discovery of new particles such as the Higgs boson), nested sampling is shown to be thousands of times more efficient. Of the nested sampling implementations tested, MultiNest performs well in low dimensions (d < 30), while at the higher dimensionalities demanded by modern cosmological analyses PolyChord becomes more efficient.
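To illustrate the idea, here is a minimal toy sketch (not the paper's implementation, and far simpler than MultiNest or PolyChord): the test statistic plays the role of the likelihood, live points are evolved under an increasing threshold, and each replacement compresses the prior volume by a factor of roughly nlive/(nlive+1). The p-value is then read off as the volume remaining when the threshold reaches the observed value. The toy problem, the random-walk replacement step, and all parameter choices below are illustrative assumptions; for a 2-dimensional standard normal the exact tail probability is exp(-t/2), which the estimate can be checked against.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 2          # dimension of the toy problem (assumption for illustration)
nlive = 500    # number of live points
t_obs = 16.0   # "observed" test statistic; exact p = exp(-t_obs / 2) here

def ts(x):
    # test statistic: chi-squared of a standard-normal vector
    return float(np.sum(x * x))

def log_prior(x):
    # standard normal prior density, up to an additive constant
    return -0.5 * float(np.sum(x * x))

def replace(live, t_min, nsteps=30, scale=0.5):
    # Draw a new point from the prior restricted to ts(x) > t_min via a
    # short Metropolis random walk started from a random live point.
    # (Real implementations use much more sophisticated constrained sampling.)
    x = live[rng.integers(len(live))].copy()
    for _ in range(nsteps):
        y = x + scale * rng.standard_normal(d)
        if ts(y) > t_min and np.log(rng.random()) < log_prior(y) - log_prior(x):
            x = y
    return x

live = rng.standard_normal((nlive, d))
tvals = np.array([ts(x) for x in live])

iters = 0
while True:
    i = int(tvals.argmin())
    t_min = float(tvals[i])
    if t_min >= t_obs:   # every live point now exceeds the observed statistic
        break
    live[i] = replace(live, t_min)
    tvals[i] = ts(live[i])
    iters += 1

# each iteration compresses the prior volume by ~ nlive / (nlive + 1)
p_hat = np.exp(-iters / nlive)
print(f"estimated p = {p_hat:.2e}, exact p = {np.exp(-t_obs / 2):.2e}")
```

The key point of the paper is visible even in this sketch: the cost grows only logarithmically in 1/p (roughly nlive × ln(1/p) iterations), whereas brute-force Monte Carlo needs of order 1/p samples, which is what makes five-sigma tail probabilities tractable.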

This thorough analysis was supported by a DiRAC grant for applying nested sampling to cosmology and particle physics analyses. The wide applicability and general Physics interest of this research was recognised by its publication in the high-impact flagship journal Physical Review Letters.

Categories: 2021 Highlights