ultranest
When scientific models are compared to data, two tasks are important: (1) constraining the model parameters and (2) comparing the model to other models. Different techniques have been developed to explore model parameter spaces. This package implements a Monte Carlo technique called nested sampling, which allows Bayesian inference on arbitrary user-defined likelihoods. In particular, posterior probability distributions on model parameters are constructed, and the marginal likelihood ("evidence") Z is computed. The former describes the parameter constraints given the data; the latter can be used for model comparison (via Bayes factors) as a measure of the predictive parsimony of a model.

In the last decade, multiple variants of nested sampling have been developed. They differ in how nested sampling finds better and better fits while respecting the prior (constrained-likelihood prior sampling techniques), and in whether it is allowed to return to worse fits and explore the parameter space further. This package develops novel, advanced techniques for both (see https://johannesbuchner.github.io/UltraNest/method.html). They are especially notable for being free of tuning parameters and theoretically justified. Beyond that, UltraNest supports big data sets and high-performance computing applications.

UltraNest is intended for fitting complex physical models with slow likelihood evaluations and one to hundreds of parameters. It aims to replace heuristic methods such as multi-ellipsoid nested sampling and dynamic nested sampling with more rigorous methods, and to provide feature parity with other packages (such as MultiNest).
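To make the workflow concrete, here is a minimal sketch following UltraNest's quickstart pattern: the user supplies a log-likelihood and a prior transform (mapping the unit cube to parameter space), and a ReactiveNestedSampler run returns posterior samples and the log-evidence ln(Z). The toy Gaussian model, the data values, and the parameter name are assumptions chosen for illustration, not part of the package description.

    import numpy as np
    import ultranest

    # Toy data: noisy measurements of an unknown mean (illustrative assumption).
    data = np.array([4.8, 5.1, 5.3, 4.9, 5.0])
    sigma = 0.2

    param_names = ["mean"]

    def log_likelihood(params):
        # Gaussian log-likelihood of the data given the model parameter "mean".
        mean = params[0]
        return -0.5 * np.sum(((data - mean) / sigma) ** 2)

    def prior_transform(cube):
        # Map the unit cube [0, 1) to the prior: here a flat prior on [0, 10].
        return cube * 10.0

    sampler = ultranest.ReactiveNestedSampler(param_names, log_likelihood, prior_transform)
    result = sampler.run()       # run nested sampling until convergence
    sampler.print_results()      # summary: posterior estimates, ln(Z), etc.

    # The marginal likelihood ln(Z) enables model comparison via Bayes factors,
    # e.g. log_bayes_factor = result_A["logz"] - result_B["logz"] for two competing models.
    print(result["logz"], result["logzerr"])

The same pattern scales to models with many parameters and expensive likelihoods; for model comparison, each candidate model is run separately and the resulting ln(Z) values are differenced to obtain log Bayes factors.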