Square unfolder

5/1/2023

This software uses PyMC3 to unfold a distribution using the Fully Bayesian Unfolding (FBU) method. There is already software that implements this method, but it uses PyMC2. The current software aims to extend it to PyMC3, so that one can use more efficient sampling techniques, such as NUTS and HMC. More information about PyMC3, NUTS and HMC can be found in their respective documentation.

Installing pre-requisites

You can install Numpy, Matplotlib, SciPy, Pandas, Seaborn, PyMC3 and Sympy in Ubuntu using pip or apt.

Setting up the model

```python
model = Unfolder.Unfolder(bkg, mig, eff, truth)  # Call the constructor to initialise the model parameters
#model.setUniformPrior()                  # Using a uniform prior is the default
#model.setGaussianPrior()                 # For a Gaussian prior with means at the truth bins
#                                         # and width in each bin given by sqrt(truth)
#model.setGaussianPrior(mean, sd)         # If vectors (with the size of the truth distribution number of bins)
#                                         # are given, they will be used as the means and widths of the Gaussians
#model.setFirstDerivativePrior(fb = 1.0)  # Uses a first-derivative based prior with a reference distribution if fb = 1.0
#model.setCurvaturePrior(fb = 1.0)        # Uses a curvature based prior with a reference distribution if fb = 1.0
#model.setEntropyPrior()                  # Uses an entropy based prior
```

model = Unfolder.Unfolder(bkg, mig, eff, truth) creates an instance of the Unfolder class. setGaussianPrior(widths, means) forces the usage of a bin-by-bin uncorrelated Gaussian prior, with the means given in the means array and the standard deviations given in the widths array; both should have the same size as the number of truth bins. setEntropyPrior() forces the usage of an entropy-based prior. setCurvaturePrior(fb, means) forces the usage of a curvature-based prior, with a bias distribution given by fb * means. setFirstDerivativePrior(fb, means) forces the usage of a first-derivative-based prior, with the bias distribution again given by fb * means. If fb is set to zero, this is equivalent to the corresponding prior with no bias distribution. If means is set to None, the truth distribution is used.

The response matrix P(reco = j | truth = i) * efficiency(i) is now stored in model.response. For convenience, the same matrix without the efficiency multiplication is stored in model.response_noeff. One can also add systematic uncertainties in the unfolding model.

Histograms and plotting

The software also comes with two histogram classes, H1D and H2D, as well as plotting functions for them, plotH1D and plotH2D. Internally, all information is stored using those classes to guarantee that error propagation is done correctly. You can also send the inputs to the software using those classes; they have the advantage of also being able to read a ROOT TH1 or TH2 histogram.

Scanning the regularisation strength

```python
optAlpha, optChi2, optBias, optStd, optNorm, optNormStd = model.scanAlpha(
        alt_bkg, alt_mig, alt_eff,  # alt. model used to generate toys
        1000,                       # number of toys
        np.arange(0.0, 10, 0.5),    # range of alpha to probe
        "bias.png",                 # bias mean and variance vs alpha
        "chi2.png",                 # sum of bias mean^2/variance for all bins vs alpha
        "norm.png")                 # bias mean and variance in the normalisation vs alpha
```

where alt_bkg, alt_mig and alt_eff are the parameters that describe the model used to generate the toys. In this example, 1000 toys are generated and alpha is varied from 0 to 10 in steps of 0.5. Plots of the bias as a function of alpha are generated according to the plot file names in the last 3 arguments.
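As a concrete illustration of what model.response represents, here is a self-contained NumPy sketch of the folding step. The numeric values and the orientation convention (rows = truth bins, columns = reco bins) are made-up assumptions for illustration, not values or layouts taken from the software.

```python
import numpy as np

# Hypothetical toy inputs: 3 truth bins and 3 reco bins (made-up values).
truth = np.array([100.0, 80.0, 60.0])  # expected truth-level yields
eff = np.array([0.9, 0.8, 0.7])        # reconstruction efficiency per truth bin
# mig[i, j] = P(reco = j | truth = i, event reconstructed); each row sums to 1.
mig = np.array([[0.8, 0.2, 0.0],
                [0.1, 0.8, 0.1],
                [0.0, 0.2, 0.8]])
bkg = np.array([5.0, 5.0, 5.0])        # expected background per reco bin

# response[i, j] = P(reco = j | truth = i) * efficiency(i),
# mirroring what the text says is stored in model.response.
response = mig * eff[:, np.newaxis]

# Folding: expected reco-level yields given the truth distribution.
reco = truth @ response + bkg
print(reco)
```

Summing each row of response recovers the per-bin efficiency, since each migration row is normalised to 1.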
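For intuition on the first-derivative- and curvature-based priors mentioned above, the sketch below computes the corresponding log-prior penalties in plain NumPy. These generic quadratic-penalty forms, and the alpha factor multiplying them, are assumptions for illustration rather than this software's exact definitions.

```python
import numpy as np

def first_derivative_penalty(t, alpha):
    # Generic first-derivative regulariser (assumed form):
    # penalise bin-to-bin differences, -alpha * sum_k (t[k+1] - t[k])^2.
    return -alpha * np.sum(np.diff(t) ** 2)

def curvature_penalty(t, alpha):
    # Generic curvature regulariser (assumed form):
    # penalise second differences, -alpha * sum_k (t[k+1] - 2*t[k] + t[k-1])^2.
    return -alpha * np.sum(np.diff(t, n=2) ** 2)

smooth = np.array([1.0, 2.0, 3.0, 4.0])  # linear spectrum: zero curvature
wiggly = np.array([1.0, 4.0, 1.0, 4.0])  # oscillating spectrum

# The curvature penalty leaves a linear spectrum untouched but
# strongly disfavours the oscillating one.
print(curvature_penalty(smooth, 1.0), curvature_penalty(wiggly, 1.0))
```

Larger alpha pulls the posterior toward smoother spectra at the cost of bias, which is why a scan of the bias versus alpha, as done by scanAlpha, is needed to pick a working point.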