mrpy.fitting.fit_sample.SimFit.run_mcmc

SimFit.run_mcmc(nchains=50, warmup=1000, iterations=1000, hs0=14.5, alpha0=-1.9, beta0=0.8, lnA0=-26.0, logm0=None, debug=0, opt_init=False, opt_kw=None, chainfile='chain.dat', save_latent=True, **kwargs)

Per-object MCMC fit for masses m, using the emcee package.

This creates an emcee.EnsembleSampler object with the correct model, and runs the specified warmup and kept iterations. The entire emcee.EnsembleSampler object is returned, and stored in the instance of this class as mcmc_res. This affords greater flexibility: for example, one can run with no warmup and zero iterations, then drive the iterations oneself.

Parameters:

nchains : int, optional

Number of chains to use in the affine-invariant ensemble sampler (AIES) MCMC algorithm.

warmup : int, optional

Number of (discarded) warmup iterations.

iterations : int, optional

Number of iterations to keep in the chain.

hs0, alpha0, beta0, lnA0 : float, optional

Initial guess for each of the MRP parameters.

debug : int, optional

Set the level of information printed throughout the function. The highest currently useful level is 2.

opt_init : bool, optional

Whether to run a downhill optimization routine to get the best starting point for the MCMC.

opt_kw : dict, optional

Any arguments to pass to the downhill run.

kwargs :

Any other parameters to pass to emcee.EnsembleSampler.

Returns:

mcmc_res : emcee.EnsembleSampler object

This object contains the stored chains, among other attributes.

Examples

The most obvious example is to generate a sample of variates from given parameters:

>>> from mrpy.base.stats import TGGD
>>> r = TGGD(scale=1e14,a=-1.8,b=1.0,xmin=1e12).rvs(int(1e5))

Then find the best-fit parameters for the resulting data:

>>> from mrpy.fitting.fit_sample import SimFit
>>> FitObj = SimFit(r)
>>> mcmc_res = FitObj.run_mcmc(nchains=10,warmup=100,iterations=100)
>>> print(mcmc_res.flatchain.mean(axis=0))
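Once the sampler has run, the flattened chain can be summarized with standard numpy operations. The snippet below mocks a flatchain array to illustrate; the parameter order (hs, alpha, beta) and the mock values are assumptions for demonstration only:

```python
import numpy as np

rng = np.random.default_rng(0)
# Mock flatchain: 1000 posterior samples of 3 MRP parameters (hs, alpha, beta)
flatchain = rng.normal(loc=[14.5, -1.9, 0.8], scale=0.05, size=(1000, 3))

means = flatchain.mean(axis=0)                       # posterior means
lo, hi = np.percentile(flatchain, [16, 84], axis=0)  # central 68% interval
print(means, hi - lo)
```

The same operations apply directly to `mcmc_res.flatchain`.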