mrpy.fitting.fit_sample.SimFit.run_downhill

SimFit.run_downhill(hs0=14.5, alpha0=-1.9, beta0=0.8, lnA0=-40.0, debug=0, jac=True, **minimize_kw)

Downhill-gradient optimization.

Parameters:

hs0, alpha0, beta0, lnA0: float, optional

Initial guess for each of the MRP parameters.

debug : int, optional

Set the level of information printed throughout the fit. The highest level that is currently useful is 2.

jac : bool, optional

Whether to use the analytic Jacobian (usually a good idea).

minimize_kw : dict

Any other parameters passed through to scipy.optimize.minimize().

Returns:

downhill_res : OptimizeResult

The optimization result, represented as an OptimizeResult object (see the scipy documentation). Important attributes are: x, the solution array; success, a Boolean flag indicating whether the optimizer exited successfully; and message, which describes the cause of termination.

The parameters are ordered as logHs, alpha, beta, [lnA].

downhill_obj : mrpy.likelihoods.PerObjLikeWeights object

An object containing the solution parameters, along with methods to access relevant quantities such as the mass function, or the Jacobian and Hessian at the solution.

Examples

The most obvious example is to generate a sample of variates from given parameters:

>>> from mrpy.base.stats import TGGD
>>> r = TGGD(scale=1e14, a=-1.8, b=1.0, xmin=1e12).rvs(int(1e5))

Then find the best-fit parameters for the resulting data:

>>> from mrpy.fitting.fit_sample import SimFit
>>> FitObj = SimFit(r)
>>> res,obj = FitObj.run_downhill()
>>> print(res.x)
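
Since the parameters are ordered as logHs, alpha, beta, [lnA], the solution array can be unpacked directly. A minimal sketch, assuming the normalisation lnA was fit so that res.x has four entries:

>>> logHs, alpha, beta, lnA = res.x
>>> print(logHs, alpha, beta, lnA)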

We can also use the obj object to explore some properties of the fit:

>>> print(obj.hessian)
>>> print(obj.cov)
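
The diagonal of the covariance matrix gives approximate parameter uncertainties. A minimal sketch, assuming obj.cov is the parameter covariance in the same ordering as res.x:

>>> import numpy as np
>>> print(np.sqrt(np.diag(obj.cov)))  # approximate 1-sigma uncertainties, assuming obj.cov is the parameter covariance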

The results are also stored in the class as downhill_obj and downhill_res.

>>> from matplotlib.pyplot import plot
>>> plot(FitObj.downhill_obj.logm, FitObj.downhill_obj.dndm(log=True))
>>> print(obj.stats.mean, r.mean())
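
Finally, any extra keywords are forwarded to scipy.optimize.minimize(), so the solver behaviour can be tuned. A minimal sketch, assuming only that the keywords are passed through unchanged (tol and options are standard scipy.optimize.minimize arguments):

>>> res2, obj2 = FitObj.run_downhill(alpha0=-1.85, tol=1e-10, options={"maxiter": 500})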