minuit_optimizer

class pyhf.optimize.opt_minuit.minuit_optimizer(*args, **kwargs)[source]

Bases: pyhf.optimize.mixins.OptimizerMixin

Optimizer that minimizes via iminuit.Minuit.migrad().

__init__(*args, **kwargs)[source]

Create an iminuit.Minuit optimizer.

Note

errordef should be 1.0 for a least-squares cost function and 0.5 for a negative log-likelihood function. See page 37 of http://hep.fi.infn.it/minuit.pdf. This parameter is sometimes called UP in the MINUIT docs.

Parameters
  • errordef (float) – See minuit docs. Default is 1.0.

  • steps (int) – Number of steps for the bounds. Default is 1000.

  • strategy (int) – See iminuit.Minuit.strategy. Default is None.

  • tolerance (float) – Tolerance for termination. See specific optimizer for detailed meaning. Default is 0.1.
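
A minimal construction sketch, assuming pyhf is installed with the iminuit extra; the keyword values shown are illustrative defaults, not recommendations, and pyhf.set_backend is used here only to register the optimizer globally.

    import pyhf

    # Construct the optimizer directly. pyhf's default objective is twice the
    # negative log-likelihood, for which the default errordef of 1.0 is the
    # appropriate choice.
    optimizer = pyhf.optimize.minuit_optimizer(tolerance=0.1, steps=1000)

    # Register it as the active optimizer; passing the string "minuit" instead
    # is equivalent to a default-constructed minuit_optimizer.
    pyhf.set_backend("numpy", optimizer)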

Attributes

errordef
maxiter
name
steps
strategy
tolerance
verbose

Methods

_get_minimizer(objective_and_grad, init_pars, init_bounds, fixed_vals=None, do_grad=False)[source]
_minimize(minimizer, func, x0, do_grad=False, bounds=None, fixed_vals=None, options={})[source]

Same signature as scipy.optimize.minimize().

Note: an additional minuit attribute is injected into the fit result to expose the underlying iminuit.Minuit minimizer.

Minimizer Options:
  • maxiter (int): Maximum number of iterations. Default is 100000.

  • strategy (int): See iminuit.Minuit.strategy. Default is to configure the strategy based on do_grad.

  • tolerance (float): Tolerance for termination. See specific optimizer for detailed meaning. Default is 0.1.
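
A hedged end-to-end sketch showing how these per-call options can be forwarded and how the injected minuit attribute is read back. It assumes the uncorrelated_background helper (pyhf >= 0.6.3), that extra keyword arguments to pyhf.infer.mle.fit() are passed through to the optimizer as minimizer options, and that return_result_obj=True returns the scipy.optimize.OptimizeResult alongside the best-fit parameters; these details may differ across pyhf versions.

    import pyhf

    pyhf.set_backend("numpy", "minuit")

    model = pyhf.simplemodels.uncorrelated_background(
        signal=[12.0, 11.0], bkg=[50.0, 52.0], bkg_uncertainty=[3.0, 7.0]
    )
    data = [51, 48] + model.config.auxdata

    # Extra keywords are assumed to be forwarded as minimizer options;
    # return_result_obj=True also returns the OptimizeResult.
    bestfit, result = pyhf.infer.mle.fit(
        data, model, return_result_obj=True, maxiter=100000, tolerance=0.1
    )

    # The injected iminuit.Minuit instance exposes the underlying minimizer state.
    print(result.minuit.fmin)    # convergence status and EDM
    print(result.minuit.errors)  # parabolic parameter uncertainties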

Returns

the fit result

Return type

fitresult (scipy.optimize.OptimizeResult)