minuit_optimizer
- class pyhf.optimize.opt_minuit.minuit_optimizer(*args, **kwargs)[source]
  Bases: pyhf.optimize.mixins.OptimizerMixin
  Optimizer that minimizes via iminuit.Minuit.migrad().
- __init__(*args, **kwargs)[source]
  Create iminuit.Minuit optimizer.
  Note: errordef should be 1.0 for a least-squares cost function and 0.5 for a negative log-likelihood function; see the MINUIT: Function Minimization and Error Analysis Reference Manual, Section 7.1: Function normalization and ERROR DEF. This parameter is sometimes called UP in the MINUIT docs.
- Parameters
  - errordef (float) – See minuit docs. Default is 1.0.
  - steps (int) – Number of steps for the bounds. Default is 1000.
  - strategy (int) – See iminuit.Minuit.strategy. Default is None.
  - tolerance (float) – Tolerance for termination. See specific optimizer for detailed meaning. Default is 0.1.
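The errordef convention in the note above can be illustrated with a toy example. The sketch below uses a hypothetical single Gaussian measurement (x = 0, sigma = 1) and hand-written cost functions; none of these names are part of the pyhf or iminuit APIs. It shows why one-sigma errors correspond to a cost increase of 1.0 for least squares but 0.5 for a negative log-likelihood:

```python
# Hypothetical single Gaussian measurement, for illustration only.
x, sigma = 0.0, 1.0

def chi2(mu):
    # least-squares cost
    return ((x - mu) / sigma) ** 2

def nll(mu):
    # negative log-likelihood with constant terms dropped
    return 0.5 * ((x - mu) / sigma) ** 2

mu_hat = 0.0               # both costs are minimized at mu = 0
mu_one_sigma = mu_hat + sigma

# The one-sigma point raises the least-squares cost by 1.0 ...
print(chi2(mu_one_sigma) - chi2(mu_hat))  # 1.0
# ... but the negative log-likelihood by only 0.5, hence errordef = 0.5.
print(nll(mu_one_sigma) - nll(mu_hat))    # 0.5
```

This is why passing the wrong errordef silently scales all parameter uncertainties by a factor of sqrt(2).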
Attributes
- name
- errordef
- steps
- strategy
- tolerance
- maxiter
- verbose
Methods
- _get_minimizer(objective_and_grad, init_pars, init_bounds, fixed_vals=None, do_grad=False, par_names=None)[source]
- _minimize(minimizer, func, x0, do_grad=False, bounds=None, fixed_vals=None, options={})[source]
  Same signature as scipy.optimize.minimize().
  Note: an additional minuit attribute is injected into the fitresult to give access to the underlying minimizer.
  - Minimizer Options:
    - maxiter (int): Maximum number of iterations. Default is 100000.
    - strategy (int): See iminuit.Minuit.strategy. Default is to configure in response to do_grad.
    - tolerance (float): Tolerance for termination. See specific optimizer for detailed meaning. Default is 0.1.
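The do_grad-dependent strategy default can be sketched as below. This is a hypothetical standalone helper, not the actual pyhf internals: when no strategy is forced, a fit with user-provided analytic gradients can use the cheaper MINUIT strategy 0, while a fit relying on numerical derivatives falls back to strategy 1.

```python
def pick_strategy(user_strategy, do_grad):
    """Sketch of the default-selection logic described above.

    user_strategy: an explicit iminuit strategy (0, 1, or 2), or None.
    do_grad: whether the objective supplies its own gradient.
    """
    if user_strategy is not None:
        # An explicit user choice always wins.
        return user_strategy
    # Analytic gradients make MINUIT's extra Hessian checks less
    # necessary, so prefer the faster strategy 0.
    return 0 if do_grad else 1

print(pick_strategy(None, do_grad=True))   # 0
print(pick_strategy(None, do_grad=False))  # 1
print(pick_strategy(2, do_grad=True))      # 2
```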
- Returns: the fit result
- Return type: fitresult (scipy.optimize.OptimizeResult)