cma.es {cmaes}		R Documentation
Description

Global optimization procedure using a covariance matrix adapting evolutionary strategy.
Usage

cma.es(par, fn, ..., lower, upper, control = list())
Arguments

par
    Initial values for the parameters to be optimized over.

fn
    A function to be minimized (or maximized), with first argument the
    vector of parameters over which minimization is to take place. It
    should return a scalar result.

...
    Further arguments to be passed to fn.

lower, upper
    Bounds on the variables.

control
    A list of control parameters. See 'Details'.
Details

Note that arguments after ... must be matched exactly.
By default this function performs minimization, but it will maximize if
control$fnscale is negative. It tries to be a drop-in replacement for
optim.
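As a minimal sketch of both uses (the quadratic objective and the bounds
are made up for illustration):

    library(cmaes)

    ## Hypothetical toy objective: a simple quadratic.
    sphere <- function(x) sum(x^2)

    ## Minimization (the default), with box constraints:
    res.min <- cma.es(rep(1, 5), sphere,
                      lower = rep(-10, 5), upper = rep(10, 5))

    ## Maximization of the negated objective via a negative fnscale:
    res.max <- cma.es(rep(1, 5), function(x) -sum(x^2),
                      control = list(fnscale = -1))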
The control
argument is a list that can supply any of the
following components:
fnscale
    An overall scaling to be applied to the value of fn during
    optimization. If negative, turns the problem into a maximization
    problem. Optimization is performed on fn(par)/fnscale.

maxit
    Maximal number of iterations; defaults to 100*D^2, where D is the
    dimension of the parameter space.

stopfitness
    Stop if the function value is smaller than stopfitness. This is the
    only way for the CMA-ES to "converge".

sigma
    Initial variance estimates. Can be a single number or a vector of
    length D, where D is the dimension of the parameter space.

weights
    Recombination weights.

damps
    Damping factor for the step-size.

cs
    Cumulation constant for the step-size.

ccum
    Cumulation constant for the covariance matrix.

ccov.1
    Learning rate for the rank-one update.

ccov.mu
    Learning rate for the rank-mu update.
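As a sketch, selected control parameters can be supplied like this (the
values are illustrative, not tuned recommendations):

    ## Illustrative settings: cap the iteration count, define a target
    ## fitness at which to stop, and set the initial step-size.
    ctrl <- list(maxit = 500, stopfitness = 1e-10, sigma = 0.5)
    res <- cma.es(rep(0.5, 4), function(x) sum(x^2), control = ctrl)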
Value

A list with components:
par
    The best set of parameters found.

value
    The value of fn corresponding to par.

counts
    A two-element integer vector giving the number of calls to fn. The
    second element is always zero for call compatibility with optim.
convergence
    An integer code. 0 indicates successful convergence; 1 indicates
    that the iteration limit maxit was reached, matching the convention
    used by optim.
message
    Always set to NULL, provided for call compatibility with optim.
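A short sketch of inspecting the result, mirroring how an optim result
would be handled (any non-zero convergence code is treated as not
converged):

    res <- cma.es(rep(1, 5), function(x) sum(x^2))
    res$par       # best parameter vector found
    res$value     # objective value at res$par
    res$counts    # number of calls to fn (second element is always 0)
    if (res$convergence != 0)
      warning("stopped before reaching stopfitness")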
Author(s)

Olaf Mersmann olafm@statistik.tu-dortmund.de
The code is based on 'purecmaes.m' by N. Hansen.

References

Hansen, N. (2006). The CMA Evolution Strategy: A Comparing Review. In
J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (eds.), Towards a
New Evolutionary Computation: Advances in Estimation of Distribution
Algorithms, pp. 75-102. Springer.
See Also

optim for traditional optimization methods.
Examples

## Compare performance of different algorithms on the shifted
## Rosenbrock function:

## Test dimension
n <- 10

## Random optimum in [-50, 50]^n
opt <- runif(n, -50, 50)
bias <- 0
f <- genShiftedRosenbrock(opt, bias)

## Initial parameter values
start <- runif(n, -100, 100)

res.nm <- optim(start, f, method = "Nelder-Mead")
res.gd <- optim(start, f, method = "BFGS")
res.cg <- optim(start, f, method = "CG")
res.sa <- optim(start, f, method = "SANN")
res.es <- cma.es(start, f)
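To compare the runs, the best objective values can be collected into a
named vector (a small follow-up sketch using the results from above):

    ## Best function value found by each method (the optimum is 'bias'):
    sapply(list(NM = res.nm, BFGS = res.gd, CG = res.cg,
                SANN = res.sa, CMAES = res.es),
           function(res) res$value)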