TY - GEN
T1 - Comparison-based natural gradient optimization in high dimension
AU - Akimoto, Youhei
AU - Auger, Anne
AU - Hansen, Nikolaus
PY - 2014/1/1
Y1 - 2014/1/1
N2 - We propose a novel natural gradient based stochastic search algorithm, VD-CMA, for the optimization of high dimensional numerical functions. The algorithm is comparison-based and hence invariant to monotonic transformations of the objective function. It adapts a multivariate normal distribution with a restricted covariance matrix with twice the dimension as degrees of freedom, representing an arbitrarily oriented long axis and additional axis-parallel scaling. We derive the different components of the algorithm and show linear internal time and space complexity. We find empirically that the algorithm adapts its covariance matrix to the inverse Hessian on convex-quadratic functions with a Hessian with one short axis and different scaling on the diagonal. We then evaluate VD-CMA on test functions and compare it to different methods. On functions covered by the internal model of VD-CMA and on the Rosenbrock function, VD-CMA outperforms CMA-ES (having quadratic internal time and space complexity) not only in internal complexity but also in number of function calls with increasing dimension.
AB - We propose a novel natural gradient based stochastic search algorithm, VD-CMA, for the optimization of high dimensional numerical functions. The algorithm is comparison-based and hence invariant to monotonic transformations of the objective function. It adapts a multivariate normal distribution with a restricted covariance matrix with twice the dimension as degrees of freedom, representing an arbitrarily oriented long axis and additional axis-parallel scaling. We derive the different components of the algorithm and show linear internal time and space complexity. We find empirically that the algorithm adapts its covariance matrix to the inverse Hessian on convex-quadratic functions with a Hessian with one short axis and different scaling on the diagonal. We then evaluate VD-CMA on test functions and compare it to different methods. On functions covered by the internal model of VD-CMA and on the Rosenbrock function, VD-CMA outperforms CMA-ES (having quadratic internal time and space complexity) not only in internal complexity but also in number of function calls with increasing dimension.
KW - Covariance matrix adaptation
KW - Hessian matrix
KW - Information geometric optimization
KW - Natural gradient
KW - Theory
UR - https://www.scopus.com/pages/publications/84905714909
U2 - 10.1145/2576768.2598258
DO - 10.1145/2576768.2598258
M3 - Conference contribution
AN - SCOPUS:84905714909
SN - 9781450326629
T3 - GECCO 2014 - Proceedings of the 2014 Genetic and Evolutionary Computation Conference
SP - 373
EP - 380
BT - GECCO 2014 - Proceedings of the 2014 Genetic and Evolutionary Computation Conference
PB - Association for Computing Machinery
T2 - 16th Genetic and Evolutionary Computation Conference, GECCO 2014
Y2 - 12 July 2014 through 16 July 2014
ER -