Search Filters
Subjects (by item count)
cma-es (10)
black-box optimization (9)
surrogate models (8)
benchmarking (7)
self-adaptation (7)
computer science (6)
computer science - neural and evolutionary computing (6)
surrogate-assisted optimization (6)
evolution strategy (5)
algorithms (4)
artificial intelligence (4)
evolution strategies (4)
optimization (4)
ranking support vector machine (4)
algorithm analysis and problem complexity (3)
computation by abstract devices (3)
discrete mathematics in computer science (3)
biological evolution (2)
computational biology/bioinformatics (2)
computer science, artificial intelligence (2)
computer science, theory & methods (2)
convergence (2)
covariance (2)
covariance matrices (2)
covariance matrix (2)
evolutionary computation (2)
evolutionary computation [German: evolutionäre berechnung] (2)
mathematics (2)
matlab (2)
multiobjective optimization (2)
optimization problem [German: optimierungsproblem] (2)
pattern recognition (2)
physics (2)
separable optimization (2)
support vector machine (2)
wind farm layout optimization (2)
wind farm [German: windpark] (2)
[info.info-ro] computer science [cs]/operations research [cs.ro] (1)
adaptation (1)
adaptive coordinate descent (1)
adaptive encoding (1)
adaptive object detection (1)
ai (1)
air-turbines (1)
artificial thinking (1)
benchmark testing (1)
benchmark-test (1)
bfgs (1)
bi-objective optimization (1)
bio-informatics (1)
bioinformatics (1)
biological informatics (1)
black box optimization (1)
black boxes (1)
buildings and facilities (1)
calculating (1)
checking-devices (1)
cholesky update (1)
cma (1)
coin-freed or like apparatus (1)
competition (1)
complexity (1)
complexity, computational (1)
computational complexity (1)
computational intelligence (1)
computer science - artificial intelligence (1)
computer simulation (1)
computer software (1)
computers (1)
computing (1)
conditioning (1)
congresses (1)
continuous optimization (1)
conveying (1)
counting (1)
covariance matrix adaptation (1)
covariance-matrix adaptation (1)
data mining and knowledge discovery (1)
data processing systems or methods, specially adapted for administrative, commercial, financial, managerial, supervisory or forecasting purposes (1)
design (1)
electronic brains (1)
electronic mail (1)
energy & fuels (1)
evolution (1)
evolutionary algorithm (1)
evolutionary algorithm [German: evolutionärer algorithmus] (1)
finite difference method (1)
general (1)
global optimization (1)
green & sustainable science & technology (1)
handling record carriers (1)
handling thin or filamentary material (1)
hyper-heuristics (1)
ill-conditioned problems (1)
index medicus (1)
informatics (1)
information systems applications (1)
artificial intelligence [Spanish: inteligencia artificial] (1)
intellectronics (1)
intelligence, artificial (1)


IEEE Transactions on Evolutionary Computation, ISSN 1089-778X, 04/2019, Volume 23, Issue 2, pp. 353 - 358
The covariance matrix adaptation evolution strategy (CMA-ES) is a popular method to deal with nonconvex and/or stochastic optimization problems when gradient... 
Evolution (biology) | Evolutionary computation | Electronic mail | Covariance matrices | Time complexity | Optimization | Manganese | optimization | CMA | EVOLUTION STRATEGIES | TIME | COMPUTER SCIENCE, THEORY & METHODS | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | BFGS | Algorithms | Covariance matrix | Covariance | Biological evolution | Adaptation
Journal Article
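A minimal sketch of the optimization loop the snippet above refers to, assuming the `cma` Python package (pycma) and its ask/tell interface; the sphere objective and the 8-dimensional start point are placeholders, not taken from the cited article.

```python
# Minimal CMA-ES ask/tell loop, assuming the `cma` package (pip install cma).
# The sphere objective is a stand-in; any black-box f: R^n -> R can be used.
import cma

def sphere(x):
    """Toy objective: sum of squares, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

es = cma.CMAEvolutionStrategy(8 * [1.0], 0.5)             # start point, initial step size
while not es.stop():
    candidates = es.ask()                                 # sample a population
    es.tell(candidates, [sphere(x) for x in candidates])  # rank; adapt C and sigma
print(es.result.xbest, es.result.fbest)
```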
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), ISSN 0302-9743, 2014, Volume 8672, pp. 70 - 79
Journal Article
2013 IEEE Congress on Evolutionary Computation, ISSN 1089-778X, 06/2013, pp. 369 - 376
This paper investigates the performance of 6 versions of Covariance Matrix Adaptation Evolution Strategy (CMA-ES) with restarts on a set of 28 noiseless...
Benchmark testing | Linear programming | Sociology | Covariance matrices | Optimization | Convergence
Conference Proceeding
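The snippet above compares restart variants of CMA-ES over a fixed function set. Below is a sketch of the general harness shape, counting function evaluations until a target precision is first reached; the 28 CEC'2013 functions and the six CMA-ES variants are not reproduced, and the SciPy optimizer is only a stand-in.

```python
# Evaluation-counting harness in the spirit of black-box benchmarking.
# The test functions, target, and stand-in optimizer are illustrative only.
import numpy as np
from scipy.optimize import minimize, rosen

def make_counted(f, target):
    """Wrap f so evaluations are counted and the first hit of the target is recorded."""
    state = {"evals": 0, "hit": None}
    def wrapped(x):
        state["evals"] += 1
        val = float(f(np.asarray(x)))
        if state["hit"] is None and val <= target:
            state["hit"] = state["evals"]
        return val
    return wrapped, state

problems = {"sphere": lambda x: float(np.sum(x ** 2)), "rosenbrock": rosen}
for name, f in problems.items():
    wrapped, state = make_counted(f, target=1e-8)
    # Stand-in optimizer; a CMA-ES restart variant could be plugged in here instead.
    minimize(wrapped, np.full(10, 3.0), method="L-BFGS-B", options={"maxfun": 10_000})
    print(name, "->", state["hit"] or f"target not reached in {state['evals']} evaluations")
```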
IJCAI International Joint Conference on Artificial Intelligence, ISSN 1045-0823, 2018, Volume 2018-, pp. 1419 - 1426
Conference Proceeding
Evolutionary Computation, ISSN 1063-6560, 10/2015
The limited memory BFGS method (L-BFGS) of Liu and Nocedal (1989) is often considered to be the method of choice for continuous optimization when first- and/or... 
Neural and Evolutionary Computing | Computer Science | Mathematics | Optimization and Control
Journal Article
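The abstract above contrasts L-BFGS with gradient-free alternatives such as CMA-ES. The sketch below runs both on the Rosenbrock function, assuming SciPy and the `cma` package; it only illustrates the flavour of such a comparison, not the cited paper's protocol.

```python
# Gradient-based L-BFGS-B (SciPy) vs. gradient-free CMA-ES (`cma` package)
# on the 10-D Rosenbrock function. Illustrative only.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der
import cma

x0 = np.full(10, 3.0)

res = minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B")   # uses the analytic gradient
print("L-BFGS-B:", res.fun, "after", res.nfev, "f-evaluations")

es = cma.CMAEvolutionStrategy(x0, 1.0, {"verbose": -9})       # sees function values only
es.optimize(rosen)
print("CMA-ES  :", es.result.fbest, "after", es.result.evaluations, "f-evaluations")
```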
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), ISSN 0302-9743, 2012, Volume 7491, Issue 1, pp. 296 - 305
Conference Proceeding
Proceedings of the 2014 Annual Conference on Genetic and Evolutionary Computation, 07/2014, pp. 397 - 404
We propose a computationally efficient limited memory Covariance Matrix Adaptation Evolution Strategy for large scale optimization, which we call the... 
large scale optimization | CMA-ES | Cholesky update | evolution strategies
Conference Proceeding
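The keywords above mention a Cholesky update: the covariance factor is adapted directly rather than updating C and re-decomposing it. The NumPy sketch below checks the underlying rank-one factor-update identity; the constants and the limited-memory bookkeeping of the proposed LM-CMA-ES are omitted.

```python
# Rank-one Cholesky-factor update: if C' = a*C + b*p*p^T and C = A A^T,
# then A can be updated in place of C, avoiding a fresh decomposition.
# Only the identity is checked here; LM-CMA-ES specifics are omitted.
import numpy as np

def rank_one_factor_update(A, p, a, b):
    """Return A' with A' A'^T = a * (A A^T) + b * p p^T  (a > 0, b >= 0)."""
    w = np.linalg.solve(A, p)                      # w = A^{-1} p
    w2 = float(w @ w)
    k = (np.sqrt(a) / w2) * (np.sqrt(1.0 + (b / a) * w2) - 1.0)
    return np.sqrt(a) * A + k * np.outer(p, w)     # p w^T is the rank-one term

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
C = M @ M.T + n * np.eye(n)                        # an SPD covariance matrix
A = np.linalg.cholesky(C)                          # factor with C = A A^T
p = rng.standard_normal(n)                         # evolution-path-like vector
a, b = 0.9, 0.1                                    # discount and learning rate

A_new = rank_one_factor_update(A, p, a, b)
assert np.allclose(A_new @ A_new.T, a * C + b * np.outer(p, p))
print("factor update matches the covariance update")
```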
11/2015
The limited memory BFGS method (L-BFGS) of Liu and Nocedal (1989) is often considered to be the method of choice for continuous optimization when first- and/or... 
Journal Article
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), ISSN 0302-9743, 2013, Volume 8309, pp. 182 - 197
Conference Proceeding
2010, Lecture Notes in Computer Science, ISBN 9783642158438, Volume 6238, Issue 1
Taking inspiration from approximate ranking, this paper investigates the use of rank-based Support Vector Machine as surrogate model within CMA-ES, enforcing... 
Computational Biology/Bioinformatics | Pattern Recognition | Discrete Mathematics in Computer Science | Computer Science | Artificial Intelligence (incl. Robotics) | Computation by Abstract Devices | Algorithm Analysis and Problem Complexity
Book Chapter
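As a rough illustration of the rank-based surrogate idea in the chapter above, the sketch below trains a linear RankSVM on pairwise difference vectors and uses it to pre-rank new candidates; the cited work uses a kernelized Ranking SVM coupled with CMA-ES, which this does not reproduce.

```python
# Simplified rank-based surrogate: a linear RankSVM on pairwise differences.
# Kernelization and the coupling with CMA-ES are omitted.
import numpy as np
from sklearn.svm import LinearSVC

def fit_rank_surrogate(X, f_values):
    """Learn w so that w @ x_i > w @ x_j whenever f(x_i) < f(x_j)."""
    order = np.argsort(f_values)                  # indices from best to worst
    diffs, labels = [], []
    for better, worse in zip(order[:-1], order[1:]):
        diffs.append(X[better] - X[worse]); labels.append(+1)
        diffs.append(X[worse] - X[better]); labels.append(-1)
    clf = LinearSVC(C=1.0, max_iter=10_000).fit(np.array(diffs), np.array(labels))
    return clf.coef_.ravel()                      # higher w @ x means predicted better

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 5))
f = np.sum((X - 0.5) ** 2, axis=1)                # the true (expensive) objective
w = fit_rank_surrogate(X, f)

# Pre-rank new candidates with the surrogate before spending true evaluations.
candidates = rng.standard_normal((10, 5))
pre_ranked = candidates[np.argsort(-(candidates @ w))]
print(pre_ranked[0])
```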
11/2017
L2 regularization and weight decay regularization are equivalent for standard stochastic gradient descent (when rescaled by the learning rate), but as we...
Journal Article
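The snippet above refers to the equivalence of L2 regularization and weight decay under plain SGD and its breakdown under adaptive methods. The NumPy sketch below reproduces that contrast on a toy quadratic; the loss and all hyperparameters are arbitrary placeholders.

```python
# L2-in-the-gradient vs. decoupled weight decay, on a toy quadratic loss.
# Under plain SGD the two updates coincide; under Adam they differ.
import numpy as np

def grad(w):                       # gradient of 0.5 * ||w - 1||^2
    return w - 1.0

def sgd(w, lr=0.1, lam=0.01, decoupled=False, steps=100):
    for _ in range(steps):
        g = grad(w) + (0.0 if decoupled else lam * w)
        w = w - lr * g - (lr * lam * w if decoupled else 0.0)
    return w

def adam(w, lr=0.1, lam=0.01, decoupled=False, steps=100, b1=0.9, b2=0.999, eps=1e-8):
    m, v = np.zeros_like(w), np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w) + (0.0 if decoupled else lam * w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        step = lr * (m / (1 - b1 ** t)) / (np.sqrt(v / (1 - b2 ** t)) + eps)
        w = w - step - (lr * lam * w if decoupled else 0.0)
    return w

w0 = np.array([5.0, -3.0])
print("SGD :", sgd(w0), sgd(w0, decoupled=True))    # identical trajectories
print("Adam:", adam(w0), adam(w0, decoupled=True))  # different trajectories
```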
04/2014
We propose a computationally efficient limited memory Covariance Matrix Adaptation Evolution Strategy for large scale optimization, which we call the... 
Computer Science - Neural and Evolutionary Computing
Journal Article
Proceedings of the 2016 Genetic and Evolutionary Computation Conference Companion, 07/2016, pp. 1169 - 1176
We propose a multi-objective optimization algorithm aimed at achieving good anytime performance over a wide range of problems. Performance is assessed in terms... 
benchmarking | cma-es | multi-objective optimization | black-box optimization | bi-objective optimization
Conference Proceeding
08/2016
Restart techniques are common in gradient-free optimization to deal with multimodal functions. Partial warm restarts are also gaining popularity in... 
Journal Article
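The abstract above concerns restarts in gradient-free optimization. A sketch of a plain restart loop with doubling population size (IPOP-flavoured) follows, assuming the `cma` package; the cited paper's partial warm restarts are not reproduced.

```python
# Restart loop with doubling population size, assuming the `cma` package.
# The multimodal Rastrigin function stands in for an arbitrary objective.
import numpy as np
import cma

def rastrigin(x):
    x = np.asarray(x)
    return 10 * len(x) + float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

best_f, best_x, popsize = np.inf, None, 8
for run in range(5):
    x0 = np.random.uniform(-5, 5, size=10)
    es = cma.CMAEvolutionStrategy(x0, 2.0, {"popsize": popsize, "verbose": -9})
    es.optimize(rastrigin)
    if es.result.fbest < best_f:
        best_f, best_x = es.result.fbest, es.result.xbest
    popsize *= 2                    # larger population after each restart
print("best value over restarts:", best_f)
```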
2010, Lecture Notes in Computer Science, ISBN 3642172970, Volume 6457
Mainstream surrogate approaches for multi-objective problems build one approximation for each objective. Mono-surrogate approaches instead aim at... 
Information Systems Applications (incl.Internet) | Discrete Mathematics in Computer Science | Computer Science | Data Mining and Knowledge Discovery | Computation by Abstract Devices | Artificial Intelligence (incl. Robotics) | Simulation and Modeling
Book Chapter
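To make the contrast in the snippet above concrete, the sketch below fits one regression surrogate per objective versus a single model trained on a Pareto-dominance label; this is only an illustration of the two styles, not the cited chapter's mono-surrogate formulation.

```python
# Per-objective surrogates vs. one "mono" surrogate trained on a dominance label.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(200, 4))
F = np.stack([np.sum((X - 0.3) ** 2, axis=1),       # objective 1
              np.sum((X + 0.3) ** 2, axis=1)], 1)   # objective 2

# (a) Mainstream: one regression surrogate per objective.
per_objective = [RandomForestRegressor(random_state=0).fit(X, F[:, j])
                 for j in range(F.shape[1])]

# (b) Mono-surrogate: a single model of an aggregate quantity, here whether
#     a point is non-dominated within the current sample.
def non_dominated_mask(F):
    mask = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        dominated_by = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        mask[i] = not dominated_by.any()
    return mask

mono = RandomForestClassifier(random_state=0).fit(X, non_dominated_mask(F))
print("per-objective models:", len(per_objective), "| mono-surrogate models: 1")
```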
05/2016
We propose a multi-objective optimization algorithm aimed at achieving good anytime performance over a wide range of problems. Performance is assessed in terms... 
Computer Science - Neural and Evolutionary Computing
Journal Article
04/2016
Hyperparameters of deep neural networks are often optimized by grid search, random search or Bayesian optimization. As an alternative, we propose to use the... 
Journal Article
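The abstract above proposes CMA-ES as an alternative hyperparameter optimizer. The sketch below searches two log-scaled hyperparameters with the `cma` package against a placeholder validation-loss function; the function and parameter names are illustrative, not the cited experimental setup.

```python
# CMA-ES over two log10-scaled hyperparameters, assuming the `cma` package.
# `validation_loss` is a cheap placeholder for an expensive train-and-validate run.
import numpy as np
import cma

def validation_loss(theta):
    """Placeholder: pretend the best settings are lr=1e-2, weight_decay=1e-4."""
    log_lr, log_wd = theta
    return (log_lr - np.log10(1e-2)) ** 2 + 0.5 * (log_wd - np.log10(1e-4)) ** 2

es = cma.CMAEvolutionStrategy([-3.0, -4.0], 1.0, {"verbose": -9})  # start in log10 space
es.optimize(validation_loss)
log_lr, log_wd = es.result.xbest
print(f"suggested lr={10 ** log_lr:.2e}, weight_decay={10 ** log_wd:.2e}")
```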
11/2015
Deep neural networks are commonly trained using stochastic non-convex optimization procedures, which are driven by gradient information estimated on fractions... 
Journal Article