Optimizer

The Optimizer class is responsible for maximizing a fitness function. Our approach uses gradient-free global optimization methods (evolutionary algorithms, genetic algorithms, Bayesian optimization). We provide access to two libraries.

Nevergrad

Nevergrad offers an extensive collection of algorithms that do not require gradient computation. A NevergradOptimizer can be specified in the following way:

opt = NevergradOptimizer(method='PSO')

where the method argument is a string naming a specific optimization algorithm.

Available methods include:
  • Differential evolution ['DE']
  • Covariance matrix adaptation ['CMA']
  • Particle swarm optimization ['PSO']
  • Sequential quadratic programming ['SQP']
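
As a toy illustration of how such gradient-free methods search (plain Python, not Nevergrad's implementation; all names and constants here are illustrative), a minimal differential-evolution loop maximizing a 1-D fitness function:

```python
import random

def differential_evolution(fitness, bounds, pop_size=20, mutation=0.8,
                           crossover=0.9, generations=50, seed=0):
    """Toy DE maximizing `fitness` over the 1-D interval `bounds`."""
    rng = random.Random(seed)
    lo, hi = bounds
    # start from a random population; no gradients are ever computed
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # mutation: combine three distinct other population members
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = a + mutation * (b - c)
            # crossover (degenerate in 1-D: occasionally keep the old member)
            if rng.random() > crossover:
                trial = pop[i]
            trial = min(max(trial, lo), hi)
            # greedy selection: keep whichever candidate scores higher
            if fitness(trial) >= fitness(pop[i]):
                pop[i] = trial
    return max(pop, key=fitness)

# fitness peaks at x = 3, so the search should converge near 3
best = differential_evolution(lambda x: -(x - 3.0) ** 2, bounds=(-10, 10))
```

Only fitness evaluations drive the search, which is why such methods also work when the model is a simulator with no usable gradients.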

Nevergrad is still sparsely documented; to list all available methods, use the following code:

from nevergrad.optimization import registry
print(sorted(registry.keys()))

Scikit-Optimize (skopt)

Skopt implements several methods for sequential model-based ("blackbox") optimization and focuses on Bayesian methods. The algorithms are built on top of scikit-learn regressors.

Available methods:
  • Gaussian process-based minimization ['GP']
  • Sequential optimization using gradient-boosted trees ['GBRT']
  • Sequential optimization using decision trees ['ET']
  • Random forest regressor ['RF']

The user can also provide a custom scikit-learn regressor. A SkoptOptimizer can be specified in the following way:

Parameters:

  • method = ["GP", "RF", "ET", "GBRT" or sklearn regressor, default="GP"]
  • n_initial_points [int, default=10]
  • acq_func
  • acq_optimizer
  • random_state

For more detail, check the skopt Optimizer documentation: https://scikit-optimize.github.io/#skopt.Optimizer

opt = SkoptOptimizer(method='GP', acq_func='LCB')
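
The acq_func argument selects the acquisition function, which trades off exploiting points the surrogate model predicts to be good against exploring points it is uncertain about; 'LCB' picks the candidate with the lowest lower confidence bound. A toy illustration of the rule (plain Python, not skopt's implementation; the candidate values and kappa are made up):

```python
def lower_confidence_bound(mean, std, kappa=1.96):
    """LCB score: surrogate mean minus kappa times its uncertainty."""
    return mean - kappa * std

# hypothetical candidate points with surrogate predictions (mean, std)
candidates = {
    "x1": (0.50, 0.05),   # decent mean, low uncertainty
    "x2": (0.60, 0.40),   # worse mean, but high uncertainty
    "x3": (0.45, 0.01),   # best mean, almost no uncertainty
}

# the optimizer evaluates next wherever the LCB is smallest
next_point = min(candidates, key=lambda k: lower_confidence_bound(*candidates[k]))
```

Here the highly uncertain candidate "x2" wins despite its worse predicted mean, which is exactly the exploration behaviour a larger kappa encourages.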

Custom Optimizer

To use a different back-end optimization library, the user can provide a custom class that inherits from the provided abstract class Optimizer.

The user can plug in a different optimization tool, as long as it follows an ask()/tell() interface. The abstract class Optimizer is prepared for different back-end libraries. All of the optimizer-specific arguments have to be provided upon the optimizer's initialization.

The ask()/tell() interface is used as follows:

parameters = optimizer.ask()
errors = simulator.run(parameters)
optimizer.tell(parameters, errors)
results = optimizer.recommend()
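
A minimal sketch of such a custom back-end, assuming the abstract class exposes ask(), tell(), and recommend() hooks (the random-search strategy, the stand-in base class, and all parameter names here are illustrative, not part of the library):

```python
import random
from abc import ABC, abstractmethod

class Optimizer(ABC):
    """Stand-in for the library's abstract Optimizer interface."""
    @abstractmethod
    def ask(self): ...
    @abstractmethod
    def tell(self, parameters, errors): ...
    @abstractmethod
    def recommend(self): ...

class RandomSearchOptimizer(Optimizer):
    """Custom back-end: uniform random search within parameter bounds."""
    def __init__(self, bounds, n_samples=10, seed=0):
        # all optimizer-specific arguments are fixed at initialization
        self.bounds = bounds          # e.g. {'g': (0.0, 1.0)}
        self.n_samples = n_samples
        self.rng = random.Random(seed)
        self.history = []             # (parameters, error) pairs

    def ask(self):
        # propose a batch of candidate parameter sets
        return [{name: self.rng.uniform(lo, hi)
                 for name, (lo, hi) in self.bounds.items()}
                for _ in range(self.n_samples)]

    def tell(self, parameters, errors):
        # record the evaluated candidates and their errors
        self.history.extend(zip(parameters, errors))

    def recommend(self):
        # best candidate seen so far (lowest error)
        return min(self.history, key=lambda pe: pe[1])[0]

# the ask()/tell() loop from above, with a dummy "simulator"
opt = RandomSearchOptimizer(bounds={'g': (0.0, 1.0)}, n_samples=50)
parameters = opt.ask()
errors = [(p['g'] - 0.3) ** 2 for p in parameters]  # error minimal at g=0.3
opt.tell(parameters, errors)
results = opt.recommend()
```

Because the fitting loop only ever calls ask(), tell(), and recommend(), any strategy that implements these three methods can be swapped in without touching the rest of the pipeline.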