# pysamoo: Surrogate-Assisted Multi-objective Optimization

In practice, most optimization problems consist of one or multiple computationally expensive objective or constraint functions, to which special attention must be paid during algorithm design. Most commonly, so-called surrogates (also known as metamodels or approximation models) are utilized during optimization to learn from previous evaluations and exploit this knowledge in future iterations. pysamoo is an extension of pymoo - a comprehensive toolbox for multi-objective optimization - focusing on solving optimization problems with computationally expensive objective or constraint functions.

Please find the GitHub repository here: https://github.com/anyoptimization/pysamoo

Please find an overview of this software documentation below:

Overview

• Installation: How to install the current release of pysamoo.

• Algorithms: An overview of algorithms and their underlying concepts.

• Usage: Instructions and code snippets to execute algorithms.

• Contact: Information to contact the framework's lead developer.

| Algorithm | Framework | multi-objective | constrained | Description |
|---|---|---|---|---|
| SSA-NSGA-II | None | yes | yes | A simple surrogate-assisted variant of the well-known NSGA-II algorithm. |
| PSAF-GA | PSAF | no | no | An implementation of surrogate-assisted genetic algorithms. |
| PSAF-DE | PSAF | no | no | An implementation of surrogate-assisted differential evolution. |
| PSAF-CMAES | PSAF | no | no | The popular CMAES method with surrogate assistance. |
| GPSAF-GA | GPSAF | no | yes | An implementation of surrogate-assisted genetic algorithms with constraint handling. |
| GPSAF-DE | GPSAF | no | yes | An implementation of surrogate-assisted differential evolution with constraint handling. |
| GPSAF-CMAES | GPSAF | no | no | The popular CMAES method with surrogate assistance. |
| GPSAF-ISRES | GPSAF | no | yes | A surrogate-assisted variant of ISRES, a well-known algorithm for constrained single-objective optimization. |
| GPSAF-NSGA-II | GPSAF | yes | yes | A surrogate-assisted variant of NSGA-II, a well-known algorithm for bi-objective optimization. |
| GPSAF-NSGA-III | GPSAF | yes | yes | An extension of GPSAF-NSGA-II for many-objective optimization problems. |

GNU Affero General Public License (AGPL): The GNU Affero General Public License is a modified version of the ordinary GNU GPL version 3. It has one added requirement: if you run a modified program on a server and let other users communicate with it there, your server must also allow them to download the source code corresponding to the modified version running there.


# Citation

Note: If you use this framework, we kindly ask you to cite the following paper:

Julian Blank and Kalyanmoy Deb. (2022). pysamoo: Surrogate-Assisted Multi-Objective Optimization in Python. arXiv:2204.05855.

@misc{pysamoo,
    title={pysamoo: Surrogate-Assisted Multi-Objective Optimization in Python},
    author={Julian Blank and Kalyanmoy Deb},
    year={2022},
    eprint={2204.05855},
    archivePrefix={arXiv},
    primaryClass={cs.NE}
}


# Installation

The most recent stable release of pysamoo can be installed by:

pip install -U pysamoo


Feel free to experiment with the unstable development release by:

pip install -i https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple pysamoo


# Algorithms

This part of the software documentation says a few more words about the algorithms implemented in the framework and their underlying concepts.

Commonly, surrogates – approximation or interpolation models – are utilized during optimization to improve the convergence behavior. First, one shall distinguish between two different types of evaluations: expensive solution evaluations (ESEs), which require running the computationally expensive evaluation, and approximate solution evaluations (ASEs), which are computationally inexpensive approximations provided by the surrogate. Whereas the overall optimization run is limited by $$\texttt{ESE}^{\max}$$ function evaluations, calls to ASEs are only considered algorithmic overhead.
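The distinction can be made concrete with a small bookkeeping sketch (illustrative only, not the pysamoo implementation): ESEs consume a hard budget, while ASEs are merely counted as overhead.

```python
# Hypothetical sketch: tracking expensive (ESE) vs. approximate (ASE)
# solution evaluations. All names are illustrative.
class EvaluationBudget:
    def __init__(self, max_ese):
        self.max_ese = max_ese  # hard limit on expensive evaluations
        self.n_ese = 0          # expensive calls consumed so far
        self.n_ase = 0          # surrogate calls (overhead only, unlimited)

    def ese(self, f, x):
        # an expensive solution evaluation counts against the budget
        if self.n_ese >= self.max_ese:
            raise RuntimeError("expensive evaluation budget exhausted")
        self.n_ese += 1
        return f(x)

    def ase(self, surrogate, x):
        # an approximate solution evaluation is free with respect to the budget
        self.n_ase += 1
        return surrogate(x)
```

A surrogate-assisted algorithm may thus issue many ASEs per iteration while spending only a handful of ESEs on carefully selected infill solutions.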

## Simple Surrogate Assisted (SSA)

In order to improve the convergence of NSGA-II, the surrogates provide ASEs and let the algorithm look several iterations into the future without any ESEs. The surrogate models are used to create a set of infill solutions as follows: First, NSGA-II is run for $$k$$ more iterations (starting from the best solutions found so far), returning the solution set $$X^{\texttt{(cand)}}$$. The number of solutions in $$X^{\texttt{(cand)}}$$ corresponds to the population size of the algorithm. After eliminating duplicates in $$X^{\texttt{(cand)}}$$, the number of solutions $$N$$ to be evaluated using ESEs needs to be selected. The selection first creates $$N$$ clusters (in the objective space, based on $$X^{\texttt{(cand)}}$$) using the k-means algorithm and then applies a roulette wheel selection based on the predicted crowding distances. Note that this introduces a bias towards boundary points, as they are assigned a crowding distance of infinity. Altogether, this results in $$N$$ solutions to be evaluated using ESEs in this optimization cycle.
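The clustering-and-selection step can be sketched as follows. This is a minimal illustration (not the pysamoo implementation) assuming a plain k-means and a simple capping of infinite crowding distances; all names are illustrative.

```python
import numpy as np

def select_infills(F_cand, cd, n_select, n_iter=10, seed=1):
    """Cluster candidates in objective space with k-means, then pick one
    solution per cluster by roulette-wheel selection on the predicted
    crowding distance (illustrative sketch)."""
    rng = np.random.default_rng(seed)

    # plain Lloyd's k-means with n_select clusters
    centers = F_cand[rng.choice(len(F_cand), n_select, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(F_cand[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_select):
            if np.any(labels == k):
                centers[k] = F_cand[labels == k].mean(axis=0)

    # cap infinite crowding distances so boundary points keep a strong bias
    w = cd.astype(float).copy()
    finite = w[np.isfinite(w)]
    w[~np.isfinite(w)] = (finite.max() * 10) if finite.size else 1.0
    w += 1e-12

    chosen = []
    for k in range(n_select):
        idx = np.flatnonzero(labels == k)
        if idx.size == 0:  # empty cluster: fall back to all candidates
            idx = np.arange(len(F_cand))
        chosen.append(int(rng.choice(idx, p=w[idx] / w[idx].sum())))
    return np.array(chosen)
```

Because boundary points carry (capped) infinite crowding distance, they win their cluster's roulette wheel with high probability, which mirrors the bias noted above.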

## PSAF

In contrast to most existing surrogate-assisted algorithms, PSAF uses not only the final solution(s) obtained by optimizing the surrogate but the whole search pattern. By making use of the search pattern, the exploration-exploitation balance is found by taking the surrogate’s accuracy into account. To allow even more flexible exploitation of the surrogate, we propose two phases. First, derive a solution set that is influenced by the surrogate, and second, introduce surrogate bias by optimizing the surrogate for a number of iterations. Both procedures are important to incorporate surrogates into existing methods effectively.

One of the major challenges when proposing a generalized optimization framework is the number and strictness of the assumptions being made. On the one hand, too many assumptions restrict applicability; on the other hand, too few assumptions limit the reuse of existing algorithmic building blocks. In this study, we target any type of population-based algorithm with two phases per iteration: the process of generating new solutions to be evaluated (infill) and a method processing the evaluated infill solutions (advance). So, how can existing optimization methods be decomposed into infill and advance phases? Genetic algorithms (GAs) generate new solutions using evolutionary recombination-mutation operators and then process them using an environmental survival selection operator; PSO methods create new solutions based on a particle's current velocity, personal best, and global best, and process the solutions using a replacement strategy; CMAES samples new solutions from a normal distribution, which is then updated in each iteration. Since well-known state-of-the-art algorithms either follow this design pattern or can readily be implemented in it, this seems to be a reasonable assumption for a generic framework. Moreover, it is worth noting that some researchers and practitioners also refer to this pattern as the ask-and-tell interface.
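The infill/advance (ask-and-tell) pattern can be sketched with a toy optimizer; the class and function names below are illustrative, not the pysamoo API.

```python
import random

class RandomSearch:
    """A toy population-based optimizer exposing the two assumed phases."""
    def __init__(self, pop_size, seed=1):
        self.rng = random.Random(seed)
        self.pop_size = pop_size
        self.pop = []  # list of (x, f) pairs, kept sorted by f

    def infill(self):
        # "ask": propose new candidate solutions (here: uniform in [-5, 5])
        return [self.rng.uniform(-5, 5) for _ in range(self.pop_size)]

    def advance(self, X, F):
        # "tell": survival selection on the evaluated infill solutions
        self.pop = sorted(self.pop + list(zip(X, F)),
                          key=lambda s: s[1])[:self.pop_size]

def run_loop(f, algorithm, n_iter):
    # the generic loop a surrogate-assisted framework can hook into:
    # it may intercept infill() to inject surrogate bias before evaluation
    for _ in range(n_iter):
        X = algorithm.infill()
        algorithm.advance(X, [f(x) for x in X])
    return algorithm.pop[0]
```

Any algorithm expressible in this shape can, in principle, be wrapped by a surrogate-assisted framework that manipulates the solutions flowing between infill and advance.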

### $$\alpha$$-Phase

A well-known concept in evolutionary computation to introduce a bias toward more promising solutions is tournament selection. An individual from the population has to win a tournament to contribute to the mating process. The number of competitors ($$\alpha$$) balances how greedy the selection procedure is: a larger value of $$\alpha$$ allows only elitist solutions to participate in mating, while a smaller value introduces less selection pressure. For genetic algorithms, the most frequently used tournament mode is the binary tournament ($$\alpha=2$$), which compares a pair of solutions regarding one or multiple metrics. A standard binary tournament implementation for constrained single-objective optimization declares the less infeasible solution the winner if one or both solutions are infeasible, and otherwise the solution with the smaller function value.
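The comparison rule just described can be written down in a few lines (a minimal sketch with illustrative names):

```python
def binary_tournament(a, b):
    """Each solution is a (f, cv) pair: objective value and constraint
    violation, where cv == 0 means feasible. Returns the winner."""
    f_a, cv_a = a
    f_b, cv_b = b
    if cv_a > 0 or cv_b > 0:
        # one or both infeasible -> the less infeasible solution wins
        return a if cv_a < cv_b else b
    # both feasible -> the smaller function value wins
    return a if f_a < f_b else b
```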

In the context of surrogate assistance, the tournament selection introduces surrogate bias during the generation of new infill solutions. Whereas in genetic algorithms, evaluated solutions (using ESE) compete with each other during mating selection, in PSAF solutions evaluated on the surrogate (ASE) are compared.

### $$\beta$$-Phase

While the tournament is an effective concept to incorporate the surrogate's approximations, it is limited to looking only a single iteration into the future. To further increase the surrogate's impact, the baseline algorithm continues to run for $$\beta$$ more consecutive iterations on the surrogate's approximations. Inevitably, the question arises of how many iterations are suitable, which indicates the importance of tuning $$\beta$$. Even more critical, however, is the question of how the algorithm should profit from being simulated on the surrogate: an inappropriate choice of $$\beta$$ will cause the surrogate's optimum to be found repeatedly and will entirely discard the baseline algorithm's default infill procedure. This also causes a loss of diversity among infill solutions and does not account for the surrogate's approximation error. Thus, we propose a probabilistic surrogate-assisted approach that balances the surrogate's impact on the baseline algorithm to address these issues.

An example with five iterations ($$\beta = 5$$) and four infill solutions $$X_1$$, $$X_2$$, $$X_3$$, and $$X_4$$ is illustrated in the figure below. Calling the infill function of the baseline algorithm results in five solution sets with four solutions each. When running the algorithm, each simulated solution is assigned to its closest infill solution; for instance, four simulated solutions are closest to $$X_1$$, and six are closest to $$X_4$$. This nearest-neighbor assignment shows cluster-like arrangements and preserves diversity.
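The assignment step can be sketched as follows (an illustrative fragment, not the pysamoo implementation): every solution produced during the $$\beta$$ iterations on the surrogate is grouped with the infill solution it is closest to.

```python
import numpy as np

def assign_to_infills(X_infill, X_beta):
    """For each infill solution, return the indices of the beta-phase
    solutions closest to it (nearest-neighbor assignment in design space)."""
    # pairwise distances: (n_beta, n_infill)
    d = np.linalg.norm(X_beta[:, None] - X_infill[None], axis=2)
    nearest = d.argmin(axis=1)
    return [np.flatnonzero(nearest == i) for i in range(len(X_infill))]
```

One solution is then taken from each group, so the final infill set stays spread out instead of collapsing onto the surrogate's optimum.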

Publication:

Julian Blank and Kalyanmoy Deb. 2021. PSAF: a probabilistic surrogate-assisted framework for single-objective optimization. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO '21). Association for Computing Machinery, New York, NY, USA, 652–659.

BibTex:

@inproceedings{10.1145/3449639.3459297,
    author = {Blank, Julian and Deb, Kalyanmoy},
    title = {PSAF: A Probabilistic Surrogate-Assisted Framework for Single-Objective Optimization},
    year = {2021},
    isbn = {9781450383509},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    url = {https://doi.org/10.1145/3449639.3459297},
    doi = {10.1145/3449639.3459297},
    booktitle = {Proceedings of the Genetic and Evolutionary Computation Conference},
    pages = {652--659},
    numpages = {8},
    keywords = {simulation optimization, metamodel-based optimization, surrogate-assisted optimization, genetic algorithms, evolutionary computing},
    location = {Lille, France},
    series = {GECCO '21}
}


## GPSAF

Next, PSAF is extended to handle multiple objectives and constraints. GPSAF follows the same two-phase concept as PSAF; however, the $$\alpha$$-phase and the $$\beta$$-phase now have to consider multiple criteria when comparing solutions.

• Extension of PSAF to constrained and multi-objective optimization

• What needs to be modified?

• Multiple Surrogates: One for each constraint, one for each objective

• Solution Comparisons: Instead of comparing only the objective, the constraint satisfaction and Pareto Dominance now need to be considered.

• Exploration vs. Exploitation: The balancing variable $$\rho$$ needs to be redefined.

For single-objective optimization, the $$\alpha$$-phase has already been described. There, the comparison of two solutions is based only on a single objective value. For the more generic version with constraints and objectives, the winner of each solution pool is determined as follows: if all solutions are infeasible, select the least infeasible solution; otherwise, select a non-dominated solution (breaking ties randomly). For both the constraint and objective values, only ASEs are used. Otherwise, the $$\alpha$$-phase remains the same, including its responsibilities and mechanics.
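The generalized winner rule for a pool of solutions can be sketched as follows (identifiers are illustrative, not the pysamoo API):

```python
import numpy as np

def pool_winner(F, CV, seed=1):
    """F: (n, m) objective values; CV: (n,) constraint violations,
    where CV <= 0 means feasible. Returns the index of the winner."""
    rng = np.random.default_rng(seed)
    feasible = CV <= 0
    if not feasible.any():
        # all infeasible: the least infeasible solution wins
        return int(CV.argmin())
    idx = np.flatnonzero(feasible)
    Ff = F[idx]
    # keep feasible solutions not Pareto-dominated by another feasible one
    nd = [int(i) for k, i in enumerate(idx)
          if not np.any(np.all(Ff <= Ff[k], axis=1) &
                        np.any(Ff < Ff[k], axis=1))]
    # break ties between non-dominated solutions randomly
    return int(rng.choice(nd))
```

In GPSAF this comparison operates purely on surrogate predictions (ASEs) for both objectives and constraints.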

Analogously to PSAF, GPSAF further increases the surrogate's impact by looking $$\beta$$ iterations into the future through calling infill and advance of the baseline algorithm repeatedly. To obtain the $$\beta$$-solutions for constrained multi-objective problems, we use a so-called Probabilistic Knockout Tournament (PKT) to select solutions from each cluster with the goal of self-adaptively exploiting surrogates. The goal is to use surrogates more when they provide accurate predictions, but more carefully when they provide only rough estimations. Necessary for generalization, PKT also applies to problems with multiple objectives and constraints, often with varying complexities and surrogate errors to be considered.

Publication:

Julian Blank and Kalyanmoy Deb. 2022. GPSAF: A Generalized Probabilistic Surrogate-Assisted Framework for Constrained Single- and Multi-objective Optimization. COINLab Report 202204.

BibTex:

@misc{gpsaf,
    doi = {10.48550/ARXIV.2204.04054},
    url = {https://arxiv.org/abs/2204.04054},
    author = {Blank, Julian and Deb, Kalyanmoy},
    keywords = {Optimization and Control (math.OC), Machine Learning (cs.LG), Mathematical Software (cs.MS), FOS: Mathematics, FOS: Computer and information sciences, G.1.6; G.1.2; I.6.3, 68U07},
    title = {GPSAF: A Generalized Probabilistic Surrogate-Assisted Framework for Constrained Single- and Multi-objective Optimization},
    publisher = {arXiv},
    year = {2022},
}


# Usage

In general, pysamoo uses the main functionalities of pymoo for defining the optimization problem. However, it provides a new set of algorithms designed for computationally expensive functions.

## SSA-NSGA-II

For bi-objective optimization problems, a variant of NSGA-II called Simple Surrogate Assisted NSGA-II (SSA-NSGA-II) could be a good starting point.

[1]:

from pymoo.optimize import minimize
from pymoo.problems.multi.zdt import ZDT1
from pymoo.visualization.scatter import Scatter
from pysamoo.algorithms.ssansga2 import SSANSGA2

problem = ZDT1(n_var=10)

algorithm = SSANSGA2(n_initial_doe=50,
                     n_infills=10,
                     surr_pop_size=100,
                     surr_n_gen=50)

res = minimize(
    problem,
    algorithm,
    ('n_evals', 200),
    seed=1,
    verbose=True)

plot = Scatter()
plot.add(res.F)
plot.show()

============================================================
n_gen |  n_eval |     igd      |      gd      |      hv
============================================================
1 |      50 |  1.526442823 |  2.253783461 |  0.00000E+00
2 |      60 |  0.046396914 |  0.300196586 |  0.595018331
3 |      70 |  0.033947341 |  0.008993312 |  0.610204749
4 |      80 |  0.024488745 |  0.010265171 |  0.627719544
5 |      90 |  0.023731220 |  0.013908376 |  0.630506522
6 |     100 |  0.020993013 |  0.012455222 |  0.635631372
7 |     110 |  0.019087553 |  0.012077546 |  0.638694173
8 |     120 |  0.018454795 |  0.009261683 |  0.639605933
9 |     130 |  0.017399517 |  0.009701286 |  0.641156067
10 |     140 |  0.016730949 |  0.012260403 |  0.641573319
11 |     150 |  0.015736094 |  0.012211167 |  0.643410891
12 |     160 |  0.014104293 |  0.012119646 |  0.645610005
13 |     170 |  0.012296655 |  0.010243625 |  0.648737520
14 |     180 |  0.011535042 |  0.009461263 |  0.650443282
15 |     190 |  0.010349809 |  0.008950225 |  0.651784655
16 |     200 |  0.009829221 |  0.009597486 |  0.652961699

[1]:

<pymoo.visualization.scatter.Scatter at 0x7fb86fa9ed60>


## PSAF-GA

[2]:

from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.optimize import minimize
from pymoo.problems.single import Ackley
from pysamoo.algorithms.psaf import PSAF

problem = Ackley(n_var=10)

algorithm = GA(pop_size=20, n_offsprings=10)

algorithm = PSAF(algorithm, n_initial_doe=30, alpha=10, beta=30, max_rho=0.7, n_max_infills=10, n_max_doe=500)

res = minimize(
    problem,
    algorithm,
    ('n_evals', 300),
    seed=2,
    verbose=True)

print("Best solution found: \nX = %s\nF = %s\nCV=%s" % (res.X, res.F, res.CV))

=========================================================================================================
n_gen |  n_eval |     fopt     |   fopt_gap   |     favg     |     bias     |     mae      |    model
=========================================================================================================
2 |      30 |  2.00067E+01 |  2.00067E+01 |  2.08792E+01 |            - |  0.295669478 | RBF[kernel=linear,tail=linear,normalized=True]
3 |      40 |  1.25358E+01 |  1.25358E+01 |  1.95328E+01 |  0.700000000 |  0.581973574 | RBF[kernel=linear,tail=linear,normalized=True]
4 |      50 |  1.25358E+01 |  1.25358E+01 |  1.70467E+01 |  0.700000000 |  0.831762378 | RBF[kernel=linear,tail=constant,normalized=True]
5 |      60 |  1.24920E+01 |  1.24920E+01 |  1.48254E+01 |  0.700000000 |  0.980943312 | RBF[kernel=linear,tail=constant,normalized=True]
6 |      70 |  1.24920E+01 |  1.24920E+01 |  1.41739E+01 |  0.700000000 |  0.809506177 |     krg-cont
7 |      80 |  7.524701433 |  7.524701433 |  1.17633E+01 |  0.700000000 |  1.061379318 |     krg-cont
8 |      90 |  6.231815096 |  6.231815096 |  8.987145219 |  0.700000000 |  0.739570348 |     krg-cont
9 |     100 |  4.436184698 |  4.436184698 |  6.836947753 |  0.700000000 |  0.688046885 |      krg-lin
10 |     110 |  4.033330927 |  4.033330927 |  5.477620620 |  0.725546230 |  0.620491761 |      krg-lin
11 |     120 |  3.454903145 |  3.454903145 |  4.480489282 |  0.742306740 |  0.659343091 |      krg-lin
12 |     130 |  2.815285833 |  2.815285833 |  3.889942377 |  0.735535953 |  0.614470498 |      krg-lin
13 |     140 |  2.485750595 |  2.485750595 |  3.363607960 |  0.700000000 |  0.627636124 |      krg-lin
14 |     150 |  2.485750595 |  2.485750595 |  3.197909089 |  0.700000000 |  0.790408089 |      krg-lin
15 |     160 |  2.476723582 |  2.476723582 |  2.982454112 |  0.700000000 |  0.867001279 |      krg-lin
16 |     170 |  2.434431373 |  2.434431373 |  2.776993874 |  0.700000000 |  0.845437156 |      krg-lin
17 |     180 |  2.126776496 |  2.126776496 |  2.714992663 |  0.700000000 |  0.729268722 |      krg-lin
18 |     190 |  2.078240035 |  2.078240035 |  2.500405462 |  0.700000000 |  1.806320892 | RBF[kernel=cubic,tail=linear+quadratic,normalized=False]
19 |     200 |  2.078240035 |  2.078240035 |  2.500405462 |  0.700000000 |  1.321117683 |     krg-cont
20 |     210 |  1.653515436 |  1.653515436 |  2.303239380 |  0.700000000 |  1.264441877 |     krg-cont
21 |     220 |  1.442352553 |  1.442352553 |  2.087226240 |  0.700000000 |  1.294728627 |     krg-cont
22 |     230 |  0.824849738 |  0.824849738 |  1.688611006 |  0.700000000 |  1.240271042 |     krg-cont
23 |     240 |  0.708080126 |  0.708080126 |  1.471188774 |  0.700000000 |  1.233600703 |     krg-cont
24 |     250 |  0.707589803 |  0.707589803 |  1.226136628 |  0.700000000 |  0.493833637 | RBF[kernel=gaussian,tail=constant,normalized=True]
25 |     260 |  0.707589803 |  0.707589803 |  1.201866513 |  0.700000000 |  1.284121721 |     krg-cont
26 |     270 |  0.707589803 |  0.707589803 |  1.147485885 |  0.700000000 |  1.407075473 |     krg-cont
27 |     280 |  0.520014186 |  0.520014186 |  0.923809691 |  0.700000000 |  1.446130744 |     krg-cont
28 |     290 |  0.518572748 |  0.518572748 |  0.761520652 |  0.700000000 |  1.460985476 |     krg-cont
29 |     300 |  0.434843502 |  0.434843502 |  0.611597245 |  0.700000000 |  1.385878913 |     krg-cont
Best solution found:
X = [-0.12688657 -0.00297504  0.04256559 -0.00187105  0.04402346  0.00120705
-0.01615695  0.06042537 -0.10188824 -0.06181256]
F = [0.4348435]
CV=[0.]


## PSAF-DE

[3]:

from pymoo.algorithms.soo.nonconvex.de import DE
from pymoo.optimize import minimize
from pymoo.problems.single import Ackley
from pysamoo.algorithms.psaf import PSAF

problem = Ackley(n_var=10)

algorithm = DE(pop_size=20, n_offsprings=10)

algorithm = PSAF(algorithm, n_initial_doe=30, alpha=10, beta=30, max_rho=0.7, n_max_infills=10, n_max_doe=500)

res = minimize(
    problem,
    algorithm,
    ('n_evals', 300),
    seed=2,
    verbose=True)

print("Best solution found: \nX = %s\nF = %s\nCV=%s" % (res.X, res.F, res.CV))

=========================================================================================================
n_gen |  n_eval |     fopt     |   fopt_gap   |     favg     |     bias     |     mae      |    model
=========================================================================================================
2 |      30 |  2.00067E+01 |  2.00067E+01 |  2.08792E+01 |            - |  0.295669478 | RBF[kernel=linear,tail=linear,normalized=True]
3 |      40 |  1.44799E+01 |  1.44799E+01 |  1.93934E+01 |  0.700000000 |  0.799409015 | RBF[kernel=linear,tail=linear,normalized=True]
4 |      50 |  1.44799E+01 |  1.44799E+01 |  1.91167E+01 |  0.700000000 |  0.928172336 | RBF[kernel=linear,tail=constant,normalized=True]
5 |      60 |  1.31103E+01 |  1.31103E+01 |  1.89794E+01 |  0.700000000 |  1.040848676 | RBF[kernel=linear,tail=linear,normalized=True]
6 |      70 |  1.09561E+01 |  1.09561E+01 |  1.86909E+01 |  0.700000000 |  1.154333194 |     krg-cont
7 |      80 |  9.420417041 |  9.420417041 |  1.83074E+01 |  0.700000000 |  1.180671386 |     krg-cont
8 |      90 |  9.420417041 |  9.420417041 |  1.81428E+01 |  0.700000000 |  0.647627100 |     krg-cont
9 |     100 |  8.365940100 |  8.365940100 |  1.80568E+01 |  0.700000000 |  0.554954396 |     krg-cont
10 |     110 |  8.024922715 |  8.024922715 |  1.78922E+01 |  0.741153019 |  0.527368981 |     krg-cont
11 |     120 |  7.396652392 |  7.396652392 |  1.72090E+01 |  0.825083790 |  0.518345385 |      krg-lin
12 |     130 |  4.868026774 |  4.868026774 |  1.70488E+01 |  0.816386754 |  0.517685002 |      krg-lin
13 |     140 |  4.868026774 |  4.868026774 |  1.67959E+01 |  0.799027499 |  0.519939569 |      krg-lin
14 |     150 |  4.868026774 |  4.868026774 |  1.67142E+01 |  0.744865385 |  0.518127439 |      krg-lin
15 |     160 |  4.868026774 |  4.868026774 |  1.66530E+01 |  0.741296851 |  0.547838549 |      krg-lin
16 |     170 |  4.868026774 |  4.868026774 |  1.66045E+01 |  0.720500231 |  0.540898946 |     krg-cont
17 |     180 |  4.868026774 |  4.868026774 |  1.62857E+01 |  0.728552239 |  0.576995856 |     krg-cont
18 |     190 |  4.868026774 |  4.868026774 |  1.56730E+01 |  0.740742001 |  0.616144531 |     krg-cont
19 |     200 |  4.801632381 |  4.801632381 |  1.55472E+01 |  0.765039254 |  0.760504208 |     krg-cont
20 |     210 |  4.801632381 |  4.801632381 |  1.55067E+01 |  0.700000000 |  0.810652396 |     krg-cont
21 |     220 |  4.801632381 |  4.801632381 |  1.53688E+01 |  0.700000000 |  0.813004751 |      krg-lin
22 |     230 |  4.712859472 |  4.712859472 |  1.51936E+01 |  0.700000000 |  0.760527480 |      krg-lin
23 |     240 |  4.061073431 |  4.061073431 |  1.51177E+01 |  0.700000000 |  0.681421676 |      krg-lin
24 |     250 |  4.061073431 |  4.061073431 |  1.51046E+01 |  0.700000000 |  0.531587588 |      krg-lin
25 |     260 |  3.619382893 |  3.619382893 |  1.49276E+01 |  0.704480041 |  0.388432617 |      krg-lin
26 |     270 |  3.619382893 |  3.619382893 |  1.48554E+01 |  0.827923624 |  0.441110858 |      krg-lin
27 |     280 |  3.535249723 |  3.535249723 |  1.48261E+01 |  0.700000000 |  0.494988780 |      krg-lin
28 |     290 |  3.414219581 |  3.414219581 |  1.47411E+01 |  0.700000000 |  0.510668380 |      krg-lin
29 |     300 |  2.717046992 |  2.717046992 |  1.46942E+01 |  0.700000000 |  0.516918929 |      krg-lin
Best solution found:
X = [-0.07750256 -0.63698302 -0.00699189 -0.13305234 -0.71384496  0.12294314
0.16717794 -0.35720554  0.24511258 -0.14820306]
F = [2.71704699]
CV=[0.]


## PSAF-CMAES

[4]:

from pymoo.algorithms.soo.nonconvex.cmaes import SimpleCMAES
from pymoo.optimize import minimize
from pymoo.problems.single import Ackley
from pysamoo.algorithms.psaf import PSAF

problem = Ackley(n_var=10)

algorithm = SimpleCMAES(pop_size=20)

algorithm = PSAF(algorithm, n_initial_doe=30, alpha=10, beta=30, max_rho=0.7, n_max_infills=10, n_max_doe=500)

res = minimize(
    problem,
    algorithm,
    ('n_evals', 300),
    seed=2,
    verbose=True)

print("Best solution found: \nX = %s\nF = %s\nCV=%s" % (res.X, res.F, res.CV))

=========================================================================================================
n_gen |  n_eval |     fopt     |   fopt_gap   |     favg     |     bias     |     mae      |    model
=========================================================================================================
2 |      30 |  2.00067E+01 |  2.00067E+01 |  2.08792E+01 |            - |  0.295669478 | RBF[kernel=linear,tail=linear,normalized=True]
3 |      40 |  1.18365E+01 |  1.18365E+01 |  1.62963E+01 |  0.700000000 |  0.929692620 | RBF[kernel=linear,tail=linear,normalized=True]
4 |      50 |  1.18365E+01 |  1.18365E+01 |  1.57489E+01 |  0.700000000 |  1.182626727 | RBF[kernel=linear,tail=constant,normalized=True]
5 |      60 |  1.18365E+01 |  1.18365E+01 |  1.62275E+01 |  0.700000000 |  1.397991439 | RBF[kernel=linear,tail=linear,normalized=True]
6 |      70 |  1.09457E+01 |  1.09457E+01 |  1.54529E+01 |  0.700000000 |  1.648505909 |     krg-cont
7 |      80 |  6.533311680 |  6.533311680 |  1.07105E+01 |  0.700000000 |  1.825972978 |     krg-cont
8 |      90 |  6.533311680 |  6.533311680 |  1.31478E+01 |  0.700000000 |  1.253685736 |     krg-cont
9 |     100 |  5.536903012 |  5.536903012 |  9.122003134 |  0.700000000 |  0.907588724 |     krg-cont
10 |     110 |  5.536903012 |  5.536903012 |  1.04466E+01 |  0.850476128 |  0.886723286 |     krg-cont
11 |     120 |  3.606023869 |  3.606023869 |  7.930313180 |  0.879655769 |  0.887938147 |     krg-cont
12 |     130 |  3.606023869 |  3.606023869 |  6.775835323 |  0.883996295 |  0.988008018 |     krg-cont
13 |     140 |  3.606023869 |  3.606023869 |  7.405047260 |  0.814311314 |  1.109251424 |     krg-cont
14 |     150 |  3.554418048 |  3.554418048 |  6.533350830 |  0.732990295 |  1.102437071 |     krg-cont
15 |     160 |  3.554418048 |  3.554418048 |  6.835437951 |  0.711012725 |  1.152080594 |      krg-lin
16 |     170 |  3.310282127 |  3.310282127 |  5.174924208 |  0.700000000 |  1.065966219 |      krg-lin
17 |     180 |  2.969732370 |  2.969732370 |  4.697683678 |  0.700000000 |  0.821900196 |      krg-lin
18 |     190 |  2.969732370 |  2.969732370 |  5.369859845 |  0.700000000 |  0.811711587 |      krg-lin
19 |     200 |  2.969732370 |  2.969732370 |  5.051947329 |  0.700000000 |  0.930113008 |      krg-lin
20 |     210 |  2.969732370 |  2.969732370 |  4.527261684 |  0.700000000 |  0.918888950 |     krg-cont
21 |     220 |  2.969732370 |  2.969732370 |  3.876461054 |  0.700000000 |  1.015796645 |     krg-cont
22 |     230 |  2.969732370 |  2.969732370 |  4.094044762 |  0.700000000 |  1.074563507 |     krg-cont
23 |     240 |  2.969732370 |  2.969732370 |  4.221977758 |  0.700000000 |  0.991782291 |     krg-cont
24 |     250 |  2.969732370 |  2.969732370 |  4.075217974 |  0.700000000 |  0.925522867 |     krg-cont
25 |     260 |  2.969732370 |  2.969732370 |  3.731054056 |  0.700000000 |  0.895323496 |      krg-lin
26 |     270 |  2.469264070 |  2.469264070 |  3.499506787 |  0.700000000 |  0.876986526 |      krg-lin
27 |     280 |  1.248383248 |  1.248383248 |  3.334046888 |  0.700000000 |  0.723206719 | RBF[kernel=mq,tail=constant,normalized=True]
28 |     290 |  1.248383248 |  1.248383248 |  8.795659864 |  0.700000000 |  6.06439E+03 | RBF[kernel=gaussian,tail=quadratic,normalized=True]
29 |     300 |  1.248383248 |  1.248383248 |  1.04755E+01 |  0.700000000 |  1.622633898 |     krg-cont
Best solution found:
X = [ 0.073688    0.12611394  0.21745451  0.05631034 -0.28150535  0.07995362
-0.12594537  0.04513843 -0.06595957  0.05103607]
F = [1.24838325]
CV=[0.]


## GPSAF-GA

[5]:

from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.optimize import minimize
from pymoo.problems.single import Ackley
from pysamoo.algorithms.gpsaf import GPSAF

problem = Ackley(n_var=10)

algorithm = GA(pop_size=20, n_offsprings=10)

algorithm = GPSAF(algorithm, n_initial_doe=30, alpha=10, beta=30, n_max_infills=10, n_max_doe=500)

res = minimize(
    problem,
    algorithm,
    ('n_evals', 300),
    seed=2,
    verbose=True)

print("Best solution found: \nX = %s\nF = %s\nCV=%s" % (res.X, res.F, res.CV))

=========================================================================================================
n_gen |  n_eval |     fopt     |   fopt_gap   |     favg     | n_influenced |     mae      |    model
=========================================================================================================
2 |      30 |  2.03679E+01 |  2.03679E+01 |  2.11218E+01 |            - |  0.239553735 | RBF[kernel=gaussian,tail=quadratic,normalized=True]
3 |      40 |  1.54071E+01 |  1.54071E+01 |  2.00529E+01 |         2/10 |  0.375380271 | RBF[kernel=gaussian,tail=quadratic,normalized=True]
4 |      50 |  1.54071E+01 |  1.54071E+01 |  1.85399E+01 |         2/10 |  0.566694558 | RBF[kernel=gaussian,tail=quadratic,normalized=True]
5 |      60 |  1.50642E+01 |  1.50642E+01 |  1.65616E+01 |         8/10 |  0.788562645 | RBF[kernel=linear,tail=constant,normalized=True]
6 |      70 |  1.49632E+01 |  1.49632E+01 |  1.55563E+01 |         7/10 |  0.875499159 | RBF[kernel=linear,tail=linear,normalized=True]
7 |      80 |  1.40722E+01 |  1.40722E+01 |  1.52481E+01 |         2/10 |  1.032806639 | RBF[kernel=linear,tail=constant,normalized=True]
8 |      90 |  1.40722E+01 |  1.40722E+01 |  1.50859E+01 |         4/10 |  0.850240497 | kriging-lin-ARD
9 |     100 |  1.11271E+01 |  1.11271E+01 |  1.41018E+01 |         5/10 |  0.852619885 | kriging-lin-ARD
10 |     110 |  9.862148820 |  9.862148820 |  1.26316E+01 |         5/10 |  0.861746557 | kriging-lin-ARD
11 |     120 |  9.089850699 |  9.089850699 |  1.07926E+01 |         4/10 |  0.653958805 | kriging-lin-ARD
12 |     130 |  8.365462379 |  8.365462379 |  9.641716361 |         4/10 |  0.763669813 | kriging-quadr
13 |     140 |  7.213496632 |  7.213496632 |  9.169209856 |         2/10 |  0.747123340 | kriging-const-ARD
14 |     150 |  5.518798604 |  5.518798604 |  8.311198673 |         2/10 |  0.696688095 | kriging-const-ARD
15 |     160 |  5.166341432 |  5.166341432 |  7.250085359 |         4/10 |  0.745274001 | kriging-const-ARD
16 |     170 |  4.715293949 |  4.715293949 |  6.190406085 |         3/10 |  0.851376307 | kriging-const-ARD
17 |     180 |  4.506389436 |  4.506389436 |  5.441595504 |         3/10 |  0.880704806 | kriging-const-ARD
18 |     190 |  3.602899195 |  3.602899195 |  5.080318965 |         5/10 |  0.832961294 | kriging-const-ARD
19 |     200 |  3.449609475 |  3.449609475 |  4.829779206 |         8/10 |  0.582696071 | kriging-quadr
20 |     210 |  2.896292947 |  2.896292947 |  4.326263670 |         3/10 |  0.954391856 | kriging-const-ARD
21 |     220 |  2.875140994 |  2.875140994 |  3.842885253 |         5/10 |  0.895850569 | kriging-const-ARD
22 |     230 |  2.875140994 |  2.875140994 |  3.630956323 |         7/10 |  0.980938095 | kriging-const-ARD
23 |     240 |  2.875140994 |  2.875140994 |  3.408055362 |         8/10 |  0.628303629 | kriging-const
24 |     250 |  1.997437256 |  1.997437256 |  3.178345845 |         5/10 |  0.596562419 | kriging-const
25 |     260 |  1.997437256 |  1.997437256 |  3.019568595 |         6/10 |  0.355156925 | kriging-quadr
26 |     270 |  1.997437256 |  1.997437256 |  2.914276289 |         9/10 |  0.521409800 | kriging-quadr
27 |     280 |  1.891430539 |  1.891430539 |  2.617365610 |         6/10 |  0.946184555 | kriging-const
28 |     290 |  1.891430539 |  1.891430539 |  2.480695613 |         7/10 |  1.101435095 | kriging-const
29 |     300 |  1.686540303 |  1.686540303 |  2.243963562 |         5/10 |  1.263357719 | kriging-const
Best solution found:
X = [-0.01017884  0.11274214  0.15408955 -0.04909597 -0.2538259  -0.07108236
0.82884658 -0.00939967  0.0244004  -0.05127961]
F = [1.6865403]
CV=[0.]


## GPSAF-DE

[6]:

from pymoo.algorithms.soo.nonconvex.de import DE
from pymoo.optimize import minimize
from pymoo.problems.single import Ackley
from pysamoo.algorithms.gpsaf import GPSAF

problem = Ackley(n_var=10)

algorithm = DE(pop_size=20, n_offsprings=10)

algorithm = GPSAF(algorithm, n_initial_doe=30, alpha=10, beta=30, n_max_infills=10, n_max_doe=500)

res = minimize(problem,
               algorithm,
               ('n_evals', 300),
               seed=2,
               verbose=True)

print("Best solution found: \nX = %s\nF = %s\nCV=%s" % (res.X, res.F, res.CV))

=========================================================================================================
n_gen |  n_eval |     fopt     |   fopt_gap   |     favg     | n_influenced |     mae      |    model
=========================================================================================================
2 |      30 |  2.03679E+01 |  2.03679E+01 |  2.11218E+01 |            - |  0.239553735 | RBF[kernel=gaussian,tail=quadratic,normalized=True]
3 |      40 |  1.78125E+01 |  1.78125E+01 |  2.07458E+01 |         6/10 |  0.509548028 | RBF[kernel=gaussian,tail=quadratic,normalized=False]
4 |      50 |  1.68242E+01 |  1.68242E+01 |  2.05594E+01 |         4/10 |  0.685218247 | RBF[kernel=gaussian,tail=quadratic,normalized=False]
5 |      60 |  1.41108E+01 |  1.41108E+01 |  2.01638E+01 |         7/10 |  0.809356002 | RBF[kernel=linear,tail=linear,normalized=True]
6 |      70 |  1.35749E+01 |  1.35749E+01 |  2.01384E+01 |         4/10 |  0.983685938 | RBF[kernel=gaussian,tail=quadratic,normalized=False]
7 |      80 |  1.08758E+01 |  1.08758E+01 |  1.97375E+01 |         8/10 |  1.288386170 | RBF[kernel=gaussian,tail=quadratic,normalized=False]
8 |      90 |  1.08758E+01 |  1.08758E+01 |  1.96707E+01 |         6/10 |  0.631725721 |  kriging-lin
9 |     100 |  1.08758E+01 |  1.08758E+01 |  1.93563E+01 |         8/10 |  0.672191691 |  kriging-lin
10 |     110 |  1.08758E+01 |  1.08758E+01 |  1.91950E+01 |         8/10 |  0.627515741 |  kriging-lin
11 |     120 |  9.820629978 |  9.820629978 |  1.90045E+01 |         8/10 |  0.600822520 | kriging-const
12 |     130 |  7.571886452 |  7.571886452 |  1.85637E+01 |         8/10 |  0.525814310 | kriging-const-ARD
13 |     140 |  7.256897722 |  7.256897722 |  1.83942E+01 |         7/10 |  0.572336042 |  kriging-lin
14 |     150 |  7.256897722 |  7.256897722 |  1.80074E+01 |         6/10 |  0.519187690 |  kriging-lin
15 |     160 |  7.256897722 |  7.256897722 |  1.77815E+01 |         6/10 |  0.507572174 |  kriging-lin
16 |     170 |  5.121337727 |  5.121337727 |  1.76856E+01 |         6/10 |  0.540304131 |  kriging-lin
17 |     180 |  5.121337727 |  5.121337727 |  1.76234E+01 |         6/10 |  0.408079937 | kriging-quadr
18 |     190 |  5.119648172 |  5.119648172 |  1.76234E+01 |         7/10 |  0.652695011 |  kriging-lin
19 |     200 |  5.119648172 |  5.119648172 |  1.75096E+01 |         9/10 |  0.596740525 | kriging-lin-ARD
20 |     210 |  5.119648172 |  5.119648172 |  1.74706E+01 |         8/10 |  0.764295217 | kriging-lin-ARD
21 |     220 |  5.119648172 |  5.119648172 |  1.73236E+01 |         9/10 |  0.847162066 | kriging-lin-ARD
22 |     230 |  5.119648172 |  5.119648172 |  1.72700E+01 |         7/10 |  0.894036590 | kriging-lin-ARD
23 |     240 |  5.119648172 |  5.119648172 |  1.72254E+01 |         7/10 |  0.549283389 | kriging-quadr-ARD
24 |     250 |  5.119648172 |  5.119648172 |  1.71836E+01 |         8/10 |  1.057122741 | kriging-lin-ARD
25 |     260 |  5.119648172 |  5.119648172 |  1.71545E+01 |         6/10 |  0.980904945 | kriging-lin-ARD
BIASED: TOO CLOSE (SKIP)
26 |     270 |  5.119648172 |  5.119648172 |  1.70505E+01 |         8/10 |  0.941280543 | kriging-lin-ARD
27 |     280 |  4.709865701 |  4.709865701 |  1.69467E+01 |         6/10 |  0.882364065 | kriging-lin-ARD
28 |     290 |  4.315938952 |  4.315938952 |  1.68187E+01 |         9/10 |  0.832540588 | kriging-lin-ARD
29 |     300 |  4.158064584 |  4.158064584 |  1.67729E+01 |         9/10 |  0.608495736 | kriging-quadr-ARD
Best solution found:
X = [-0.01065628  0.0958028  -0.51572614 -0.19052951  0.15756506 -0.92264625
1.68625079 -0.10290413 -1.11300629  0.49102388]
F = [4.15806458]
CV=[0.]


## GPSAF-ISRES

[7]:

from pymoo.algorithms.soo.nonconvex.isres import ISRES
from pymoo.factory import get_problem
from pymoo.optimize import minimize
from pysamoo.algorithms.gpsaf import GPSAF

problem = get_problem("g1")

algorithm = ISRES()

algorithm = GPSAF(algorithm,
                  alpha=3,
                  beta=30,
                  n_max_infills=1)

res = minimize(problem,
               algorithm,
               ('n_evals', 50),
               seed=1,
               verbose=True)

print("Best solution found: \nX = %s\nF = %s\nCV=%s" % (res.X, res.F, res.CV))

=========================================================================================================
n_gen |  n_eval |   cv (min)   |   cv (avg)   |     fopt     |   fopt_gap   |     favg     | n_influenced
=========================================================================================================
2 |      27 |  1.90500E+01 |  6.24216E+01 |            - |            - |            - |            -
3 |      28 |  0.00000E+00 |  6.01923E+01 | -3.94366E+00 |  1.10563E+01 | -3.94366E+00 |          1/1
4 |      29 |  0.00000E+00 |  5.81167E+01 | -6.28168E+00 |  8.718321045 | -5.11267E+00 |          1/1
5 |      30 |  0.00000E+00 |  5.50714E+01 | -6.28168E+00 |  8.718321045 | -5.37742E+00 |          1/1
6 |      31 |  0.00000E+00 |  5.13344E+01 | -6.55996E+00 |  8.440040515 | -5.67306E+00 |          1/1
7 |      32 |  0.00000E+00 |  4.93399E+01 | -7.88559E+00 |  7.114409437 | -6.11556E+00 |          1/1
8 |      33 |  0.00000E+00 |  4.74805E+01 | -1.28810E+01 |  2.118956391 | -7.24314E+00 |          1/1
9 |      34 |  0.00000E+00 |  4.43963E+01 | -1.30277E+01 |  1.972319276 | -8.06951E+00 |          1/1
10 |      35 |  0.00000E+00 |  4.15136E+01 | -1.41539E+01 |  0.846056592 | -8.83006E+00 |          1/1
11 |      36 |  0.00000E+00 |  3.87791E+01 | -1.46778E+01 |  0.322163692 | -9.47981E+00 |          1/1
12 |      37 |  0.00000E+00 |  3.72407E+01 | -1.48420E+01 |  0.157984600 | -1.00160E+01 |          1/1
13 |      38 |  0.00000E+00 |  3.44313E+01 | -1.49503E+01 |  0.049743053 | -1.04646E+01 |          1/1
14 |      39 |  0.00000E+00 |  3.14390E+01 | -1.49771E+01 |  0.022919358 | -1.08406E+01 |          1/1
15 |      40 |  0.00000E+00 |  3.07705E+01 | -1.49847E+01 |  0.015345104 | -1.11594E+01 |          1/1
16 |      41 |  0.00000E+00 |  2.79306E+01 | -1.49956E+01 |  0.004419555 | -1.14334E+01 |          1/1
17 |      42 |  0.00000E+00 |  2.45578E+01 | -1.49980E+01 |  0.002028637 | -1.16711E+01 |          1/1
18 |      43 |  0.00000E+00 |  2.20519E+01 | -1.49992E+01 |  0.000781034 | -1.18791E+01 |          1/1
19 |      44 |  0.00000E+00 |  1.99841E+01 | -1.49997E+01 |  0.000285144 | -1.20626E+01 |          1/1
20 |      45 |  0.00000E+00 |  1.80961E+01 | -1.49999E+01 |  0.000122219 | -1.22258E+01 |          1/1
21 |      46 |  0.00000E+00 |  1.66403E+01 | -1.49999E+01 |  0.000050135 | -1.23718E+01 |          1/1
22 |      47 |  0.00000E+00 |  1.66403E+01 | -1.50000E+01 |  0.000023230 | -1.29537E+01 |          1/1
23 |      48 |  0.00000E+00 |  1.41835E+01 | -1.50000E+01 |  0.000012194 | -1.30560E+01 |          1/1
24 |      49 |  0.00000E+00 |  1.20123E+01 | -1.50000E+01 |  5.68877E-06 | -1.31486E+01 |          1/1
25 |      50 |  0.00000E+00 |  1.07047E+01 | -1.50000E+01 |  3.06274E-06 | -1.32328E+01 |          1/1
Best solution found:
X = [0.99999997 0.99999984 0.99999999 0.99999999 0.99999997 0.99999996
0.9999999  0.99999999 0.99999977 2.99999944 2.99999964 2.99999968
0.99999965]
F = [-14.99999694]
CV=[0.]


## GPSAF-NSGA-II

[8]:

import numpy as np

from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize
from pymoo.problems.multi import ZDT1
from pymoo.visualization.scatter import Scatter
from pysamoo.algorithms.gpsaf import GPSAF

problem = ZDT1(n_var=10)

algorithm = NSGA2(pop_size=20, n_offsprings=10)

algorithm = GPSAF(algorithm,
                  alpha=10,
                  beta=50,
                  n_max_doe=100,
                  n_max_infills=np.inf)

res = minimize(problem,
               algorithm,
               ('n_evals', 250),
               seed=1,
               verbose=True)

plot = Scatter()
plot.add(res.F)
plot.show()

=========================================================================================================
n_gen |  n_eval |     igd      |      gd      |      hv      | n_influenced |    mae f1    |    mae f2
=========================================================================================================
2 |      21 |  1.894282953 |  3.009591283 |  0.00000E+00 |            - |  3.29840E-16 |  0.164848978
3 |      31 |  0.412635166 |  2.581434100 |  0.204208568 |         2/10 |  2.43659E-16 |  0.149911658
4 |      41 |  0.397959207 |  0.232942755 |  0.204208568 |         1/10 |  2.71623E-16 |  0.110983079
5 |      51 |  0.377478684 |  0.096617472 |  0.319462956 |         3/10 |  2.58404E-16 |  0.080359966
6 |      61 |  0.153897514 |  0.100163899 |  0.415505238 |         3/10 |  2.60278E-16 |  0.071000802
7 |      71 |  0.145908916 |  0.105243022 |  0.433244159 |         4/10 |  2.87754E-16 |  0.086973328
8 |      81 |  0.080839003 |  0.068071184 |  0.552468302 |         6/10 |  3.04136E-16 |  0.088093049
9 |      91 |  0.069593310 |  0.062332489 |  0.568189250 |         5/10 |  6.52689E-16 |  0.088370268
10 |     101 |  0.057377323 |  0.057706437 |  0.578512036 |         8/10 |  8.92850E-16 |  0.094255146
11 |     111 |  0.047998378 |  0.054915196 |  0.584751052 |         6/10 |  8.81770E-16 |  0.074656043
12 |     121 |  0.043389217 |  0.037992011 |  0.591019879 |         7/10 |  1.00372E-15 |  0.071401482
13 |     131 |  0.041928923 |  0.053218105 |  0.597121025 |        10/10 |  9.74394E-16 |  0.060917258
14 |     141 |  0.041584411 |  0.052012254 |  0.598180936 |         7/10 |  6.58766E-16 |  0.053754620
15 |     151 |  0.039302001 |  0.032352983 |  0.601876373 |         4/10 |  4.61768E-16 |  0.055956648
16 |     161 |  0.037368793 |  0.029728139 |  0.605027282 |         5/10 |  4.15984E-16 |  0.052962000
17 |     171 |  0.036161438 |  0.028003904 |  0.608097471 |         5/10 |  4.03878E-16 |  0.040439293
18 |     181 |  0.037280447 |  0.028121721 |  0.606421241 |         6/10 |  3.78968E-16 |  0.040805845
19 |     191 |  0.035306919 |  0.024778587 |  0.608897397 |         8/10 |  6.32584E-16 |  0.037263158
20 |     201 |  0.033917625 |  0.024719113 |  0.610046729 |         6/10 |  7.75075E-16 |  0.027292017
21 |     211 |  0.031875493 |  0.023064972 |  0.614829761 |         7/10 |  9.84907E-16 |  0.028629215
22 |     221 |  0.030696483 |  0.021607031 |  0.616614582 |         5/10 |  1.03511E-15 |  0.027269083
23 |     231 |  0.030268446 |  0.020849562 |  0.617555775 |         4/10 |  1.19047E-15 |  0.021726719
24 |     241 |  0.029651483 |  0.020283724 |  0.618179841 |         7/10 |  9.35363E-16 |  0.017941495
25 |     251 |  0.028997101 |  0.019134586 |  0.619453784 |         8/10 |  9.57914E-16 |  0.019836130

[8]:

<pymoo.visualization.scatter.Scatter at 0x7fb86e34ebb0>


## GPSAF-NSGA-III

[9]:

import numpy as np

from pymoo.algorithms.moo.nsga3 import NSGA3
from pymoo.factory import get_problem, get_reference_directions
from pymoo.optimize import minimize
from pymoo.problems.multi import TNK
from pymoo.visualization.scatter import Scatter
from pysamoo.algorithms.gpsaf import GPSAF

problem = TNK()

ref_dirs = get_reference_directions("das-dennis", 2, n_points=20)

# create the algorithm object
algorithm = NSGA3(pop_size=20,
                  n_offsprings=10,
                  ref_dirs=ref_dirs)

algorithm = GPSAF(algorithm,
                  alpha=10,
                  beta=50,
                  n_max_doe=100)

res = minimize(problem,
               algorithm,
               ('n_evals', 200),
               seed=1,
               verbose=True)

plot = Scatter()
plot.add(res.F)
plot.show()

=========================================================================================================
n_gen |  n_eval |   cv (min)   |   cv (avg)   |     igd      |      gd      |      hv      | n_influenced
=========================================================================================================
2 |       5 |  0.285851865 |  3.667450918 |            - |            - |            - |            -
3 |      15 |  0.00000E+00 |  1.257129104 |  0.225786192 |  0.072446443 |  0.147801532 |         5/10
4 |      25 |  0.00000E+00 |  0.025984099 |  0.149069781 |  0.040586016 |  0.148884568 |         8/10
5 |      35 |  0.00000E+00 |  0.00000E+00 |  0.114349341 |  0.059066311 |  0.118412630 |         6/10
6 |      45 |  0.00000E+00 |  0.00000E+00 |  0.112884000 |  0.071252148 |  0.145186502 |         7/10
7 |      55 |  0.00000E+00 |  0.00000E+00 |  0.116813884 |  0.071035440 |  0.149460183 |         6/10
8 |      65 |  0.00000E+00 |  0.00000E+00 |  0.110036245 |  0.066122903 |  0.154135607 |         4/10
9 |      75 |  0.00000E+00 |  0.00000E+00 |  0.079695460 |  0.060042387 |  0.205378897 |         6/10
10 |      85 |  0.00000E+00 |  0.00000E+00 |  0.084115182 |  0.036462115 |  0.197525401 |         6/10
11 |      95 |  0.00000E+00 |  0.00000E+00 |  0.075153945 |  0.057117098 |  0.193915686 |         6/10
12 |     105 |  0.00000E+00 |  0.00000E+00 |  0.073263644 |  0.051969922 |  0.204705373 |         5/10
13 |     115 |  0.00000E+00 |  0.00000E+00 |  0.077497745 |  0.051136372 |  0.190771392 |         8/10
14 |     125 |  0.00000E+00 |  0.00000E+00 |  0.054818633 |  0.024302768 |  0.213750172 |         6/10
15 |     135 |  0.00000E+00 |  0.00000E+00 |  0.057263700 |  0.040403343 |  0.202395732 |         5/10
16 |     145 |  0.00000E+00 |  0.00000E+00 |  0.072791697 |  0.036278619 |  0.191543860 |         5/10
17 |     155 |  0.00000E+00 |  0.00000E+00 |  0.053593653 |  0.033452347 |  0.232890076 |         5/10
18 |     165 |  0.00000E+00 |  0.00000E+00 |  0.064325401 |  0.033864774 |  0.220397000 |         5/10
19 |     175 |  0.00000E+00 |  0.00000E+00 |  0.053553995 |  0.036168487 |  0.224564328 |         7/10
20 |     185 |  0.00000E+00 |  0.00000E+00 |  0.050583172 |  0.034369023 |  0.240656107 |         6/10
21 |     195 |  0.00000E+00 |  0.00000E+00 |  0.049968396 |  0.036422578 |  0.241398332 |         6/10
22 |     205 |  0.00000E+00 |  0.00000E+00 |  0.049716035 |  0.036072472 |  0.243901757 |         3/10

[9]:

<pymoo.visualization.scatter.Scatter at 0x7fb86fa54df0>


# Tools

## Constrained Sampling for Design of Experiments (DOE)

Blank, J., Deb, K. (2021). Constrained Bi-objective Surrogate-Assisted Optimization of Problems with Heterogeneous Evaluation Times: Expensive Objectives and Inexpensive Constraints. In: Ishibuchi, H., et al. (eds.) Evolutionary Multi-Criterion Optimization. EMO 2021. Lecture Notes in Computer Science, vol 12654. Springer, Cham.

@InProceedings{10.1007/978-3-030-72062-9_21,
  author    = "Blank, Julian and Deb, Kalyanmoy",
  editor    = "Ishibuchi, Hisao and Zhang, Qingfu and Cheng, Ran and Li, Ke and Li, Hui and Wang, Handing and Zhou, Aimin",
  title     = "Constrained Bi-objective Surrogate-Assisted Optimization of Problems with Heterogeneous Evaluation Times: Expensive Objectives and Inexpensive Constraints",
  booktitle = "Evolutionary Multi-Criterion Optimization",
  year      = "2021",
  publisher = "Springer International Publishing",
  pages     = "257--269",
  isbn      = "978-3-030-72062-9"
}


Let us say our goal is to find 50 feasible designs for the SRN problem. We assume that the constraints are computationally inexpensive, in contrast to the objectives, which require a time-consuming evaluation.

[4]:

from pymoo.problems.multi import SRN
problem = SRN()

n_points = 50


The function returning the constraint violation (CV) of a given design can then be implemented as follows:

[13]:

import numpy as np

def calc_cv(X):
    G = problem.evaluate(X, return_values_of=["G"])
    return np.maximum(G, 0.0).sum(axis=1)
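
To illustrate the aggregation, here is a self-contained sketch (with hypothetical constraint values, no pymoo required) showing how the positive entries of `G` are summed into a single CV value per design:

```python
import numpy as np

# hypothetical constraint values for three designs and two constraints;
# non-positive entries are satisfied, positive entries measure the violation
G = np.array([[-1.0, -2.0],   # feasible: CV = 0
              [ 0.5, -1.0],   # one violated constraint: CV = 0.5
              [ 2.0,  3.0]])  # both violated: CV = 5.0

cv = np.maximum(G, 0.0).sum(axis=1)
print(cv)  # [0.  0.5 5. ]
```

A design is feasible exactly when its CV equals zero, which is the criterion used by the plotting and sampling code below.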


Next, let us define a plot function that shows the design space, with infeasible solutions in red and feasible ones in blue:

[14]:

import matplotlib.pyplot as plt

def plot(X):

    xl, xu = problem.bounds()

    def circle(x=0, y=0, r=1):
        theta = np.linspace(0, 2 * np.pi, 100)
        return x + r * np.cos(theta), y + r * np.sin(theta)

    fig, ax = plt.subplots(figsize=(6, 6))

    feas = calc_cv(X) <= 0

    ax.scatter(X[feas, 0], X[feas, 1], s=30, facecolors='none', edgecolors='blue')
    ax.scatter(X[~feas, 0], X[~feas, 1], s=30, facecolors='none', edgecolors='red')

    x, y = circle(r=15)
    ax.plot(x, y, color="black", alpha=0.6)

    x = np.linspace(-20, 20)
    y = 1 / 3 * x + 10 / 3
    ax.plot(x, y, color="black", alpha=0.6)

    ax.set_aspect(1)

    ax.set_xlim(xl[0], xu[0])
    ax.set_ylim(xl[1], xu[1])

    ax.set_xlabel("$x_1$")
    ax.set_ylabel("$x_2$")

    plt.show()


If we perform Latin hypercube sampling (LHS), the result might look as follows:

[20]:

from pymoo.operators.sampling.lhs import LHS

X = LHS().do(problem, n_points).get("X")
plot(X)


Clearly, the majority of designs are infeasible. A simple improvement is to repeat the sampling and reject infeasible designs until the required number of feasible points has been found.

[21]:

from pysamoo.sampling.rejection import RejectionConstrainedSampling

X = RejectionConstrainedSampling(calc_cv).do(problem, n_points).get("X")
plot(X)
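
Conceptually, the rejection approach keeps drawing new batches of samples and discards the infeasible ones until enough feasible designs have been collected. A minimal stand-alone sketch of this idea, using a toy circular feasible region rather than the actual pysamoo implementation:

```python
import numpy as np

def rejection_sample(cv_func, n_points, xl, xu, seed=1):
    """Draw uniform batches and keep only designs with zero constraint violation."""
    rng = np.random.default_rng(seed)
    feasible = []
    while len(feasible) < n_points:
        X = rng.uniform(xl, xu, size=(n_points, len(xl)))
        feasible.extend(X[cv_func(X) <= 0])
    return np.array(feasible[:n_points])

# toy feasible region: a disc of radius 15 centered at the origin
cv = lambda X: np.maximum((X ** 2).sum(axis=1) - 15 ** 2, 0.0)

X = rejection_sample(cv, 50, xl=[-20, -20], xu=[20, 20])
print(X.shape)  # (50, 2)
```

Note that rejection sampling only works well when the feasible region covers a reasonable fraction of the search space; for tightly constrained problems, many batches may be wasted.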


By using a more sophisticated energy-based approach, the uniformity of the feasible points can be further improved:

[24]:

from pysamoo.sampling.energy import EnergyConstrainedSampling

X = EnergyConstrainedSampling(calc_cv).do(problem, n_points).get("X")
plot(X)


# Contact

Julian Blank (blankjul [at] msu.edu)