Multi-objective Optimization: Basics

THIS ARTICLE IS STILL UNDER CONSTRUCTION

This article covers the absolute basics of optimization. Whenever I think about optimization, I like to imagine a landscape in which our goal is to find one or more regions of interest.

Generally, an optimization problem is expressed mathematically in the following way:

\begin{align} \begin{split} \min \quad & f(x) \\[4pt] \text{s.t.} \quad & g_{j}(x) \leq 0 \quad j = 1, \ldots, J \\[2pt] & h_{k}(x) = 0 \quad k = 1, \ldots, K \\[4pt] & x_{i}^{L} \leq x_{i} \leq x_{i}^{U} \quad i = 1, \ldots, N \end{split} \end{align}

In general, multi-objective optimization involves several objective functions to be optimized subject to inequality and equality constraints. The goal is to find a set of solutions that do not violate any constraints and are as good as possible with regard to all objective values. The problem definition in its general form is given by:

\begin{align} \begin{split} \min \quad & f_{m}(x) \quad m = 1, \ldots, M \\[4pt] \text{s.t.} \quad & g_{j}(x) \leq 0 \quad j = 1, \ldots, J \\[2pt] & h_{k}(x) = 0 \quad k = 1, \ldots, K \\[4pt] & x_{i}^{L} \leq x_{i} \leq x_{i}^{U} \quad i = 1, \ldots, N \end{split} \end{align}

The formulation above defines a multi-objective optimization problem with \(N\) variables, \(M\) objectives, \(J\) inequality and \(K\) equality constraints. Moreover, for each variable \(x_i\) lower and upper variable boundaries (\(x_i^L\) and \(x_i^U\)) are defined.

Example Optimization Problem

In the following, we investigate a bi-objective optimization problem with two constraints as an example. The problem was selected to be complex enough for demonstration purposes, yet simple enough not to lose track of the overall idea. Its definition is given by:

\begin{align} \begin{split} \min \;\; & f_1(x) = x_1^2 + x_2^2 \\ \max \;\; & f_2(x) = -(x_1 - 1)^2 - x_2^2 \\[1mm] \text{s.t.} \;\; & g_1(x) = 2 \, (x_1 - 0.1) \, (x_1 - 0.9) \leq 0 \\ & g_2(x) = 20 \, (x_1 - 0.4) \, (x_1 - 0.6) \geq 0 \\[1mm] & -2 \leq x_1 \leq 2 \\ & -2 \leq x_2 \leq 2 \end{split} \end{align}

It consists of two objectives (\(M=2\)), where \(f_1(x)\) is minimized and \(f_2(x)\) is maximized. The optimization is subject to two inequality constraints (\(J=2\)), where \(g_1(x)\) is formulated as a less-than and \(g_2(x)\) as a greater-than constraint. The problem is defined with respect to two variables (\(N=2\)), \(x_1\) and \(x_2\), both of which are in the range \([-2,2]\). The problem does not contain any equality constraints (\(K=0\)).
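To make the formulation concrete, the raw objectives and constraints can be evaluated at a sample point in plain Python (the point \((0.5, 0.5)\) is chosen arbitrarily for illustration):

```python
# Evaluate the raw (unnormalized) objectives and constraints at a sample point
x1, x2 = 0.5, 0.5                     # arbitrary sample point

f1 = x1**2 + x2**2                    # 0.5   (to be minimized)
f2 = -(x1 - 1)**2 - x2**2             # -0.5  (to be maximized)

g1 = 2 * (x1 - 0.1) * (x1 - 0.9)      # -0.32 -> satisfies g1 <= 0
g2 = 20 * (x1 - 0.4) * (x1 - 0.6)     # -0.2  -> violates g2 >= 0
```

This particular point is infeasible: \(g_1\) is satisfied, but \(g_2\) is negative where a non-negative value is required.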

[Figure: Contour plot of the example problem, showing the objective contours, the two constraint parabolas, and the Pareto-optimal set.]

The figure above shows the contours of the problem. The contour lines of the objective function \(f_1(x)\) are represented by solid lines and those of \(f_2(x)\) by dashed lines. The constraints \(g_1(x)\) and \(g_2(x)\) are parabolas which intersect the \(x_1\)-axis at \(0.1, 0.9\) and \(0.4, 0.6\), respectively. The Pareto-optimal set is illustrated by a thick orange line. Through the combination of both constraints, the Pareto-set is split into two parts. Analytically, the Pareto-optimal set is given by \(PS = \{(x_1, x_2) \,|\, (0.1 \leq x_1 \leq 0.4 \,\lor\, 0.6 \leq x_1 \leq 0.9) \,\land\, x_2 = 0\}\). Since \(x_2 = 0\) and \(x_1 \geq 0\) on the Pareto-optimal set, we have \(f_1 = x_1^2\) and hence, with \(f_2\) converted to minimization, the Pareto-front is given by \(f_2 = (\sqrt{f_1} - 1)^2\), where \(f_1\) is defined in \([0.01, 0.16]\) and \([0.36, 0.81]\).
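For readers who want to reproduce the figure, a minimal matplotlib sketch could look as follows (this is a reconstruction, not the original plotting code; the styling choices are assumptions):

```python
import numpy as np
import matplotlib.pyplot as plt

# Evaluate both objectives on a grid over the design space [-2, 2] x [-2, 2]
X1, X2 = np.meshgrid(np.linspace(-2, 2, 400), np.linspace(-2, 2, 400))
F1 = X1**2 + X2**2              # f_1(x)
F2 = -(X1 - 1)**2 - X2**2       # f_2(x)

# Solid contours for f_1, dashed contours for f_2
plt.contour(X1, X2, F1, levels=10, colors="black", linestyles="solid")
plt.contour(X1, X2, F2, levels=10, colors="black", linestyles="dashed")

# Pareto-optimal set: two segments on the x_1-axis
plt.plot([0.1, 0.4], [0, 0], color="orange", linewidth=4)
plt.plot([0.6, 0.9], [0, 0], color="orange", linewidth=4)

plt.xlabel("$x_1$")
plt.ylabel("$x_2$")
plt.show()
```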

Problem Definition

In pymoo, all modules consider pure minimization problems. However, without loss of generality, an objective that is supposed to be maximized can be multiplied by \(-1\) and then minimized. Therefore, we minimize \(-f_2(x)\) instead of maximizing \(f_2(x)\) in our optimization problem. Furthermore, all constraint functions need to be formulated as \(\leq 0\) constraints. The feasibility of a solution can then be expressed by:

\[\begin{split} \begin{cases} \text{feasible,} \quad \; \text{if} \; \sum_{j=1}^{J} \langle g_j(x)\rangle = 0 \\ \text{infeasible,} \quad \text{otherwise} \\ \end{cases}\end{split}\]
\[\begin{split}\text{where} \quad \langle g_j(x)\rangle = \begin{cases} 0, \quad \quad \; \text{if} \; g_j(x) \leq 0 \\ g_j(x), \quad \text{otherwise} \\ \end{cases}\end{split}\]
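As a small illustration, the constraint violation defined above can be computed in a few lines of NumPy (a sketch of the concept, not pymoo's internal implementation):

```python
import numpy as np

def constraint_violation(g):
    # <g_j(x)>: entries <= 0 are satisfied and contribute zero,
    # positive entries are violations and are summed up
    return np.maximum(g, 0).sum()

g = np.array([-0.5, 0.2])           # g_1 satisfied, g_2 violated by 0.2
print(constraint_violation(g))      # 0.2 -> solution is infeasible
```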

For this reason, \(g_2(x)\) needs to be multiplied by \(-1\) in order to flip the \(\geq\) into a \(\leq\) relation. We recommend normalizing the constraints to give each of them equal importance. For \(g_1(x)\), the normalization coefficient is \(2 \cdot (-0.1) \cdot (-0.9) = 0.18\) and for \(g_2(x)\) it is \(20 \cdot (-0.4) \cdot (-0.6) = 4.8\). We normalize the constraints by dividing \(g_1(x)\) and \(g_2(x)\) by their corresponding coefficients.
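As a quick sanity check, these normalization constants are simply the constraint functions evaluated at \(x_1 = 0\):

```python
# Constraint functions evaluated at x_1 = 0 yield the normalization constants
print(2 * (0 - 0.1) * (0 - 0.9))     # 0.18 (up to floating-point rounding)
print(20 * (0 - 0.4) * (0 - 0.6))    # 4.8
```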

Finally, the optimization problem to be optimized using pymoo is defined by:

\begin{align} \label{eq:getting_started_pymoo} \begin{split} \min \;\; & f_1(x) = x_1^2 + x_2^2 \\ \min \;\; & f_2(x) = (x_1 - 1)^2 + x_2^2 \\[1mm] \text{s.t.} \;\; & g_1(x) = 2 \, (x_1 - 0.1) \, (x_1 - 0.9) \, / \, 0.18 \leq 0 \\ & g_2(x) = -20 \, (x_1 - 0.4) \, (x_1 - 0.6) \, / \, 4.8 \leq 0 \\[1mm] & -2 \leq x_1 \leq 2 \\ & -2 \leq x_2 \leq 2 \end{split} \end{align}

Next, the derived problem formulation is implemented in Python. Each optimization problem in pymoo has to inherit from the Problem class. First, by calling the super() function, problem properties such as the number of variables (n_var), objectives (n_obj), and constraints (n_constr) are initialized. Furthermore, the lower (xl) and upper (xu) variable boundaries are supplied as NumPy arrays. Additionally, the evaluation function _evaluate needs to be overridden from the superclass. The method takes a two-dimensional NumPy array x as input, where each row represents an individual and each column an optimization variable. After performing the necessary calculations, the objective values have to be added to the dictionary out with the key F and the constraint values with the key G.
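A minimal sketch of such a problem class is given below. It follows the description above; note that the exact import path and argument names depend on the pymoo version (older releases use pymoo.model.problem and n_constr, newer ones pymoo.core.problem and n_ieq_constr):

```python
import numpy as np
from pymoo.model.problem import Problem   # pymoo.core.problem in newer versions


class MyProblem(Problem):

    def __init__(self):
        # 2 variables, 2 objectives, 2 inequality constraints,
        # and the bounds [-2, 2] for both x_1 and x_2
        super().__init__(n_var=2,
                         n_obj=2,
                         n_constr=2,
                         xl=np.array([-2.0, -2.0]),
                         xu=np.array([2.0, 2.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        # x is a 2D array: each row is an individual, each column a variable
        f1 = x[:, 0] ** 2 + x[:, 1] ** 2
        f2 = (x[:, 0] - 1) ** 2 + x[:, 1] ** 2

        # Normalized <= 0 constraints as derived above
        g1 = 2 * (x[:, 0] - 0.1) * (x[:, 0] - 0.9) / 0.18
        g2 = -20 * (x[:, 0] - 0.4) * (x[:, 0] - 0.6) / 4.8

        out["F"] = np.column_stack([f1, f2])
        out["G"] = np.column_stack([g1, g2])
```

An instance of MyProblem can then be handed to one of pymoo's algorithms (for example, NSGA-II through the minimize function) to run the actual optimization; that step is beyond the scope of this basics article.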

Julian Blank

Julian Blank is a PhD student in the Department of Computer Science and Engineering at Michigan State University. He received his B.Sc. in Business Information Systems from Otto von Guericke University, Germany, in 2010. He was a visiting scholar at Michigan State University, Michigan, USA, for six months in 2015 and finished his M.Sc. in Computer Science at Otto von Guericke University, Germany, in 2016. He is the main developer of pymoo, an open-source multi-objective optimization framework in Python. His research interests include evolutionary computation, multi-objective optimization, surrogate-assisted optimization, and machine learning.
