• Hierarchical Genetic Algorithm as a Generic Optimization for Multi-Objective Evolutionary Algorithms

    alternatively

Hierarchical Genetic Strategy in Multi-Objective Optimization

genetic algorithms, evolutionary algorithms, multi-objective optimization, multi-criterion decision making, search strategies, Pareto optimality

    Ewa Gajda-Zagórska, Konrad Gądek, Radosław Łazarz, Michał Idzik, Robert Schaefer


    Abstract

We propose treating the Hierarchical Genetic Algorithm [kolodziej2001hierarchical, HGSGeneticSearchReinforced, HGSRealNumber, jojczyk2012global, HGSSimpletons] as a meta-optimizing tool for Multi-objective Evolutionary Algorithms (MOEAs). We argue that using HGS helps find better Pareto set approximations and/or find them more quickly. We compare some standard multi-objective algorithms [zitzler2004tutorial] on well-known problems [zitzler2000comparison] and then compare the results to those obtained when using HGS.



For references: references.bib on Dropbox

First review (Schaefer)
    1. [ ] Motivation
    2. [ ] Intro, state of the art
    3. [ ] HGS and history
    4. [ ] Test plan
       • [ ] test descriptions
       • [ ] metrics description (what do they measure?)
       • [ ] competitors description
    5. [ ] Results & analysis
    6. [ ] Conclusions

  • Introduction and basics

    Intro
    [X] --

    Basics
    [X] Define multi-objective problems
    [X] Define Pareto set/Pareto front
    [X] Mention that many comparisons are available

    other
    [ ] Verify that no paper on HGS as metaoptimizer exists; so far none found
    [X] EMOA vs MOEA
    [ ] Shall we share Python sources?

    First review (Schaefer)
    [ ] objective #1: compare HGS w/ multimodal select against other EAs
    [ ] objective #2: analysis of used mechanisms
    [ ] motivation: why we think HGS+* could be better, when, why it is (if it is the case); what are the benefits

  • Algorithms and problems

    Algorithms
    [X] SPEA2
    [ ] NSGA-II
    [X] IBEA
    [ ] HGS

    Problems
    [ ] Ackley
    [ ] COEMOA A-E
    [ ] Kursawe
    [ ] LOTZ ?
    [ ] KP-750-m ?

    First review (Schaefer)
    [ ] what are test groups, what do they show / which attributes do they expose?
    [ ] metrics! @Radek
    [ ] finalize benchmarks: run 10x, average
    [ ] comparison: stop criterion by budget & by accuracy

  • Results and analysis

    Results

    Analysis

  • Summary

  • Introduction

  • Evolutionary Algorithms (EAs) are an invaluable tool providing solutions for complex problems for which no better methods are known or for which even approximate solutions are too complex to use. EAs are not only general, they are also relatively easy to use and implement: in practice, the only requirement is providing the fitness function that indicates how fit a given solution is.

  • Some problem domains consist of multiple, often conflicting, objectives, each to be optimized. One popular method is to implement a function that steers the EA in some particular direction by combining the results for all objectives into one value, as sketched below. Unfortunately, the results of such search strategies give little information about the problem and reveal none of the other available, possibly more promising, solutions with different balances between objectives. This is why multi-objective evolutionary algorithms (MOEAs) are used. MOEAs work by maintaining and returning a set of promising solutions, each with a different balance between objectives. The result of such a search may give us more insight into the problem. Moreover, given the results of a MOEA, it is trivial to find (nearly) optimal solutions for any desirable balance between objectives.
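
    One common such combining function is a weighted sum of the objectives. A minimal sketch (the weights and objective values below are arbitrary, purely illustrative assumptions):

    ```python
    # Weighted-sum scalarization: collapses an objective vector into a single
    # value to be minimized, fixing one particular balance between objectives.
    def scalarize(objectives, weights=(0.5, 0.5)):
        return sum(w * y for w, y in zip(weights, objectives))

    # Two very different trade-offs become the same plain number; every other
    # balance between the objectives is invisible to the search.
    print(scalarize((1.0, 4.0)))  # 2.5
    print(scalarize((4.0, 1.0)))  # 2.5
    ```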

  • Given such requirements, MOEAs are somewhat more complicated. Besides trying to optimize a set of solutions, they need to maintain the diversity of that set so that the results are meaningful and accurate [zitzler2004indicator]. There is even more to consider, e.g. what happens if some results yield exactly the same objective values as others while not being equal [preuss2006pareto, rudolph2007capabilities], how to specify the trade-off between minimizing the distance to the Pareto-optimal set and maximizing the diversity of the set [zitzler2004indicator], or how to deal with the NP-hard problem of calculating the hypervolume indicator [bradstreet2011hypervolume].

  • Multiple algorithms exist which try to solve the same problems in different ways. They tend to exhibit different performance characteristics, so multiple comparisons and analyses are available [zitzler2000comparison, zitzler1999multiobjective, laumanns2002running].

  • Our research revisits popular algorithms and verifies their performance. We implemented them in a way that allows the Hierarchical Genetic Algorithm to work cooperatively with them, in order to analyze whether this yields differences in performance. In this paper, we summarize our work and give brief conclusions.

  • Basics

  • Let us consider the parameter space\[ \vec{x} = (x_1, x_2, \ldots, x_m) \in \mathbf{X}\subset\mathbb{R}^m \] and the objective space\[ \vec{y} = (y_1, y_2, \ldots, y_n) \in \mathbf{Y}\subset\mathbb{R}^n \]Given some fitness function\[
    f\colon \mathbf{X}\to\mathbf{Y}
    \]we want to find those vectors in the parameter space that minimize every component of the objective vector.
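
    For concreteness, here is a toy bi-objective fitness function in Python; the concrete formulas are our illustrative assumption, not one of the benchmark problems discussed later:

    ```python
    # f: X ⊂ R^m → Y ⊂ R^2, both components to be minimized.
    def fitness(x):
        f1 = sum(xi ** 2 for xi in x)            # squared distance from (0, ..., 0)
        f2 = sum((xi - 1.0) ** 2 for xi in x)    # squared distance from (1, ..., 1)
        return (f1, f2)

    # No single x minimizes both components at once; the objectives conflict.
    print(fitness([0.0, 0.0]))  # (0.0, 2.0)
    print(fitness([1.0, 1.0]))  # (2.0, 0.0)
    print(fitness([0.5, 0.5]))  # (0.5, 0.5)
    ```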

  • Dominance is a relation of strict partial order between two objective vectors. We say that \(\vec{y}^{(1)}\) dominates \(\vec{y}^{(2)}\) if every component of the former vector is at least as good as the corresponding component of the latter and at least one of the components is strictly better. More formally:\[
    \vec{y}^{(1)}\prec\vec{y}^{(2)} \stackrel{\mathrm{def}}{\iff} \left(\forall i \in \{1, 2, \ldots, n\}\colon y^{(1)}_i \leq y^{(2)}_i\right)\\
    \land
    \left(\exists i \in \{1, 2, \ldots, n\}\colon y^{(1)}_i < y^{(2)}_i\right)
    \]
    We can extend this relation to the parameter space: a parameter vector dominates another if the objective vector of the former dominates the objective vector of the latter:\[
    \vec{x}^{(1)}\prec\vec{x}^{(2)}\iff f(\vec{x}^{(1)}) \prec f(\vec{x}^{(2)})
    \]
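
    The definition translates directly into code; a minimal sketch, assuming minimization of every component:

    ```python
    # Pareto dominance test for objective vectors (minimization assumed).
    def dominates(y1, y2):
        """True iff y1 ≺ y2: nowhere worse and strictly better somewhere."""
        return (all(a <= b for a, b in zip(y1, y2))
                and any(a < b for a, b in zip(y1, y2)))

    assert dominates((1.0, 2.0), (1.0, 3.0))      # better in one component
    assert not dominates((1.0, 3.0), (2.0, 2.0))  # incomparable vectors
    assert not dominates((1.0, 2.0), (1.0, 2.0))  # anti-reflexive
    ```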

  • A solution is optimal iff no component of its objective vector could be further minimized without increasing the value of some other component(s). Equivalently, a solution is optimal if there is no other solution that dominates it. The set of non-dominated decision vectors is called the Pareto set and the set of non-dominated objective vectors is called the Pareto front.
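
    Using the dominates sketch above, the Pareto front of a finite set of objective vectors can be extracted naively (quadratic time; practical MOEAs rely on faster non-dominated sorting):

    ```python
    # Naive O(n^2) extraction of the non-dominated objective vectors.
    def non_dominated(vectors):
        return [y for y in vectors
                if not any(dominates(other, y) for other in vectors)]

    front = non_dominated([(1, 3), (2, 2), (3, 1), (2, 3)])
    print(front)  # [(1, 3), (2, 2), (3, 1)]; (2, 3) is dominated by (2, 2)
    ```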

  • A MOEA is an algorithm that, given the fitness/objective function \(f\) and a description of its domain \(\mathbf{X}\) and co-domain \(\mathbf{Y}\), returns an approximation of the Pareto set according to the results of that function.

  • Algorithms

  • NSGA-II [deb2000fast] is a much improved version of the Non-dominated Sorting Genetic Algorithm (NSGA).


    TO BE EXTENDED

  • SPEA2 stands for Strength Pareto Evolutionary Algorithm [zitzler2001spea2] and is an improved version of SPEA. Its main concept is to keep a non-dominated archive of constant size throughout the whole evolutionary run, truncating solutions that are too close to each other. Analysis shows that it is a significant improvement over its predecessor and comparable in performance to NSGA-II.
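
    The truncation idea can be sketched as follows; this is a simplification of SPEA2's actual procedure, which additionally breaks ties by comparing distances to second, third, etc. nearest neighbors (Euclidean distance in objective space is an assumption here):

    ```python
    import math

    def distance(y1, y2):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(y1, y2)))

    # Repeatedly drop the archive member whose nearest neighbor is closest,
    # preserving the spread of the remaining non-dominated archive.
    def truncate(archive, max_size):
        archive = list(archive)
        while len(archive) > max_size:
            def nearest_gap(y):
                return min(distance(y, o) for o in archive if o is not y)
            archive.remove(min(archive, key=nearest_gap))
        return archive

    # One member of the clustered pair is dropped; the extremes survive.
    print(truncate([(0, 4), (1, 3), (1.1, 2.9), (3, 1), (4, 0)], 4))
    ```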

  • The Indicator-Based Evolutionary Algorithm [zitzler2004indicator] was created because contemporary algorithms made implicit assumptions about the trade-off between minimizing the distance to the Pareto-optimal set and maximizing the diversity of the optimized set. IBEA instead introduces a customizable indicator as the performance goal.

  • HGS


    TO BE EXTENDED

  • Problems

  • Move some refs from abstract here.

  • The relation \(\prec\) is:

    • anti-reflexive
    • anti-symmetric
    • asymmetric
    • transitive
    • a strict partial order
  • Take a look at the difference: SPEA2 but NSGA-II (Arabic vs. Roman numerals).

  • Shall we describe the binary indicator?
    \[
    I\colon \Omega \times \Omega \to \mathbb{R}
    \]

    \[
    I_{\epsilon^+}(A, B) = \min_{\epsilon} \left\{ \epsilon \mid \forall x^2 \in B \; \exists x^1 \in A\colon f_i(x^1) - \epsilon \leq f_i(x^2)\ \mathrm{for}\ i \in \{1, 2, \ldots, n\} \right\}
    \]

    A binary quality indicator is dominance preserving iff
    \[
    x^1 \prec x^2 \Rightarrow I(\{x^1\}, \{x^2\}) < I(\{x^2\}, \{x^1\})
    \]and\[
    x^1 \prec x^2 \Rightarrow \forall x^3 \in X\colon I(\{x^3\}, \{x^1\}) \geq I(\{x^3\}, \{x^2\})
    \]
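
    A direct transcription of the additive epsilon indicator into Python (a sketch; A and B are finite approximation sets represented by their objective vectors):

    ```python
    # Additive epsilon indicator I_eps+(A, B): the smallest epsilon such that
    # subtracting it from every component of A's vectors makes each member
    # of B weakly dominated by some member of A.
    def epsilon_indicator(A, B):
        return max(
            min(max(a_i - b_i for a_i, b_i in zip(a, b)) for a in A)
            for b in B
        )

    A = [(1.0, 2.0), (2.0, 1.0)]
    B = [(1.5, 1.5)]
    print(epsilon_indicator(A, B))  # 0.5
    ```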

{"cards":[{"_id":"38a4e5961258bb46e600004b","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":null,"content":"# Hierarchical Genetic Algorithm as a Generic Optimization for Multi-Objective Evolutionary Algorithms\nalternatively\n# Hierarchical Genetic Strategy in Multi--Objective Optimization\n### genetic algorithms, evolutionary algorithms, multi--objective optimization, multi-criterion decision making, search strategies, Pareto optimality\n*Ewa Gajda-Zagórska, Konrad Gądek, Radosław Łazarz, Michał Idzik, Robert Schaefer*\n\n---\n### Abstract\n\nWe propose treating Hierarchical Genetic Algorithm`[kolodziej2001hierarchical,` `HGSGeneticSearchReinforced,` `HGSRealNumber,` `jojczyk2012global,` `HGSSimpletons]` as a meta--optimizing tool for Multi--objective Evolutionary Algorithms (MOEAs). We argue, that using HGS helps finding better Pareto set approximations and/or helps finding it quicker. We compare some standard multi--objective algorithms`[zitzler2004tutorial]` against well--known problems`[zitzler2000comparison]` and then compare results to those obtained while using HGS.\n\n<script type=\"text/javascript\" src=\"https://c328740.ssl.cf1.rackcdn.com/mathjax/latest/MathJax.js?config=TeX-AMS_HTML\">\n</script>\n<script type=\"text/javascript\">\nBackbone.on('set:isComplete', function() { return MathJax.Hub.Queue([\"Typeset\", MathJax.Hub]); });\nBackbone.on('card:save', function(cardId) { return MathJax.Hub.Queue([\"Typeset\", MathJax.Hub, \"card\"+cardId]); });\n</script>\n\n---\n##### For references:`references.bib` on Dropbox\n\n---\n#### First review (Schaefer)\n1. [ ] Motivation\n2. [ ] Intro, state of the art\n3. [ ] HGS and history\n4. [ ] Test plan\n * [ ] test descriptions\n * [ ] metrics description (what do they measure?)\n * [ ] competitors description\n5. [ ] Results & analysis\n6. [ ] Conclusions"},{"_id":"38a4f8611258bb46e600004d","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":"38a4e5961258bb46e600004b","content":"## Introduction and basics\n\n#### Intro\n[X] --\n\n#### Basics\n[X] Define multi--objective problems\n[X] Define Pareto set/Pareto front\n[X] Mention that many comparisons are available\n\n#### other\n[ ] Verify that no paper on *HGS as metaoptimizer* exists—so far none found \n[X] EMOA vs MOEA\n[ ] Shall we share Python sources?\n\n#### First review (Schaefer)\n[ ] objective #1: compare HGS w/ multimodal select against other EAs\n[ ] objective #2: analysis of used mechanisms\n[ ] motivation: why we think HGS+* could be better, when, why it is (if it's the case); what are the benefits"},{"_id":"38a51af61258bb46e6000051","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":"38a4f8611258bb46e600004d","content":"# Introduction"},{"_id":"38a599fc1258bb46e6000054","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1.5,"parentId":"38a4f8611258bb46e600004d","content":"Evolutionary Algorithms (EAs) are an invaluable tool providing solutions for complex problems for which no better methods are known or for which even approximate solutions are too complex to use. EAs are not only general, they are also relatively easy to use and implement as well---in practice, the only requirement is providing the *fitness function* that indicates how feasible given solution is."},{"_id":"38a5ce051258bb46e6000056","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":"38a599fc1258bb46e6000054","content":"[ ] V. 
brief history of EAs?\n[ ] Some historical references"},{"_id":"38a5cfa71258bb46e6000058","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1.75,"parentId":"38a4f8611258bb46e600004d","content":"Some problem domains consist of multiple, often conflicting, objectives, each to be optimized. One of the popular methods is to implement some function that steers the EAs in some particular direction by combining results for all objectives into one value. Unfortunately, such search strategies' results give little information on a problem nor any other available, probably more promising, solutions with different balances. This is why multi--objective evolutionary algorithms (MOEAs) are being used. MOEAs work by maintaining and returning *a set* of promising solutions, each with different balance between objectives. The result of such search may give us some more insight into the problem. Moreover, given results of MOEAs, it is trivial to find (nearly) optimal solutions for some desirable balance between objectives."},{"_id":"38a5e5c21258bb46e600005b","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1.875,"parentId":"38a4f8611258bb46e600004d","content":"Having such requirements, MOEAs are somewhat more complicated. Beside trying to optimize a set of solutions, they need to maintain diversity of that set so the results would be meaningful and accurate`[zitzler2004indicator]`. There is even more to consider, e.g. what happens if some results yield exactly same objective values as others while not being equal `[preuss2006pareto,` `rudolph2007capabilities]`, how to specify the trade--off between minimizing a distance to Pareto--optimal set and maximizing diversity of a set `[zitzler2004indicator]` or how to deal with NP--hard problem of calculating hypervolume indicator `[bradstreet2011hypervolume]`."},{"_id":"38a6bda01258bb46e600006a","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":"38a5e5c21258bb46e600005b","content":"[ ] Mentioning about hypervolume: good, bad or ugly?\n[ ] @K: reading & analysis of `bradstreet2011hypervolume`\n[ ] from `laumanns2002running` (LOTZ) or from `zitzler2001spea2` (KP-750-m) get discrete problem(s) for comparison"},{"_id":"38ad039f7152f33cf3000021","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1.9375,"parentId":"38a4f8611258bb46e600004d","content":"Multiple algorithms exists, which try to solve same problems in different way. They tend to yield different performance characteristics, thus multiple comparisons and analysis are available `[zitzler2000comparison,` `zitzler1999multiobjective,` `laumanns2002running]`."},{"_id":"38ad08f87152f33cf3000022","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1.96875,"parentId":"38a4f8611258bb46e600004d","content":"Our research was to look again at popular algorithms and verify their performance. We implemented them so to allow Hierarchical Genetic Algorithm to work cooperatively with those so to perform analysis on whether this would yield differences in performance. 
In this paper, we summarize our work and give brief conclusions."},{"_id":"38ad0d7d7152f33cf3000023","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":"38ad08f87152f33cf3000022","content":"Move some refs from abstract here."},{"_id":"38a51c811258bb46e6000052","treeId":"38a3954c1258bb46e600001f","seq":1,"position":2,"parentId":"38a4f8611258bb46e600004d","content":"# Basics"},{"_id":"38a5f8311258bb46e600005c","treeId":"38a3954c1258bb46e600001f","seq":1,"position":3,"parentId":"38a4f8611258bb46e600004d","content":"Let's consider parameter space\\\\[ \\vec{x} = (x_1, x_2, \\ldots, x_m) \\in \\mathbf{X}\\subset\\mathbb{R}^m \\\\] and objective space\\\\[ \\vec{y} = (y_1, y_2, \\ldots, y_n) \\in \\mathbf{Y}\\subset\\mathbb{R}^n \\\\]Given some fitness function\\\\[\nf\\colon \\mathbf{X}\\to\\mathbf{Y}\n\\\\]we want to find those vectors in parameter space that *minimize* all and each of the components of objective vector."},{"_id":"38a616551258bb46e600005d","treeId":"38a3954c1258bb46e600001f","seq":1,"position":4,"parentId":"38a4f8611258bb46e600004d","content":"Dominance is a relation of strict partial order between two objective vectors. We say that \\\\(\\vec{y^{(1)}}\\\\) *dominates* \\\\(\\vec{y^{(2)}}\\\\) if every component of former vector is at least as good as the corresponding component of the latter and at least one of the components is better. More formally:\\\\[\n\\vec{y^{(1)}}\\prec\\vec{y^{(2)}} \\stackrel{\\mathrm{def}}{\\iff} \\left(\\forall i \\in [1, 2, \\ldots, m]\\colon \\vec{y^{(1)}_i} \\leq \\vec{y^{(2)}_i}\\right)\\\\\\\\\n\\land\n\\left(\\exists i \\in [1, 2, \\ldots, m]\\colon \\vec{y^{(1)}_i} < \\vec{y^{(2)}_i}\\right)\n\\\\]\nWe can extend this relation to parameter space---parameter vector dominates another if corresponding objective vector for former dominates the objective vector for the latter:\\\\[\n\\vec{x^{(1)}}\\prec\\vec{x^{(2)}}\\iff f(\\vec{x^{(1)}}) \\prec f(\\vec{x^{(2)}})\n\\\\]"},{"_id":"38a62b7c1258bb46e600005e","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":"38a616551258bb46e600005d","content":"Relation \\\\(\\prec\\\\) is:\n* anti--reflexive\n* anti--symmetric\n* asymmetric\n* transitive\n* strict partial order"},{"_id":"38a6488c1258bb46e6000060","treeId":"38a3954c1258bb46e600001f","seq":1,"position":5,"parentId":"38a4f8611258bb46e600004d","content":"Solution is optimal, iff no component of objective vector could be further minimized without increasing the value of other component(s). Equivalently, solution is optimal if there is no other solution that dominate it. 
The set of non--dominated decision vectors is called a Pareto set and the set of non--dominated objective vectors is called a Pareto front."},{"_id":"38a65a951258bb46e6000061","treeId":"38a3954c1258bb46e600001f","seq":1,"position":6,"parentId":"38a4f8611258bb46e600004d","content":"The MOEA is an algorithm that is given the fitness/objective function \\\\(f\\\\), a description of its domain \\\\(\\mathbf{X}\\\\) and its co-domain\\\\(\\mathbf{Y}\\\\), returns an approximation of Pareto--set according to the results of that function."},{"_id":"38a6974e1258bb46e6000069","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":"38a65a951258bb46e6000061","content":"[ ] Is this an accurate definition?"},{"_id":"38a664831258bb46e6000062","treeId":"38a3954c1258bb46e600001f","seq":1,"position":2,"parentId":"38a4e5961258bb46e600004b","content":"## Algorithms and problems\n\n#### Algorithms\n[X] SPEA2\n[ ] NSGA-II\n[X] IBEA\n[ ] HGS\n\n#### Problems\n[ ] Ackley\n[ ] COEMOA A-E\n[ ] Kursawe\n[ ] *LOTZ* ?\n[ ] *KP-750-m* ?\n\n#### First review (Schaefer)\n[ ] what are test groups, what do they show / which attributes do they expose?\n[ ] metrics! @Radek\n[ ] finalize benchmarks: run 10x, average\n[ ] comparison: stop criterion by budget & by accuracy"},{"_id":"38a667fd1258bb46e6000063","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":"38a664831258bb46e6000062","content":"# Algorithms"},{"_id":"38ae159c7152f33cf3000029","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1.25,"parentId":"38a664831258bb46e6000062","content":"NSGA-II`[deb2000fast]` is a much improved version of Non--dominated Sorting Genetic Algorithm (NSGA).\n\n---\nTO BE EXTENDED"},{"_id":"38acf0b07152f33cf300001e","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1.5,"parentId":"38ae159c7152f33cf3000029","content":"Take a look on the difference: SPEA2 **but** NSGA-II (arabic vs roman numbers)."},{"_id":"38a691bf1258bb46e6000068","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1.5,"parentId":"38a664831258bb46e6000062","content":"SPEA2 stands for Strength Pareto Evolutionary Algorithm`[zitzler2001spea2]` and is an improved version of SPEA. Its main concept is to keep a non--dominated archive of constant size throughout whole evolutionary truncating solutions that are too close to each other. Analysis shows that it's a significant improvement over predecessor and it's comparable in performance to NSGA-II."},{"_id":"38a668bd1258bb46e6000065","treeId":"38a3954c1258bb46e600001f","seq":1,"position":3,"parentId":"38a664831258bb46e6000062","content":"Indicator--Based Evolutionary Algorithm`[zitzler2004indicator]` was created because contemporary algorithms had made implicit assumptions about trade--off between minimizing distance to Pareto--optimal set and maximizing diversity of optimized set. 
IBEA introduces customizable indicator as a performance goal."},{"_id":"38ae4c857152f33cf300002a","treeId":"38a3954c1258bb46e600001f","seq":1,"position":1,"parentId":"38a668bd1258bb46e6000065","content":"Shall we describe binary indicator?\n\\\\[\nI\\colon \\Omega \\times \\Omega \\to \\mathbb{R}\n\\\\]\n\n\\\\[\nI_{\\epsilon^+} = min_\\epsilon \\left\\{\n\\forall x^2 \\in B \\; \\exists x^1 \\in A\\colon f_i (x^1) - \\epsilon \\leq f_i (x^2)\\ \\mathrm{for}\\ i\\in \\{1, 2, \\ldots, n\\}\n\\right\\}\n\\\\]\n\nBinary quality indicator is dominance preserving iff\n\\\\[\nx^1 \\prec x^2 \\Rightarrow I(\\\\{x^1\\\\}, \\\\{x^2\\\\}) < I(\\\\{x^2\\\\}, \\\\{x^1\\\\})\n\\\\]and\\\\[\nx^1 \\prec x^2 \\Rightarrow \\forall x^3 \\in X\\colon I(\\\\{x^3\\\\}, \\\\{x^1\\\\}) \\geq I(\\\\{x^3\\\\}, \\\\{x^2\\\\})\n\\\\]"},{"_id":"38a668de1258bb46e6000066","treeId":"38a3954c1258bb46e600001f","seq":1,"position":4,"parentId":"38a664831258bb46e6000062","content":"HGS\n\n---\nTO BE EXTENDED"},{"_id":"38aea2a27152f33cf300002d","treeId":"38a3954c1258bb46e600001f","seq":1,"position":5,"parentId":"38a664831258bb46e6000062","content":"# Problems"},{"_id":"38ad9a9a7152f33cf3000027","treeId":"38a3954c1258bb46e600001f","seq":1,"position":3,"parentId":"38a4e5961258bb46e600004b","content":"## Results and analysis\n\n#### Results\n[ ] --\n\n#### Analysis\n[ ] --"},{"_id":"38ad9c367152f33cf3000028","treeId":"38a3954c1258bb46e600001f","seq":1,"position":4,"parentId":"38a4e5961258bb46e600004b","content":"## Summary\n\n[ ] --"}],"tree":{"_id":"38a3954c1258bb46e600001f","name":"EvoGIL II","publicUrl":"evo-gil2"}}