
Auto-Tuning Fuzzy Granulation for Evolutionary Optimization

Mohsen Davarynejad, Ferdowsi University of Mashhad (davarynejad@kiaeee.org); M.-R. Akbarzadeh-T., Ferdowsi University of Mashhad (akbarzadeh@ieee.org); Carlos A. Coello Coello, CINVESTAV-IPN (ccoello@cs.cinvestav.mx)


Presentation Transcript


  1. Auto-Tuning Fuzzy Granulation for Evolutionary Optimization Mohsen Davarynejad, Ferdowsi University of Mashhad, davarynejad@kiaeee.org; M.-R. Akbarzadeh-T., Ferdowsi University of Mashhad, akbarzadeh@ieee.org; Carlos A. Coello Coello, CINVESTAV-IPN, ccoello@cs.cinvestav.mx

  2. Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisor • Numerical Results • Summary and Contributions

  3. Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisor • Numerical Results • Summary and Contributions

  4. Brief Introduction to Evolutionary Algorithms • Generic structure of evolutionary algorithms • Problem representation (encoding) • Selection, recombination and mutation • Fitness evaluations • Evolutionary algorithms • Genetic algorithms • Evolution strategies • Genetic programming … • Pros and cons • Stochastic, global search • No requirement for derivative information • Need a large number of fitness evaluations • Not well-suited for on-line optimization
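The generic structure above can be sketched as a minimal real-coded genetic algorithm (an illustrative sketch, not the deck's implementation; the population size, operator choices and rates are arbitrary):

```python
import random

def evolve(fitness, dim, bounds=(-5.0, 5.0), pop_size=20, generations=100):
    """Minimal generational GA: truncation selection, one-point crossover,
    Gaussian mutation and elitism. Minimizes `fitness`."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]          # truncation selection
        children = [pop[0][:]]                 # elitism: keep the current best
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(dim)
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < 0.3:          # Gaussian mutation of one gene
                i = random.randrange(dim)
                child[i] = min(hi, max(lo, child[i] + random.gauss(0.0, 0.3)))
            children.append(child)
        pop = children
    return min(pop, key=fitness)
```

Note that every generation costs `pop_size` fitness evaluations, which is exactly the bottleneck the rest of the deck addresses. For example, `evolve(lambda x: sum(v * v for v in x), dim=3)` drives the best individual toward the origin of the sphere function.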

  5. Motivations (When is fitness approximation necessary?) • No explicit fitness function exists: to define fitness quantitatively • Fitness evaluation is highly time-consuming: to reduce computation time • Fitness is noisy: to cancel out noise • Search for robust solutions: to avoid additional expensive fitness evaluations

  6. Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisor • Numerical Results • Summary and Contributions

  7. Fitness Approximation Methods • Problem approximation • To replace experiments with simulations • To replace full simulations/models with reduced simulations/models • Ad hoc methods • Fitness inheritance (from parents) • Fitness imitation (from siblings) • Data-driven functional approximation (Meta-Models) • Polynomials (Response surface methodology) • Neural networks, e.g., multilayer perceptrons (MLPs), RBFNs • Support vector machines (SVMs)

  8. Fitness Approximation Methods • Problem approximation • To replace experiments with simulations • To replace full simulations/models with reduced simulations/models • Ad hoc methods • Fitness inheritance (from parents) • Fitness imitation (from siblings) • Data-driven functional approximation (Meta-Models) • Polynomials (Response surface methodology) • Neural networks, e.g., multilayer perceptrons (MLPs), RBFNs • Support vector machines (SVMs)
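Fitness inheritance, the first ad hoc method above, can be sketched as assigning a child a weighted average of its parents' fitness values instead of evaluating it exactly (a hypothetical helper; the fixed-weight scheme is an assumption — in practice the weight is often taken proportional to the child's similarity to each parent):

```python
def inherited_fitness(fitness_a, fitness_b, w=0.5):
    """Fitness inheritance: approximate a child's fitness as the weighted
    average of its two parents' fitness values; `w` weights the first parent."""
    return w * fitness_a + (1.0 - w) * fitness_b
```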

  9. Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisor • Numerical Results • Summary and Contributions

  10. Meta-Models in Fitness Evaluations* • Using meta-models only carries a risk of false convergence • So a combination of the meta-model with the original fitness function is necessary • Generation-Based Evolution Control • Individual-Based Evolution Control (a random strategy or a best strategy) *Yaochu Jin. “A comprehensive survey of fitness approximation in evolutionary computation”. Soft Computing, 9(1), 3-12, 2005

  11. Generation-Based Evolution Control [Figure: a timeline of generations t through t+4; entire generations alternate between exact fitness evaluation and meta-model evaluation]

  12. Individual-Based Evolution Control [Figure: a timeline of generations t through t+4; within each generation, some individuals are evaluated exactly and the rest by the meta-model]
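The two control schemes in the figures above can be sketched as follows (an illustrative sketch; the function names, the period `k` and the ranking heuristic are assumptions):

```python
def use_true_fitness(generation, k=3):
    """Generation-based control: evaluate the whole population with the true
    fitness function every k-th generation, and with the meta-model in the
    generations in between."""
    return generation % k == 0

def individual_based_fitness(population, meta_fitness, true_fitness, n_true):
    """Individual-based control, 'best strategy': rank the population with
    the meta-model and re-evaluate only the n_true most promising individuals
    (lowest predicted value, assuming minimization) with the true fitness."""
    ranked = sorted(population, key=meta_fitness)
    evaluated = {}
    for rank, ind in enumerate(ranked):
        evaluated[ind] = true_fitness(ind) if rank < n_true else meta_fitness(ind)
    return evaluated
```

The "random strategy" mentioned on slide 10 would simply pick the `n_true` re-evaluated individuals at random rather than by rank.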

  13. Fitness Approximation Methods [Diagram: three branches — Problem approximation; Meta-Model, split into Generation-Based and Individual-Based; Ad hoc methods]

  14. Fitness Approximation Methods [Diagram: Problem approximation; Meta-Model, split into Generation-Based and Individual-Based; Ad hoc methods] ADAPTIVE FUZZY FITNESS GRANULATION is an Individual-Based, Meta-Model Fitness Approximation Method

  15. GRADUATION AND GRANULATION • The basic concepts of graduation and granulation form the core of FL and are the principal distinguishing features of fuzzy logic. • More specifically, in fuzzy logic everything is or is allowed to be graduated, or equivalently, fuzzy. • Furthermore, in fuzzy logic everything is or is allowed to be granulated, with a granule being a clump of attribute-values drawn together by indistinguishability, similarity, proximity or functionality. • Graduated granulation, or equivalently fuzzy granulation, is a unique feature of fuzzy logic. • Graduated granulation is inspired by the way in which humans deal with complexity and imprecision.* *L. A. Zadeh, www.fuzzieee2007.org/ZadehFUZZ-IEEE2007London.pdf

  16. Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisory • Numerical Results • Summary and Contributions

  17. ADAPTIVE FUZZY FITNESS GRANULATION (AFFG) [Flowchart: a solution in phenospace undergoes Fuzzy Similarity Analysis against the granulation pool; if similar (Yes), its fitness is taken from a granule (FF Association), otherwise (No) it is evaluated exactly (FF Evaluation); the granulation table is updated and the fitness of the individual is returned]

  18. [Flowchart: Gen := 0; create initial random population. AFFG part: select a solution; if it is similar to an existing granule, look up its fitness from the queue of granules; otherwise evaluate its fitness exactly, add it to the queue as a new granule, and update the table life function and granule radius. When the whole population has been evaluated, check the termination criterion: if satisfied, output the designed results and end; otherwise perform reproduction, crossover and mutation, set Gen := Gen + 1, and repeat.] Fig. 1. Flowchart of the Proposed AFFG Algorithm
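The AFFG part of the flowchart can be sketched as a single fitness-lookup routine (a sketch only; the granule representation, the threshold value, and the pluggable `similarity` function are stand-ins for the paper's definitions):

```python
def affg_fitness(x, pool, true_fitness, similarity, threshold=0.9):
    """One AFFG fitness evaluation. `pool` is the queue of granules, each a
    dict holding a 'center' (an exactly evaluated solution) and its 'fitness'.
    If x is similar enough to some granule, that granule's fitness is
    associated to x; otherwise x is evaluated exactly and enqueued."""
    best, best_sim = None, 0.0
    for granule in pool:
        sim = similarity(x, granule["center"])
        if sim > best_sim:
            best, best_sim = granule, sim
    if best is not None and best_sim >= threshold:
        return best["fitness"]                      # FF association
    f = true_fitness(x)                             # exact FF evaluation
    pool.append({"center": list(x), "fitness": f})  # enqueue new granule
    return f
```

Re-evaluating a solution close to an existing granule then costs a pool scan instead of a call to the expensive fitness function.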

  19. AFFG (III) • A random parent population is initially created. • Here, G is a set of fuzzy granules.

  20. AFFG (IV) • The average similarity of a new solution to each granule is then calculated.
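The similarity equation on this slide was an image and did not survive extraction; a common form for this kind of fuzzy granulation is the average of per-dimension Gaussian memberships (the Gaussian shape and the single shared width `sigma` are assumptions, not the slide's exact formula):

```python
import math

def avg_similarity(x, center, sigma=1.0):
    """Average Gaussian membership of solution x in a granule at `center`:
    mu_j = exp(-(x_j - c_j)^2 / sigma^2), averaged over the n dimensions."""
    n = len(x)
    return sum(math.exp(-((xj - cj) ** 2) / sigma ** 2)
               for xj, cj in zip(x, center)) / n
```

A function of this shape returns 1.0 when the solution coincides with the granule center and decays toward 0 as it moves away, so a threshold on it decides between FF association and exact FF evaluation.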

  21. AFFG (V) Since members of the initial populations generally have lower fitness, the granule radius is larger and the similarity threshold is smaller, so fitness is more often assigned initially by estimation/association to the granules.

  22. AFFG (V) Since members of the initial populations generally have lower fitness, the granule radius is larger and the similarity threshold is smaller, so fitness is more often assigned initially by estimation/association to the granules.

  23. Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisor • Numerical Results • Summary and Contributions

  24. AFFG - FS • In order to avoid hand-tuning of parameters, a fuzzy supervisor is employed as an auto-tuning algorithm with three inputs: • Number of Design Variables (NDV) • Maximum Range of Design Variables (MRDV) • Percentage of Completed Generations (PCG)

  25. The knowledge base of the above architecture has a large number of rules, and extracting these rules is very difficult. Consequently, a new architecture is proposed in which the controller is separated into two controllers to reduce the complexity of the rule base.

  26. AFFG – FS • Granules compete for their survival through a life index. • The life index is initially set to N and subsequently updated, where M is the life reward of the granule and K is the index of the winning granule for each individual in generation i. Here we set M = 5. • At each table update, only the granules with the highest life index are kept; the others are discarded.
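The update equation on this slide was an image; a sketch consistent with the description — only the winning granule K receives the life reward M, and the table is then pruned to the highest life indices — might look like this (function names and the pool/life representation are assumptions):

```python
def reward_winner(life, k, m=5):
    """Life-index update: grant the winning granule K the life reward M;
    all other indices are left unchanged."""
    return [v + m if i == k else v for i, v in enumerate(life)]

def prune_pool(granules, life, max_size):
    """At each table update keep only the max_size granules with the highest
    life index; the others are discarded."""
    order = sorted(range(len(granules)), key=lambda i: life[i], reverse=True)
    keep = sorted(order[:max_size])  # winners, back in their original order
    return [granules[i] for i in keep], [life[i] for i in keep]
```

The reward makes frequently matched granules long-lived, while pruning bounds the cost of the similarity scan.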

  27. Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisor • Numerical Results • Summary and Contributions

  28. This parameter is considered constant and is equal to 0.9 for all simulation results.

  29. Effect of the number of granules on the convergence behavior • Only granules with the highest life index are kept; the others are discarded. • The effect of varying the number of granules on the convergence behavior of AFFG and AFFG-FS is studied.

  30. Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisor • Numerical Results • Summary and Contributions

  31. Summary • Fitness approximation replaces an accurate but costly function evaluation with approximate but cheap function evaluations. • Meta-modeling and other fitness approximation techniques have found a wide range of applications. • Proper control of the meta-model plays a critical role in the success of using meta-models. • A small number of individuals in the granule pool can still produce good results.

  32. Contribution • Why AFFG-FS? • Avoids initial training. • Uses guided association to speed up the search process. • Gradually builds a model independent of initial training data, to compensate for the lack of sufficient training data and to reach sufficient approximation accuracy. • Avoids using the model in design-variable regions unrepresented in the training set. • To avoid hand-tuning of parameters, a fuzzy supervisor is employed as an auto-tuning algorithm. • Features of AFFG-FS: • Exploits the design variable space by means of Fuzzy Similarity Analysis (FSA) to avoid premature convergence. • Dynamically updates the association model with minimal overhead cost.
