Auto-Tuning Fuzzy Granulation for Evolutionary Optimization. Mohsen Davarynejad, Ferdowsi University of Mashhad, davarynejad@kiaeee.org; M.-R. Akbarzadeh-T., Ferdowsi University of Mashhad, akbarzadeh@ieee.org; Carlos A. Coello Coello, CINVESTAV-IPN, ccoello@cs.cinvestav.mx
Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisory • Numerical Results • Summary and Contributions
Brief Introduction to Evolutionary Algorithms • Generic structure of evolutionary algorithms • Problem representation (encoding) • Selection, recombination and mutation • Fitness evaluations • Evolutionary algorithms • Genetic algorithms • Evolution strategies • Genetic programming … • Pros and cons • Stochastic, global search • No requirement for derivative information • Need a large number of fitness evaluations • Not well-suited for on-line optimization
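The generic structure above (encoding, selection, recombination, mutation, fitness evaluation) can be sketched as a minimal real-coded GA. This is a toy illustration with our own parameter choices, not code from the presentation:

```python
import random

def simple_ga(fitness, bounds, pop_size=20, gens=50, mut_sigma=0.3):
    """Minimal real-coded GA: encoding -> selection -> crossover -> mutation."""
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]  # random encoding
    for _ in range(gens):
        scored = sorted(pop, key=fitness)             # rank by fitness (minimization)
        parents = scored[:pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = (a + b) / 2.0                     # arithmetic crossover
            child += random.gauss(0.0, mut_sigma)     # Gaussian mutation
            children.append(min(max(child, lo), hi))  # keep within bounds
        pop = children
    return min(pop, key=fitness)

random.seed(1)
best = simple_ga(lambda x: (x - 2.0) ** 2, bounds=(-10.0, 10.0))
print(best)  # should land near the optimum x = 2
```

Note the large number of fitness calls (pop_size × gens) — exactly the cost that fitness approximation aims to reduce.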
Motivations (When is fitness approximation necessary?) • No explicit fitness function exists: to define fitness quantitatively • Fitness evaluation is highly time-consuming: to reduce computation time • Fitness is noisy: to cancel out noise • Search for robust solutions: to avoid additional expensive fitness evaluations
Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisory • Numerical Results • Summary and Contributions
Fitness Approximation Methods • Problem approximation • To replace experiments with simulations • To replace full simulations/models with reduced simulations/models • Ad hoc methods • Fitness inheritance (from parents) • Fitness imitation (from siblings) • Data-driven functional approximation (Meta-Models) • Polynomials (response surface methodology) • Neural networks, e.g., multilayer perceptrons (MLPs), radial basis function networks (RBFNs) • Support vector machines (SVMs)
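One of the ad hoc methods above, fitness inheritance, can be sketched as follows. The function names, averaging scheme, and inheritance probability are our illustrative assumptions, not details from the slides:

```python
import random

def evaluate_true(x):
    # Expensive "true" fitness; the sphere function stands in here.
    return sum(v * v for v in x)

def child_fitness(parent_a, parent_b, fit_a, fit_b, p_inherit=0.8):
    """With probability p_inherit the child inherits the average of its
    parents' fitness values instead of triggering a costly evaluation."""
    child = [(a + b) / 2.0 for a, b in zip(parent_a, parent_b)]
    if random.random() < p_inherit:
        return child, (fit_a + fit_b) / 2.0   # inherited (approximate)
    return child, evaluate_true(child)        # exact evaluation

random.seed(0)
pa, pb = [1.0, 2.0], [3.0, 0.0]
child, fit = child_fitness(pa, pb, evaluate_true(pa), evaluate_true(pb))
print(child, fit)
```

The inherited value is only an estimate; the evolution-control strategies on the next slides exist precisely to keep such estimates from misleading the search.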
Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisory • Numerical Results • Summary and Contributions
Meta-Models in Fitness Evaluations* • Using meta-models only carries a risk of false convergence, so a combination of the meta-model with the original fitness function is necessary • Generation-Based Evolution Control • Individual-Based Evolution Control (random strategy or a best strategy) *Yaochu Jin. “A comprehensive survey of fitness approximation in evolutionary computation”. Soft Computing, 9(1), 3-12, 2005
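Generation-based evolution control can be sketched as follows. The control period and the surrogate's bias are our own assumptions for illustration:

```python
def generation_based_control(pop, true_f, meta_f, gen, period=4):
    """Evaluate with the expensive true fitness on every `period`-th
    ("controlled") generation; otherwise use the cheap meta-model."""
    f = true_f if gen % period == 0 else meta_f
    return [f(ind) for ind in pop]

true_f = lambda x: (x - 3) ** 2          # expensive ground truth
meta_f = lambda x: (x - 3) ** 2 + 0.1    # cheap, slightly biased surrogate

pop = [1.0, 2.0, 3.0]
print(generation_based_control(pop, true_f, meta_f, gen=0))  # true fitness
print(generation_based_control(pop, true_f, meta_f, gen=1))  # surrogate
```

Individual-based control differs only in granularity: within one generation, some individuals (chosen randomly or as the predicted best) go to the true function while the rest use the surrogate.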
Generation-Based Evolution Control (timeline diagram spanning generations t through t+4)
Individual-Based Evolution Control (timeline diagram spanning generations t through t+4)
Fitness Approximation Methods • Problem approximation • Ad hoc methods • Meta-Model • Generation-Based • Individual-Based ADAPTIVE FUZZY FITNESS GRANULATION is an individual-based, meta-model fitness approximation method
GRADUATION AND GRANULATION • The basic concepts of graduation and granulation form the core of fuzzy logic (FL) and are its principal distinguishing features. • More specifically, in fuzzy logic everything is, or is allowed to be, graduated, or equivalently, fuzzy. • Furthermore, in fuzzy logic everything is, or is allowed to be, granulated, with a granule being a clump of attribute values drawn together by indistinguishability, similarity, proximity or functionality. • Graduated granulation, or equivalently fuzzy granulation, is a unique feature of fuzzy logic. • Graduated granulation is inspired by the way in which humans deal with complexity and imprecision.* *L. A. Zadeh, www.fuzzieee2007.org/ZadehFUZZ-IEEE2007London.pdf
Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisory • Numerical Results • Summary and Contributions
ADAPTIVE FUZZY FITNESS GRANULATION (AFFG) (block diagram: an individual from the phenospace undergoes fuzzy similarity analysis against the granulation pool; if a similar granule exists, its fitness is associated (FF Association), otherwise the fitness is evaluated exactly (FF Evaluation) and the granulation table is updated; either path yields the fitness of the individual)
Fig. 1. Flowchart of the proposed AFFG algorithm: Gen := 0; create an initial random population. AFFG part: select a solution; if it is similar to an existing granule, look up its fitness from the queue of granules; otherwise, evaluate its fitness exactly, add it to the queue as a new granule, and update the table life function and granule radius. When fitness evaluation of the population is finished, set Gen := Gen + 1 and check the termination criterion: if satisfied, output the designed results and end; otherwise perform reproduction, crossover and mutation, and repeat.
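The flowchart's inner loop — check similarity against the granule pool, then reuse a stored fitness or evaluate exactly — can be sketched as follows. The Gaussian similarity with a fixed width, the fixed 0.9 threshold, and the dictionary layout are our simplifying assumptions, not the paper's exact formulation:

```python
import math

def similarity(x, center, sigma=1.0):
    """Average Gaussian similarity of solution x to a granule center."""
    return sum(math.exp(-(xi - ci) ** 2 / sigma ** 2)
               for xi, ci in zip(x, center)) / len(x)

def affg_fitness(x, granules, true_f, threshold=0.9):
    """Reuse a granule's stored fitness if x is similar enough;
    otherwise evaluate exactly and enqueue x as a new granule."""
    best = max(granules, key=lambda g: similarity(x, g["center"]), default=None)
    if best is not None and similarity(x, best["center"]) >= threshold:
        return best["fitness"], False            # associated, no evaluation
    fit = true_f(x)
    granules.append({"center": list(x), "fitness": fit})
    return fit, True                             # exact evaluation performed

sphere = lambda x: sum(v * v for v in x)
pool = []
print(affg_fitness([0.0, 0.0], pool, sphere))   # first call: exact evaluation
print(affg_fitness([0.01, 0.0], pool, sphere))  # near duplicate: fitness reused
```

The second call skips the expensive evaluation entirely, which is where AFFG's savings come from.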
AFFG (III) • A random parent population is initially created. • Here, G is a set of fuzzy granules.
AFFG (IV) • The average similarity of a new solution to each granule is calculated as:
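One plausible form of this average similarity — our reconstruction, assuming Gaussian membership functions, since the slide's equation is not reproduced here — averages the per-dimension memberships over the m design variables:

```latex
\delta_k(x) \;=\; \frac{1}{m}\sum_{j=1}^{m}
  \exp\!\left(-\frac{\left(x_j - c_{k,j}\right)^2}{\sigma_k^{2}}\right)
```

where c_{k,j} is the center of granule k in dimension j and σ_k is its radius (width).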
AFFG (V) Since members of the initial population generally have lower fitness, the granule radii are larger, and fitness is therefore assigned more often by estimation/association to the granules early in the run.
Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisory • Numerical Results • Summary and Contributions
AFFG - FS • In order to avoid hand-tuning of parameters, a fuzzy supervisor is employed as an auto-tuning algorithm with three inputs: • Number of Design Variables (NDV) • Maximum Range of Design Variables (MRDV) • Percentage of Completed Generations (PCG)
The knowledge base of the above architecture has a large number of rules, and extracting these rules is very difficult. Consequently, a new architecture is proposed in which the controller is split into two controllers to reduce the complexity of the rule base.
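A minimal sketch of the supervisory idea — fuzzy rules that tighten granule width as the run progresses — might look like this. It is a one-input, Sugeno-style toy using only PCG, not the paper's two-controller rule base; membership shapes and consequents are our assumptions:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def supervise_radius(pcg):
    """IF run is EARLY THEN radius is WIDE; IF run is LATE THEN radius is
    NARROW. Defuzzified as a weighted average of singleton consequents."""
    early = tri(pcg, -1.0, 0.0, 1.0)   # peaks at the start of the run
    late  = tri(pcg,  0.0, 1.0, 2.0)   # peaks at the end of the run
    wide, narrow = 1.0, 0.1            # singleton rule outputs
    return (early * wide + late * narrow) / (early + late)

print(supervise_radius(0.0), supervise_radius(0.5), supervise_radius(1.0))
```

Early in the run the supervisor keeps granules wide (more fitness reuse, cheaper search); late in the run it narrows them so exact evaluations refine the final solutions.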
AFFG – FS • Granules compete for their survival through a life index. • The life index is initially set to N and subsequently updated, where M is the life reward of the granule and K is the index of the winning granule for each individual in generation i. Here we set M = 5. • At each table update, only the granules with the highest life index are kept; the others are discarded.
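The life-index bookkeeping can be sketched as follows. The reward-only update and the table size are our simplifying assumptions for illustration:

```python
def update_life(life, winner_k, reward=5, max_size=10):
    """The winning granule K receives the life reward M; at each table
    update only the granules with the highest life index survive."""
    life[winner_k] = life.get(winner_k, 0) + reward
    survivors = sorted(life, key=life.get, reverse=True)[:max_size]
    return {k: life[k] for k in survivors}

life = {0: 3, 1: 3, 2: 3}                        # three granules, index N = 3
life = update_life(life, winner_k=1, reward=5, max_size=2)
print(life)                                      # granule 1 leads with index 8
```

Granules that keep winning similarity comparisons accumulate reward and survive pruning; rarely used granules fall off the table, bounding memory and lookup cost.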
Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisory • Numerical Results • Summary and Contributions
It is considered a constant parameter, equal to 0.9 for all simulation results.
Effect of the number of granules on the convergence behavior • Only the granules with the highest life index are kept; the others are discarded. • The effect of varying the number of granules on the convergence behavior of AFFG and AFFG-FS is studied.
Outline of the presentation • Brief introduction to evolutionary algorithms • Fitness Approximation Methods • Meta-Models in Fitness Evaluations • Adaptive Fuzzy Fitness Granulation (AFFG) • AFFG with fuzzy supervisory • Numerical Results • Summary and Contributions
Summary • Fitness approximation replaces an accurate but costly function evaluation with approximate but cheap function evaluations. • Meta-modeling and other fitness approximation techniques have found a wide range of applications. • Proper control of meta-models plays a critical role in their successful use. • A small number of individuals in the granule pool can still produce good results.
Contributions • Why AFFG-FS? • Avoids initial training. • Uses guided association to speed up the search process. • Gradually builds a model independent of the initial training data, to compensate for the lack of sufficient training data and to reach a model with sufficient approximation accuracy. • Avoids using the model in design-variable regions unrepresented in the training set. • To avoid manual tuning of parameters, a fuzzy supervisor is employed as an auto-tuning algorithm. • Features of AFFG-FS: • Explores the design-variable space by means of Fuzzy Similarity Analysis (FSA) to avoid premature convergence. • Dynamically updates the association model with minimal overhead cost.