
Grashof Mechanism Synthesis Using Multi-Objective Parallel Asynchronous Particle Swarm Optimization



Presentation Transcript


  1. Grashof Mechanism Synthesis Using Multi-Objective Parallel Asynchronous Particle Swarm Optimization Robin McDougall Scott Nokleby Mechatronic and Robotic Systems Laboratory

  2. Outline • Context and Background • Multi-Objective PSO (MOPSO) via Pareto Dominance • Parallelization of PSO (PAPSO) • MOPAPSO • Results

  3. Introduction • Increasing role for global optimization techniques in engineering design • Ambition in design leads to more highly-parameterized systems • More parameters lead to increasingly non-linear objective function surfaces

  4. Introduction • Particle Swarm Optimization (PSO) gaining increasing attention in both research and applications • Over time a number of variants have been utilized with great success • Many on the “Particle” level: • Particle Accelerations • Transient Social and Personal Weights • Dynamic Forms • … and many more
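
For reference, the canonical particle-level update that these variants modify is sketched below in Python; the parameter values and names are illustrative, not taken from the presentation.

```python
import numpy as np

def pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One canonical PSO update for the whole swarm.

    pos, vel, pbest : (n_particles, n_dims) arrays
    gbest           : (n_dims,) best position found by the swarm so far
    w, c1, c2       : inertia, personal (cognitive), and social weights
    """
    rng = rng or np.random.default_rng()
    r1 = rng.random(pos.shape)   # fresh random factors per particle and dimension
    r2 = rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    return pos + vel, vel
```

The variants listed above alter pieces of this update, for example the inertia weight w or time-varying social and personal weights c1 and c2.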

  5. Motivation • Optimization-Based Mechanism Synthesis (OBMS) • Highly parameterized systems • Use optimization techniques to choose the parameters • More parameters typically lead to more non-linear objective function surfaces • Effects of which can confound traditional optimization techniques

  6. Motivation • Replace previously used deterministic techniques with a global optimization technique • No need for parameter transforms • No need for “pseudo-global” techniques • Prevent artificial constriction of search space

  7. Motivation • A typical OBMS objective could be to design a system to follow a given path…

  8. Motivation • OBMS with PSO synthesized mechanisms which could fulfill this task better than deterministic algorithms

  9. Motivation • With one major caveat…. • PSO took hours instead of minutes

  10. Objectives • Use Multi-Objective PSO (MOPSO) to handle multi-objective problem specifications • Use Parallel Asynchronous PSO (PAPSO) to speed things up • Both topics well covered in the literature individually • Little mention of combining the two

  11. Multi-Objective Optimization • Engineering design choices often involve balancing competitive objectives: • Cost vs. Performance • Size vs. Strength • Effectiveness vs. Efficiency • What options are available to us to deal with these competing objectives?

  12. Multi-Objective Optimization • Could use a weighting scheme • Concerns: • With no prior knowledge, how do you select the weights? • Potential to unfairly influence optimization before execution begins

  13. Multi-Objective Optimization • What we would like to do is change from minimizing a single weighted sum of the objectives, w1·f1(x) + w2·f2(x) + …, to minimizing the vector of objectives [f1(x), f2(x), …] directly • Shift the decision of how influential each objective will be to after the optimization effort instead of beforehand

  14. Multi-Objective Optimization • MOPSO uses Pareto Dominance to determine a set of solutions for two or more competing objectives • Each point in the optimal set constitutes a non-dominated solution • A system with two objective functions forms a front; with more, a hyper-surface

  15. Multi-Objective Optimization • Imagine a two objective function system: • Instead of a single optimum solution, MOPSO delivers a front of non-dominated solutions • Each point on the front represents the best possible solution for a given objective function with respect to the other objective functions
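
A minimal sketch of Pareto dominance and non-dominated front extraction (Python; the candidate points and function names are illustrative, not from the presentation):

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def nondominated(points):
    """Return the non-dominated subset of a set of objective vectors."""
    points = np.asarray(points)
    front = [p for i, p in enumerate(points)
             if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]
    return np.array(front)

# Two competing objectives evaluated for four candidate designs:
candidates = [(1.0, 5.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(nondominated(candidates))   # (3.0, 3.0) is dominated by (2.0, 2.0)
```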

  16. MOPSO • MOPSO requires two significant changes to the basic form of PSO: • Creation and active maintenance of a repository to collect the non-dominated candidate solutions • Modification of the basic form of the velocity equation to choose a social leader from this repository instead of a global best
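
A sketch of how that velocity modification might look, assuming a hypothetical `select_leader` routine that returns the index of one repository member per particle (all names and parameter values are illustrative):

```python
import numpy as np

def mopso_step(pos, vel, pbest, repository, select_leader,
               w=0.4, c1=1.0, c2=1.0, rng=None):
    """MOPSO-style update: each particle follows a leader drawn from the
    repository of non-dominated solutions instead of a single global best.

    repository    : (n_archive, n_dims) array of non-dominated positions
    select_leader : callable taking the repository, returning one row index
    """
    rng = rng or np.random.default_rng()
    new_vel = np.empty_like(vel)
    for i in range(pos.shape[0]):
        leader = repository[select_leader(repository)]
        r1 = rng.random(pos.shape[1])
        r2 = rng.random(pos.shape[1])
        new_vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (leader - pos[i]))
    return pos + new_vel, new_vel
```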

  17. MOPSO • No longer a single social leader (SL) available in MOPSO • Instead, need to choose a particle from the repository to serve as the SL • Use a weighted Roulette Wheel procedure to select the SL • Biased towards sparsely populated regions of the emerging front
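
One way such a density-biased roulette wheel could be implemented is sketched below; crowding is approximated here by each repository member's mean distance to the others, which stands in for whatever density measure the authors actually used:

```python
import numpy as np

def select_leader(repository, rng=None):
    """Roulette-wheel selection over the repository, weighted so that
    members in sparsely populated regions of the front are favoured."""
    rng = rng or np.random.default_rng()
    rep = np.asarray(repository, dtype=float)
    if len(rep) == 1:
        return 0
    # Pairwise distances between repository members
    dists = np.linalg.norm(rep[:, None, :] - rep[None, :, :], axis=-1)
    sparsity = dists.sum(axis=1) / (len(rep) - 1)   # mean distance to the others
    weights = sparsity / sparsity.sum()             # sparser region => more likely
    return int(rng.choice(len(rep), p=weights))
```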

  18. PAPSO • Reduce runtime by performing swarm activities simultaneously • PSO lends itself well to parallelization • Fitness, velocity, and position updates are independent for each particle • Processors: • Master Processor administers the swarm • Slave Processors perform Objective Function Evaluations and Particle Updates

  19. PAPSO Notes: • If the number of particles exceeds the number of processors, particles wait in a FIFO queue • Asynchronous operation mitigates the negative performance effects caused by runtime variability • Runtime improvement is proportional to the ratio of Objective Function Calculation time to Network Transmission Time

  20. MOPAPSO • The idea is to combine these variants: • MOPSO to provide formal multi-objective support • PAPSO to speed things up • Requirements: • Should match MOPSO results • Should reduce overall runtime

  21. MOPAPSO • Two roles for processors: • One Master: initializes the swarm; creates a FIFO particle queue; dispatches the first “N” jobs; catches updated particle specs; dispatches the next particle job; updates the repository every “m” iterations • N Slaves: catch GBEST, PPOS, PVEL; update velocity; calculate OFs for each objective; return OFs, PPOS, and PVEL
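
The loop below is a minimal Python sketch of that master/slave exchange (the authors' implementation used acslX with MPICH2, not Python); the toy two-objective evaluation inside `evaluate_and_update`, the repository handling, and all parameters are placeholders:

```python
from collections import deque
from concurrent.futures import FIRST_COMPLETED, ProcessPoolExecutor, wait

def evaluate_and_update(particle):
    """Slave side (placeholder): update the particle and evaluate every
    objective function, then send the results back to the master."""
    x, y = particle["pos"]
    particle["objs"] = (x**2 + y**2, (x - 1.0)**2 + y**2)   # toy objectives
    return particle

def master_loop(swarm, n_slaves=4, n_updates=40):
    """Master side: keep a FIFO queue of particle jobs, keep each slave busy
    with one job at a time, and re-queue particles as their results return."""
    queue = deque(swarm)
    completed = []
    with ProcessPoolExecutor(max_workers=n_slaves) as pool:
        in_flight = {pool.submit(evaluate_and_update, queue.popleft())
                     for _ in range(min(n_slaves, len(queue)))}
        while in_flight and len(completed) < n_updates:
            done, in_flight = wait(in_flight, return_when=FIRST_COMPLETED)
            for fut in done:
                particle = fut.result()     # catch the updated particle specs
                completed.append(particle)  # repository update would go here
                queue.append(particle)      # particle rejoins the FIFO queue
                if len(completed) + len(in_flight) < n_updates:
                    in_flight.add(pool.submit(evaluate_and_update,
                                              queue.popleft()))
    return completed

if __name__ == "__main__":
    swarm = [{"pos": (0.1 * i, 0.2 * i)} for i in range(10)]
    print(len(master_loop(swarm)))
```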

  22. MOPAPSO – Benchmark Tests • To test the effectiveness of the MOPAPSO implementation: • Used two MOPSO benchmarks from the literature before applying it to OBMS • Configuration: • Nine-node grid running Rocks Cluster Distribution • 5 dual-core 2.0 GHz processors with 1 GB RAM each • acslX Interpconsole v2.4.1 using MPICH2 • 40 Particles, 100 “Iterations” (100 × 40 = 4,000 particle updates)

  23. MOPAPSO – Benchmark One

  24. MOPAPSO – Benchmark One

  25. MOPAPSO – Benchmark Two

  26. MOPAPSO – Benchmark Two

  27. OBMS Example

  28. Objective Functions

  29. Results

  30. Runtime Improvements

  31. Conclusions • A high-level implementation of MOPAPSO has been developed • Testing the algorithm on two benchmark problems showed that MOPAPSO can readily locate Pareto fronts for multi-objective problems • MOPAPSO effectively solved an OBMS problem featuring multiple objectives

  32. Acknowledgements • Ontario Research Fund
