

  1. GSRC bX update March 2003 Aaron Ng, Marius Eriksen and Igor Markov, University of Michigan

  2. Outline • Motivation, issues in benchmarking • bX in the picture • Sample application: Evaluation of tools • Future focus • Contact info, links

  3. Motivation, issues in benchmarking • Evaluation • independent reproduction of results and experiments • explicit methods required • minimum room for misinterpretation of results • evaluation of algorithms across entire problem space • conflicting and correlating optimization objectives • separation of placement and routing tasks

  4. Motivation, issues in benchmarking (cont’d) • Availability of results • raw experimental results • availability allows verification • results provide insight into the performance of a tool

  5. Motivation, issues in benchmarking (cont’d) • Standard formats • meaningful comparison of results • compatibility between tools and benchmarks • correct interpretation of benchmarks

  6. bX in the picture • Automation • ‘live’ repository • support for execution of tools on benchmarks • distributed network of computational hosts • online reporting of results • automatic updates when changes in dependencies occur
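To make the "automatic updates when changes in dependencies occur" idea concrete, here is a minimal Python sketch of dependency-triggered re-runs. The `Repository` class, job names, and method names are all hypothetical; the deck does not show bX's actual mechanism.

```python
# Hypothetical sketch of bX-style dependency tracking: when a tool or
# benchmark artifact is updated, every job that used it is re-queued
# for execution on the distributed compute hosts.
from collections import defaultdict

class Repository:
    def __init__(self):
        self.dependents = defaultdict(set)  # artifact -> jobs that used it
        self.queue = []                     # jobs awaiting a compute host

    def record(self, job, dependencies):
        for dep in dependencies:
            self.dependents[dep].add(job)

    def update(self, artifact):
        # a tool binary or benchmark changed: invalidate dependent results
        for job in self.dependents[artifact]:
            self.queue.append(job)

repo = Repository()
repo.record("capo-on-peko01", ["Capo", "PEKO01"])
repo.update("Capo")   # new Capo version uploaded
print(repo.queue)     # ['capo-on-peko01'] -> re-run, results re-posted
```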

  7. bX in the picture (cont’d) • Scripts and flows • reproduction of results • scripts and flows describe experiments • scripts can be saved, shared and reused • representation of entire problem space • relationship between optimization objectives • e.g. the effect of placement results on routing

  8. bX in the picture (cont’d) • Standard formats • interoperability between tools and benchmarks • meaningful comparison of results

  9. Sample application: Evaluation of tools • Placers • Capo • randomized • fixed-die placer • emphasis on routability • tuned on proprietary Cadence benchmarks

  10. Sample application: Evaluation of tools (cont’d) • Placers (cont’d) • Dragon • randomized • variable-die placer • tuned on IBM-Place benchmarks

  11. Sample application: Evaluation of tools (cont’d) • Placers (cont’d) • KraftWerk • deterministic • fixed-die placer • results typically have cell overlaps • additional legalization step by DOMINO

  12. Sample application: Evaluation of tools (cont’d) • Benchmarks • PEKO • artificial netlists • designed to match statistical parameters of IBM netlists • known optimal wirelength • concern that they are not representative of industry circuits

  13. Sample application: Evaluation of tools (cont’d) • Benchmarks (cont’d) • grids • 4 fixed vertices, n² 1×1 movables • tests placers on datapath-like circuits • known optimal placement • results are easily visualized for debugging
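A small sketch of such a grid benchmark, assuming the four fixed vertices sit at the corners and that nets connect horizontally/vertically adjacent cells; the deck does not specify the net structure, so this is illustrative only.

```python
# Illustrative grid benchmark: n^2 movable 1x1 cells plus 4 fixed corner
# vertices. The optimal placement is the grid itself, so the optimal
# wirelength is known exactly -- handy for debugging placers.
def grid_benchmark(n):
    fixed = [(0, 0), (0, n - 1), (n - 1, 0), (n - 1, n - 1)]
    movables = [(x, y) for x in range(n) for y in range(n)]
    nets = []  # assumed: 2-pin nets between adjacent grid cells
    for x in range(n):
        for y in range(n):
            if x + 1 < n: nets.append(((x, y), (x + 1, y)))
            if y + 1 < n: nets.append(((x, y), (x, y + 1)))
    # each net has unit length in the optimal (grid) placement
    optimal_wirelength = len(nets)
    return fixed, movables, nets, optimal_wirelength

_, _, _, opt = grid_benchmark(4)
print(opt)  # 24 for a 4x4 grid under these assumptions
```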

  14. Sample application: Evaluation of tools (cont’d) • Example flow A script in bX serves as a template describing an experiment, and can be saved and shared. Scripts are instantiated by defining the individual components of the script. Flows are instantiated scripts. Flows can be re-executed to reproduce results. [flow diagram: benchmark + parameters → placer → placement → post-processor → post-processing → evaluator → evaluation]
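A minimal Python sketch of the script/flow distinction described above; the `Script` and `Flow` classes, slot names, and `instantiate` call are hypothetical, since the deck does not show bX's actual API.

```python
# A Script is a reusable experiment template with named component slots;
# binding concrete components yields a Flow, which can be saved, shared,
# and re-executed to reproduce results.
class Script:
    def __init__(self, *slots):
        self.slots = slots  # e.g. ("placer", "benchmark", ...)

    def instantiate(self, **bindings):
        missing = set(self.slots) - set(bindings)
        if missing:
            raise ValueError(f"unbound components: {missing}")
        return Flow(self, bindings)

class Flow:
    def __init__(self, script, bindings):
        self.script, self.bindings = script, bindings

    def run(self):
        # in bX this would dispatch jobs to the distributed compute hosts
        print("running", self.bindings)

placement_script = Script("placer", "benchmark", "post_processor", "evaluator")
flow = placement_script.instantiate(
    placer="Capo", benchmark="PEKO (default)",
    post_processor="placement map",
    evaluator="overlap/legality & wirelength")
flow.run()
```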

  15–16. Sample application: Evaluation of tools (cont’d) • Example flow (cont’d) Flow parameters: placer = Capo; benchmark = PEKO (default); post-processing = placement map; evaluation = overlap/legality & wirelength. After completion, the results of the jobs will be automatically posted online. In the case of the placement job, the results include wirelength and runtime. [flow diagram: benchmark + parameters → placer → post-processor → evaluator]

  17–19. Sample application: Evaluation of tools (cont’d) • Example flow (cont’d) If we swapped Capo with Dragon, only the placer component would change: placer = Dragon; benchmark = PEKO (default); post-processing = placement map; evaluation = overlap/legality & wirelength. [flow diagram: benchmark + parameters → placer → post-processor → evaluator]
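In the sketch after slide 14, the same swap amounts to rebinding a single slot; plain dictionary reuse stands in here for whatever mechanism bX actually provides.

```python
# Only the "placer" binding changes; every other component is reused.
bindings = {"placer": "Capo", "benchmark": "PEKO (default)",
            "post_processor": "placement map",
            "evaluator": "overlap/legality & wirelength"}
dragon_bindings = dict(bindings, placer="Dragon")
print(dragon_bindings["placer"])  # Dragon
```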

  20–21. Sample application: Evaluation of tools (cont’d) • Example flow (cont’d) Modify the flow, starting again from the original parameters: placer = Capo; benchmark = PEKO (default); post-processing = placement map; evaluation = overlap/legality & wirelength. [flow diagram: benchmark + parameters → placer → post-processor → evaluator]

  22–24. Sample application: Evaluation of tools (cont’d) • Example flow (cont’d) Modify the flow: swap the benchmark from PEKO to grid, and the post-processed output becomes a grid graph: placer = Capo; benchmark = grid (default); post-processing = grid graph; evaluation = overlap/legality & wirelength. [flow diagram: benchmark + parameters → placer → post-processor → evaluator]

  25–27. Sample application: Evaluation of tools (cont’d) • Example flow (cont’d) Swap Capo with Dragon: placer = Dragon; benchmark = grid (default); post-processing = grid graph; evaluation = overlap/legality & wirelength. [flow diagram: benchmark + parameters → placer → post-processor → evaluator]

  28–30. Sample application: Evaluation of tools (cont’d) • Example flow (cont’d) Return to PEKO with the post-processing output set to a congestion map, then swap in KraftWerk as the placer: placer = KraftWerk; benchmark = PEKO (default); post-processing = congestion map; evaluation = overlap/legality & wirelength. [flow diagram: benchmark + parameters → placer → post-processor → evaluator]

  31–34. Sample application: Evaluation of tools (cont’d) • Example flow (cont’d) Since KraftWerk placements typically contain cell overlaps, a legalization stage is added to the flow, with DOMINO as the legalizer: placer = KraftWerk; benchmark = PEKO (default); legalizer = DOMINO; post-processing = congestion map; evaluation = overlap/legality & wirelength. [flow diagram: benchmark + parameters → placer → legalizer → post-processor → evaluator]

  35–37. Sample application: Evaluation of tools (cont’d) • Example flow (cont’d) A routing stage is then appended after legalization (the router itself is left unbound in these frames): placer = KraftWerk; benchmark = PEKO (default); legalizer = DOMINO; evaluation = overlap/legality & wirelength. [flow diagram: benchmark + parameters → placer → legalizer → router → evaluator]

  38–42. Sample application: Evaluation of tools (cont’d) • Example flow (cont’d) Multiple evaluators can be attached to one flow: evaluator1 checks overlap/legality & wirelength on the KraftWerk placement, evaluator2 measures routability, and evaluator3 performs timing analysis after DOMINO legalization: placer = KraftWerk; benchmark = PEKO (default); legalizer = DOMINO. [flow diagram: benchmark + parameters → placer → evaluator1, evaluator2; placer → legalizer → evaluator3]
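A self-contained Python sketch of this final multi-evaluator flow; the stage/evaluator structure mirrors slides 38–42, but the data layout and field names are invented for illustration.

```python
# Hypothetical end-to-end flow: KraftWerk places PEKO, DOMINO legalizes,
# and three evaluators are attached at different stages, as on the slides.
flow = {
    "benchmark": "PEKO (default)",
    "stages": [
        {"job": "placer", "tool": "KraftWerk",
         "evaluators": ["overlap/legality & wirelength",  # evaluator1
                        "routability"]},                   # evaluator2
        {"job": "legalizer", "tool": "DOMINO",
         "evaluators": ["timing analysis"]},               # evaluator3
    ],
}

for stage in flow["stages"]:
    print(f'{stage["tool"]} ({stage["job"]}) on {flow["benchmark"]}')
    for evaluator in stage["evaluators"]:
        print("  evaluate:", evaluator)
```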

  43. Future Focus • Easy deployment • downloadable bX distribution • in the form of a binary or installation package

  44. Future Focus (cont’d) • Interpretation of results • multiple views and query support • for example, • ‘show all results for solver S’ • ‘show the hardest benchmarks for solver S’ • ‘has the solution quality decreased for benchmark B, since the upload of the new version of solver S?’
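The queries quoted above are straightforward over a flat table of result records. A minimal sketch, with invented field names (`solver`, `benchmark`, `wirelength`, `runtime`, `version`) standing in for whatever schema bX ends up using:

```python
# Sample result records; fields are illustrative, not bX's actual schema.
results = [
    {"solver": "Capo", "benchmark": "PEKO01",
     "wirelength": 1.02e6, "runtime": 310.0, "version": 2},
    {"solver": "Capo", "benchmark": "PEKO02",
     "wirelength": 2.45e6, "runtime": 905.0, "version": 2},
]

# 'show all results for solver S'
capo = [r for r in results if r["solver"] == "Capo"]

# 'show the hardest benchmarks for solver S' (here: longest runtime)
hardest = sorted(capo, key=lambda r: r["runtime"], reverse=True)

# 'has the solution quality decreased for benchmark B, since the upload
#  of the new version of solver S?'
def quality_regressed(results, solver, benchmark):
    runs = sorted((r for r in results
                   if r["solver"] == solver and r["benchmark"] == benchmark),
                  key=lambda r: r["version"])
    return len(runs) >= 2 and runs[-1]["wirelength"] > runs[-2]["wirelength"]
```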

  45. Future Focus (cont’d) • Type checking • MIME-like affinity between solvers and benchmarks • compatibility checks • useful for performing queries on different ‘families’ • ‘learning’ of new file types
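A sketch of the MIME-like affinity idea: each solver declares the benchmark types it accepts, and bX can reject incompatible pairings up front. The type names below are made up for illustration; the deck does not define an actual type vocabulary.

```python
# Hypothetical solver -> accepted benchmark-type table.
AFFINITY = {
    "Capo":      {"bookshelf/netlist"},
    "Dragon":    {"bookshelf/netlist"},
    "KraftWerk": {"bookshelf/netlist"},
}

def compatible(solver, benchmark_type):
    # unknown solvers accept nothing until their types are 'learned'
    return benchmark_type in AFFINITY.get(solver, set())

assert compatible("Capo", "bookshelf/netlist")
assert not compatible("Capo", "spice/netlist")
```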

  46. Future Focus (cont’d) • GSRC Bookshelf • populate bX with implementations from Bookshelf • still the same ‘one-stop-shop’ • except that it will be a live repository

  47. Future Focus (cont’d) • OpenAccess • method of communicating data between jobs • provide interoperability between tools • single ‘design-through-manufacturing’ data model

  48. Contact info, links For more info or source code: bx@umich.edu Feedback and comments are appreciated. OpenAccess www.openeda.org www.cadence.com/feature/open_access.html GSRC Bookshelf www.gigascale.org/bookshelf Thanks!
