slide1

GSRC

bX update

March 2003

Aaron Ng, Marius Eriksen and Igor Markov
University of Michigan

slide2
Outline
  • Motivation, issues in benchmarking
  • bX in the picture
  • Sample application: Evaluation of tools
  • Future focus
  • Contact info, links
slide3
Motivation, issues in benchmarking
  • Evaluation
    • independent reproduction of results and experiments
    • explicit methods required
      • minimum room for misinterpretation of results
    • evaluation of algorithms across entire problem space
      • conflicting and correlating optimization objectives
      • separation of placement and routing tasks
slide4
Motivation, issues in benchmarking (cont’d)
  • Availability of results
    • raw experimental results
    • availability allows verification
    • results provide insight into the performance of a tool
slide5
Motivation, issues in benchmarking (cont’d)
  • Standard formats
    • meaningful comparison of results
    • compatibility between tools and benchmarks
    • correct interpretation of benchmarks
slide6
bX in the picture
  • Automation
    • ‘live’ repository
      • support for execution of tools on benchmarks
      • distributed network of computational hosts
    • online reporting of results
      • automatic updates when changes in dependencies occur
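The "automatic updates" idea amounts to a dependency graph over tools, benchmarks, and results: when any input artifact changes, every result downstream of it is re-queued. The sketch below only illustrates that idea and is not bX's implementation; all class, method, and artifact names are invented.

```python
# Hypothetical sketch of bX-style dependency tracking: when a tool or
# benchmark is updated, all results that depend on it are invalidated.
from collections import defaultdict

class Repository:
    def __init__(self):
        self.deps = defaultdict(set)   # artifact -> results that use it

    def register(self, result, inputs):
        """Record that `result` was produced from the given input artifacts."""
        for artifact in inputs:
            self.deps[artifact].add(result)

    def update(self, artifact):
        """Return the results invalidated by a change to `artifact`."""
        return sorted(self.deps[artifact])

repo = Repository()
repo.register("Capo/PEKO01/wirelength", inputs=["Capo", "PEKO01"])
repo.register("Dragon/PEKO01/wirelength", inputs=["Dragon", "PEKO01"])

# A new Capo release invalidates only Capo's results:
print(repo.update("Capo"))    # ['Capo/PEKO01/wirelength']
# A change to the benchmark invalidates both placers' results:
print(repo.update("PEKO01"))
```

In a live repository the invalidated results would be re-executed on the computational hosts and the online reports refreshed.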
slide7
bX in the picture (cont’d)
  • Scripts and flows
    • reproduction of results
      • scripts and flows describe experiments
      • scripts can be saved, shared and reused
    • representation of entire problem space
      • relationship between optimization objectives
      • e.g. the effect of placement results on routing
slide8
bX in the picture (cont’d)
  • Standard formats
    • interoperability between tools and benchmarks
    • meaningful comparison of results
slide9
Sample application: Evaluation of tools
  • Placers
    • Capo
      • randomized
      • fixed-die placer
      • emphasis on routability
      • tuned on proprietary Cadence benchmarks
slide10
Sample application: Evaluation of tools (cont’d)
  • Placers (cont’d)
    • Dragon
      • randomized
      • variable-die placer
      • tuned on IBM-Place benchmarks
slide11
Sample application: Evaluation of tools (cont’d)
  • Placers (cont’d)
    • KraftWerk
      • deterministic
      • fixed-die placer
      • results typically have cell overlaps
      • additional legalization step by DOMINO
slide12
Sample application: Evaluation of tools (cont’d)
  • Benchmarks
    • PEKO
      • artificial netlists
      • designed to match statistical parameters of IBM netlists
      • known optimal wirelength
      • concern that they are not representative of industry circuits
slide13
Sample application: Evaluation of tools (cont’d)
  • Benchmarks (cont’d)
    • grids
      • 4 fixed vertices, n² 1×1 movables
      • tests placers on datapath-like circuits
      • known optimal placement
      • results are easily visualized for debugging
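Because a grid's optimal placement is known in closed form, it makes a self-checking benchmark. A minimal sketch of the idea, assuming 2-pin nets between horizontally and vertically adjacent cells (the actual Bookshelf grid generator and file formats differ, and the four fixed corner vertices are omitted here for brevity):

```python
# Hypothetical grid benchmark in the spirit of the slides: n*n unit-square
# movable cells whose optimal placement is the grid itself.
def grid_netlist(n):
    """2-pin nets joining horizontally/vertically adjacent grid cells."""
    nets = []
    for i in range(n):
        for j in range(n):
            if i + 1 < n: nets.append(((i, j), (i + 1, j)))
            if j + 1 < n: nets.append(((i, j), (i, j + 1)))
    return nets

def hpwl(nets, pos):
    """Half-perimeter wirelength of a placement `pos`: cell -> (x, y)."""
    total = 0
    for net in nets:
        xs = [pos[c][0] for c in net]
        ys = [pos[c][1] for c in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

n = 8
nets = grid_netlist(n)
optimal = {(i, j): (i, j) for i in range(n) for j in range(n)}  # identity placement
# Each of the 2*n*(n-1) neighbor nets has HPWL 1 in the optimal placement:
assert hpwl(nets, optimal) == 2 * n * (n - 1)
```

Comparing a placer's wirelength against the closed-form optimum 2·n·(n−1) gives an absolute (not merely relative) quality measure.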
slide14
Sample application: Evaluation of tools (cont’d)
  • Example flow

A script in bX serves as a template describing an experiment, and can be saved and shared.

Scripts are instantiated by defining the individual components of the script.

Flows are instantiated scripts.

Flows can be re-executed to reproduce results.
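The template/instance relationship described above can be sketched in a few lines. This is an illustrative model only, not bX's actual API; all class and slot names are invented.

```python
# Illustrative sketch: a script is a template with named component slots;
# a flow is a script with every slot bound, and can be re-run.
class Script:
    """A template describing an experiment: named component slots."""
    SLOTS = ("placer", "benchmark", "parameters", "post_processor", "evaluator")

    def instantiate(self, **components):
        missing = [s for s in self.SLOTS if s not in components]
        if missing:
            raise ValueError(f"unbound slots: {missing}")
        return Flow(components)

class Flow:
    """An instantiated script; re-executing it reproduces the results."""
    def __init__(self, components):
        self.components = dict(components)

    def run(self):
        # In bX this would dispatch jobs to computational hosts and post
        # results online; here we just report the binding.
        return dict(self.components)

flow = Script().instantiate(placer="Capo", benchmark="PEKO",
                            parameters="(default)",
                            post_processor="placement map",
                            evaluator="overlap/legality & wirelength")
print(flow.run()["placer"])  # Capo
```

Saving the `Script` gives a shareable experiment description; saving the `Flow` gives a reproducible run.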

[flow diagram: a benchmark and parameters feed a placer; the placement result is passed to a post-processor (post-processing) and an evaluator (evaluation)]

slide15
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

Flow parameters:
  • placer: Capo
  • benchmark: PEKO
  • parameters: (default)
  • post-processor: placement map
  • evaluator: overlap/legality & wirelength

After completion, the results of the jobs are automatically posted online.

In the case of the placement job, the results include wirelength and runtime.

[flow diagram: the instantiated flow, with each slot bound as listed above]

slide16
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = Capo, benchmark = PEKO, parameters = (default), post-processor = placement map, evaluator = overlap/legality & wirelength]

slide17
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

If we swapped Capo with Dragon:

[flow diagram: placer = Capo, benchmark = PEKO, parameters = (default), post-processor = placement map, evaluator = overlap/legality & wirelength]

slide18
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

If we swapped Capo with Dragon:

[flow diagram: placer = Dragon, benchmark = PEKO, parameters = (default), post-processor = placement map, evaluator = overlap/legality & wirelength]

slide20
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = Capo, benchmark = PEKO, parameters = (default), post-processor = placement map, evaluator = overlap/legality & wirelength]

slide21
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

Modify the flow:

[flow diagram: placer = Capo, benchmark = PEKO, parameters = (default), post-processor = placement map, evaluator = overlap/legality & wirelength]

slide22
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

Modify the flow:

[flow diagram: placer = Capo, benchmark = grid, parameters = (default), post-processor = placement map, evaluator = overlap/legality & wirelength]

slide23
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

Modify the flow:

[flow diagram: placer = Capo, benchmark = grid, parameters = (default), post-processor = grid graph, evaluator = overlap/legality & wirelength]

slide25
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

Swap Capo with Dragon:

[flow diagram: placer = Capo, benchmark = grid, parameters = (default), post-processor = grid graph, evaluator = overlap/legality & wirelength]

slide26
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

Swap Capo with Dragon:

[flow diagram: placer = Dragon, benchmark = grid, parameters = (default), post-processor = grid graph, evaluator = overlap/legality & wirelength]


slide28
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = Capo, benchmark = PEKO, parameters = (default), post-processor = congestion map, evaluator = overlap/legality & wirelength]

slide29
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, benchmark = PEKO, parameters = (default), post-processor = congestion map, evaluator = overlap/legality & wirelength]


slide31
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, benchmark = PEKO, parameters = (default), post-processor = congestion map, evaluator = overlap/legality & wirelength; a legalization step (legalizer) is added after placement]

slide32
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, legalizer = DOMINO, benchmark = PEKO, parameters = (default), post-processor = congestion map, evaluator = overlap/legality & wirelength]


slide35
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, legalizer = DOMINO, benchmark = PEKO, parameters = (default), post-processor = congestion map, evaluator = overlap/legality & wirelength; a routing step (router) is added after legalization]


slide37
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, legalizer = DOMINO, benchmark = PEKO, parameters = (default), evaluator = overlap/legality & wirelength; the congestion-map post-processor binding is removed, the routing step remains]

slide38
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, legalizer = DOMINO, benchmark = PEKO, parameters = (default), evaluator1 = overlap/legality & wirelength; the post-processing and routing steps are removed]

slide39
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, legalizer = DOMINO, benchmark = PEKO, parameters = (default), evaluator1 = overlap/legality & wirelength; a second evaluation step (evaluator2) is added]

slide40
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, legalizer = DOMINO, benchmark = PEKO, parameters = (default), evaluator1 = overlap/legality & wirelength, evaluator2 = routability]

slide41
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, legalizer = DOMINO, benchmark = PEKO, parameters = (default), evaluator1 = overlap/legality & wirelength, evaluator2 = routability; a third evaluation step (evaluator3) is added]

slide42
Sample application: Evaluation of tools (cont’d)
  • Example flow (cont’d)

[flow diagram: placer = KraftWerk, legalizer = DOMINO, benchmark = PEKO, parameters = (default), evaluator1 = overlap/legality & wirelength, evaluator2 = routability, evaluator3 = timing analysis]

slide43
Future Focus
  • Easy deployment
    • downloadable bX distribution
      • in the form of a binary or installation package
slide44
Future Focus (cont’d)
  • Interpretation of results
    • multiple views and query support
    • for example,
      • ‘show all results for solver S’
      • ‘show the hardest benchmarks for solver S’
      • ‘has the solution quality decreased for benchmark B since the upload of the new version of solver S?’
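Queries like these map naturally onto a relational view of the result repository. A sketch with an invented schema; bX's actual storage layer may differ, and `quality` here is wirelength-like (lower is better):

```python
# Sketch of the result queries the slides describe, over a hypothetical
# results table (schema and sample data invented for illustration).
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE results
              (solver TEXT, version INTEGER, benchmark TEXT, quality REAL)""")
db.executemany("INSERT INTO results VALUES (?, ?, ?, ?)", [
    ("S", 1, "B",  100.0),
    ("S", 2, "B",  104.0),   # quality got worse in version 2
    ("S", 2, "B2",  95.0),
])

# 'show all results for solver S'
rows = db.execute("SELECT * FROM results WHERE solver = 'S'").fetchall()

# 'has the solution quality decreased for benchmark B since the new version?'
old, new = [q for (q,) in db.execute(
    "SELECT quality FROM results WHERE solver='S' AND benchmark='B' "
    "ORDER BY version")]
print(new > old)  # True: wirelength increased, so quality decreased
```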

slide45
Future Focus (cont’d)
  • Type checking
    • MIME-like affinity between solvers and benchmarks
      • compatibility checks
      • useful for performing queries on different ‘families’
    • ‘learning’ of new file types
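The MIME-like affinity idea amounts to each tool declaring the formats it consumes and produces, with compatibility reduced to a set-intersection check. The type strings and tool pairings below are invented for illustration:

```python
# Sketch of MIME-like type affinity between solvers and benchmarks.
# Each tool declares what it consumes and produces; a compatibility
# check pairs them. All type names here are hypothetical.
CONSUMES = {"Capo":   {"bookshelf/netlist"},
            "DOMINO": {"bookshelf/placement"}}
PRODUCES = {"Capo":   {"bookshelf/placement"},
            "DOMINO": {"bookshelf/placement"}}

def compatible(upstream, downstream):
    """Can `downstream` run on something `upstream` produces?"""
    return bool(PRODUCES[upstream] & CONSUMES[downstream])

assert compatible("Capo", "DOMINO")       # a placement feeds the legalizer
assert not compatible("DOMINO", "Capo")   # a placement is not a netlist
```

The same declarations would support queries over tool "families" (e.g. everything that consumes `bookshelf/netlist`), and "learning" a new file type would mean registering a new type string.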
slide46
Future Focus (cont’d)
  • GSRC Bookshelf
    • populate bX with implementations from Bookshelf
      • still the same ‘one-stop-shop’
      • except that it will be a live repository
slide47
Future Focus (cont’d)
  • OpenAccess
    • method of communicating data between jobs
      • provide interoperability between tools
    • single ‘design-through-manufacturing’ data model
slide48
Contact info, links

For more info or source code:

[email protected]

Feedback and comments are appreciated.

OpenAccess www.openeda.org

www.cadence.com/feature/open_access.html

GSRC Bookshelf www.gigascale.org/bookshelf

Thanks!