
Gradual Relaxation Techniques with Applications to Behavioral Synthesis


Presentation Transcript


  1. Gradual Relaxation Techniques with Applications to Behavioral Synthesis Zhiru Zhang, Yiping Fan, Miodrag Potkonjak, Jason Cong Department of Computer Science University of California, Los Angeles Partially supported by NSF under award CCR-0096383

  2. Outline • Motivations & objectives • Gradual relaxation techniques • Driver example: Time-Constrained Scheduling • Other driver examples • Maximum Independent Set (MIS) • Soft Real-Time Scheduling • Experimental results • Conclusions

  3. Motivations & Objectives • Motivations: • Many synthesis problems are computationally intractable • SAT, scheduling, graph coloring, … • Lack of a systematic way to develop effective heuristics • Objectives: • Development of a new general heuristic paradigm • Gradual Relaxation • Applications to a wide range of synthesis problems

  4. Gradual Relaxation Techniques • Most constrained principle • Minimal freedom reduction • Negative thinking • Compounding variables • Simultaneous step consideration • Calibration • Probabilistic modeling

  5. Driver Example: Time-Constrained Scheduling (1) • Problem: Time-constrained scheduling • Given: • (1) A CDFG G(V, E) • (2) A time constraint T • Objective: • Schedule the operations of V into T cycles so that the resource usage is minimized and all precedence constraints in G are satisfied
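The scheduling freedom exploited in the rest of the talk comes from ASAP and ALAP schedules of the CDFG. A minimal sketch, assuming unit-latency operations and data-flow edges only (the function name is ours):

```python
from collections import deque

def asap_alap(succs, T):
    """ASAP/ALAP control steps for a DAG of unit-latency operations.
    succs: {op: set of data-flow successors}; T: time constraint in
    control steps. Returns (asap, alap): the earliest and latest
    feasible step of each op; [asap[v], alap[v]] is v's time frame."""
    ops = set(succs) | {v for ss in succs.values() for v in ss}
    preds = {v: set() for v in ops}
    for u, ss in succs.items():
        for v in ss:
            preds[v].add(u)
    # ASAP: topological order, start right after the latest predecessor
    asap, indeg = {}, {v: len(preds[v]) for v in ops}
    ready = deque(v for v in ops if indeg[v] == 0)
    while ready:
        u = ready.popleft()
        asap[u] = max((asap[p] + 1 for p in preds[u]), default=1)
        for v in succs.get(u, ()):
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    # ALAP: mirror pass from the sinks, anchored at step T
    alap, outdeg = {}, {v: len(succs.get(v, ())) for v in ops}
    ready = deque(v for v in ops if outdeg[v] == 0)
    while ready:
        u = ready.popleft()
        alap[u] = min((alap[s] - 1 for s in succs.get(u, ())), default=T)
        for p in preds[u]:
            outdeg[p] -= 1
            if outdeg[p] == 0:
                ready.append(p)
    return asap, alap
```

For a chain a → b → c with T = 4, a's time frame is [1, 2], so every op has one step of slack.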

  6. Driver Example: Time-Constrained Scheduling (2) • Related work • M. C. McFarland, A. C. Parker, and R. Camposano, Proc. of IEEE, 1990 • G. De Micheli, 1994 • E. A. Lee and D. G. Messerschmitt, Proc. of IEEE, 1987, SDF scheduling • Classical approach – Force-Directed Scheduling • P. G. Paulin and J. P. Knight, DAC 1987 • Exploits schedule freedom (slack) to minimize the hardware resources • Iterative approach: schedule one operation per iteration

  7. Driver Example: Time-Constrained Scheduling (3) [Figure: ASAP and ALAP schedules of an example DFG over 4 control steps, the resulting operation time frames, and distribution graphs (DGs) for multiply and for add/sub/comp] • Determine ASAP & ALAP schedules • Determine the time frame of each operation • Length of box: possible execution cycles • Width of box: probability of assignment • Uniform distribution, area assigned = 1 • Create distribution graphs • Sum of probabilities of each op type • Indicates concurrency of similar ops • DG(i) = ΣOp Prob(Op, i)
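With the time frames in hand, the distribution graph of one operation type follows directly from the uniform-probability model above. A small sketch (the function name is ours):

```python
def distribution_graph(frames, T):
    """Distribution graph for ONE operation type.
    frames: {op: (asap, alap)} time frames of all ops of that type;
    T: number of control steps. Under the uniform model each op
    contributes 1/(frame length) at every step of its frame, so
    DG(i) = sum over ops of Prob(op, i)."""
    dg = [0.0] * (T + 1)              # dg[1..T]; index 0 unused
    for lo, hi in frames.values():
        p = 1.0 / (hi - lo + 1)       # uniform probability per step
        for i in range(lo, hi + 1):
            dg[i] += p
    return dg
```

Two multiplies, one fixed at step 1 and one with frame [1, 2], give DG = 1.5 at step 1 and 0.5 at step 2.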

  8. Most Constrained Principle • Principle: • First resolve the most constrained components • Minimally impact the difficulty of the still-unresolved constraints • Related work: • General technique • Bitner and Reingold, 1975; Brelaz, 1979, for graph coloring • Pearl, 1984, intelligent search • Slack-based heuristics • Davis, Tindell, and Burns, 1993; Goldwasser, 2003 • Force-directed scheduling • Paulin and Knight, 1989

  9. Most Constrained Principle: Time-Constrained Scheduling [Figure: time frames and distribution graphs before and after scheduling the multiply operations c and d] • Operation Op, at control step i, targeting control step t • Force(Op, i, t) = DG(i) × x(Op, i, t) • x(Op, i, t): the probability change at step i when Op is scheduled to t • The self force of operation Op w.r.t. control step t • Self Force(Op, t) = Σi∈time frame Force(Op, i, t)
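The self force can be sketched directly from these definitions. Here dg is the distribution graph of the operation's type (1-indexed) and [lo, hi] is its time frame; uniform probabilities are assumed:

```python
def self_force(dg, lo, hi, t):
    """Self force of scheduling an operation with time frame [lo, hi]
    at control step t. dg[i] is the distribution-graph value (expected
    concurrency of the op's type) at step i. x(Op, i, t) is the
    probability change at step i: the uniform probability
    1/(hi - lo + 1) vanishes everywhere in the frame and becomes 1
    at step t."""
    p = 1.0 / (hi - lo + 1)
    return sum(dg[i] * ((1.0 - p) if i == t else -p)
               for i in range(lo, hi + 1))
```

A negative self force means the assignment relieves expected concurrency; classic FDS commits the minimum-force assignment.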

  10. Minimal Freedom Reduction / Negative Thinking (1) • Minimal Freedom Reduction – the key to a good heuristic: • Avoid the greedy behavior of optimization • Make small, gradual, atomic decisions • Evaluate the individual impact of each before committing to large decisions • Negative Thinking – a way to realize Minimal Freedom Reduction • Traditional heuristics resolve a specific component of the solution • Negative thinking determines what will NOT be considered a component of the solution

  11. Minimal Freedom Reduction / Negative Thinking (2) • Similar ideas: • Improved Force-Directed Scheduling: • W. F. J. Verhaegh, P. E. R. Lippens, E. H. L. Aarts, J. H. M. Korst, J. L. van Meerbergen, and A. van der Werf, IEEE Trans. on Computer-Aided Design of Integrated Circuits and Systems, 1995 • Gradually shrink operations' time frames • Standard-cell global routing: • J. Cong and Patrick H. Madden, ISPD, 1997 • Iterative deletion method – from the complete routing graph, delete edges one by one to obtain the routing tree

  12. Negative Thinking: Time-Constrained Scheduling [Figure: time frames and DGs for multiply and add/sub/comp, before and after one negative-thinking step that removes a control step from operation d's time frame] • Traditional FDS: • Select the minimum force(Op, t), schedule Op to t • Negative-thinking FDS: • Select the maximum force(Op, t), remove t from Op's time frame
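One negative-thinking iteration can be sketched as follows; this is an illustrative reading of the slide, not the paper's exact algorithm (self forces are recomputed from scratch, and the distribution graphs are supplied by the caller):

```python
def negative_fds_step(frames, dg_of):
    """One negative-thinking FDS iteration: compute the self force of
    every (op, step) pair and REMOVE the maximum-force step from that
    op's time frame, instead of committing the minimum-force
    assignment as classic FDS does.
    frames: {op: set of feasible steps}, mutated in place;
    dg_of(op): distribution graph of op's type as {step: DG value}."""
    worst = None
    for op, steps in frames.items():
        if len(steps) <= 1:
            continue                      # op already locked
        p = 1.0 / len(steps)              # uniform probability per step
        dg = dg_of(op)
        for t in steps:
            force = sum(dg[i] * ((1.0 - p) if i == t else -p)
                        for i in steps)
            if worst is None or force > worst[0]:
                worst = (force, op, t)
    if worst is not None:
        _, op, t = worst
        frames[op].discard(t)             # minimal freedom reduction
    return worst
```

Each step removes only one candidate control step, so the commitment is as small and gradual as possible.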

  13. Compounding Variables / Simultaneous Steps Consideration (1) • Compounding variables • For problems where variables can take only binary values • Combine several variables together • Simultaneous steps consideration • Consider a small negative decision on a set of variables simultaneously • Example: a SAT instance • Compounding x1 and x2 yields 4 assignment options • Evaluate the impact of each option on the constraints • Negative thinking: remove one option, keep the other three promising options
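The SAT example can be made concrete with a toy sketch (the encoding and scoring are ours; this is not a full solver): compound x1 and x2, score each of the four joint assignments by how many clauses it already falsifies, and discard only the worst option:

```python
from itertools import product

def prune_compound(clauses):
    """Compound x1 and x2, consider their 4 joint assignments
    simultaneously, and remove the single worst option (negative
    thinking), keeping the other three.
    clauses: CNF over x1, x2 as lists of signed ints (2 = x2,
    -1 = NOT x1)."""
    def falsifies(a1, a2, clause):
        val = {1: a1, 2: a2}
        if any(abs(l) not in val for l in clause):
            return False      # an unassigned literal may still satisfy it
        # clause is falsified iff every literal evaluates to false
        return all(val[abs(l)] != (l > 0) for l in clause)

    options = list(product([False, True], repeat=2))
    score = {o: sum(falsifies(*o, c) for c in clauses) for o in options}
    worst = max(options, key=score.get)
    return [o for o in options if o != worst]     # keep 3 options
```

For the instance (x1 ∨ x2) ∧ (¬x1 ∨ x2), the option x1 = x2 = False falsifies the first clause and is the one removed.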

  14. Calibration [Figure: schedules before and after calibration; calibration preserves slack for the multiply operations] • Heuristics conduct the optimization • Keep the options for important variables • Discard the options for unimportant variables • Example: • In resource-minimization scheduling: • Multipliers are much more expensive than adders • Preserve maximum slack for the multiplications • Lower the priority of minimizing the required adders

  15. Probabilistic Modeling [Figure: a non-uniform assignment-probability example over control steps 1-5, e.g. prob(1,1) = 0.6, prob(1,2) = 0.3, prob(1,3) = 0.1] • Options of every variable are non-uniformly distributed • Probabilistic modeling: • A non-uniform function of all constraints imposed on a particular variable

  16. When is Gradual Relaxation Most Effective? • Minimal freedom reduction / negative thinking • A large number of variables have significant slack • Variables have complex interactions among a large number of constraints • Compounding variables / simultaneous steps consideration • Each variable has a small set of potential values • Calibration • The final solution involves only relatively few types of resources • Probabilistic modeling • Effective for large and complex instances

  17. Driver Example: Maximum Independent Set (1) • Problem: Maximum Independent Set • Given: G(V, E) • Objective: find a maximum-size independent set V′ ⊆ V, such that for all u ∈ V′ and v ∈ V′, (u, v) ∉ E • Related work • A popular generic NP-Complete problem • M. R. Garey and D. S. Johnson, 1979 • Useful for efficient graph coloring • D. Kirovski and M. Potkonjak, DAC 1998

  18. Driver Example: Maximum Independent Set (2) • Reasoning: • In practice, the MIS size is much smaller than the total graph size • A smaller decision: • Select a most constrained vertex NOT to be in the MIS • Simple heuristic: h1(v) = number of neighbors of v • Look-forward heuristic: h2(v) = Σu∈Neighbors(v) (1 / number of neighbors of u)
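Both heuristics drive the same exclusion loop: repeatedly remove the most constrained vertex until no edges remain among the survivors, which then form an independent set. A sketch using the look-forward heuristic h2 (excluding the maximum-h2 vertex is our assumption; the paper's exact selection rule may differ):

```python
def mis_by_exclusion(adj):
    """Negative-thinking MIS sketch: repeatedly EXCLUDE the most
    constrained vertex until no edges remain; the surviving vertices
    form an independent set. Uses the look-forward heuristic
    h2(v) = sum over u in N(v) of 1/deg(u), recomputed on the
    shrinking graph. adj: {v: set of neighbors} (undirected)."""
    adj = {v: set(ns) for v, ns in adj.items()}      # working copy
    while any(adj[v] for v in adj):
        h2 = {v: sum(1.0 / len(adj[u]) for u in adj[v])
              for v in adj if adj[v]}
        worst = max(h2, key=h2.get)                  # most constrained
        for u in adj[worst]:
            adj[u].discard(worst)                    # exclude worst
        del adj[worst]
    return set(adj)                                  # independent set
```

On a triangle {1, 2, 3} with a pendant vertex 4 attached to 3, the loop excludes 3 first (it conflicts with the most promising candidates) and returns an independent set of size 2.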

  19. Driver Example: Soft Real-Time Scheduling (1) • Problem: Soft real-time scheduling • Given: • (1) A set of non-preemptive tasks Τ = {τ1, τ2, …, τn}, where each task τi = (ai, di, ei) is characterized by an arrival time ai, a deadline di, and an execution time ei • (2) A single processor P • (3) A timing constraint T • Objective: • Schedule a subset of the tasks in Τ on processor P within the available time T so that the number of tasks scheduled is maximized

  20. Driver Example: Soft Real-Time Scheduling (2) • Multimedia applications • B. Kao and H. Garcia-Molina, 1994 • B. Adelberg, H. Garcia-Molina, and B. Kao, 1994; • Video and WWW servers • M. Jones, D. Rosu, M.-C Rosu, 1997 • Formal definition • P. D’Argenio, J.-P Katoen, and E. Brinksma, 1999 • CAD and embedded systems • D. Ziegenbein, J. Uerpmann, and R. Ernst, ICCAD 2000 • D. Verkest, P. Yang, C. Wong, and P. Marchal, ICCAD 2001 • K. Richter, D. Ziegenbein, M. Jersak, and R. Ernst, DAC 2002

  21. prob(i,t) t si si+ei ci-ei ci Driver Example: Soft Real-Time Scheduling (3) • Two phase heuristic: • Conflict minimization • Gradually shrink the time frame for every task • Legalization • Probabilistic modeling: • Trapezoid shape Task Probability Distribution

  22. Driver Example: Soft Real-Time Scheduling (4) • Objective: • Minimize the number of conflicts • Repeat until all tasks are locked: • Update the distribution graph • Compute forces for every task at its start and cutoff time slots • Select the maximum force(τ, t), remove time slot t from task τ's time frame [Figure: task probability vs. time slot, before and after one time-frame reduction]

  23. Experimental Results:Maximum Independent Set • Apply to DIMACS benchmark graphs for the Clique problem challenge • Compare to a state-of-the-art iterative algorithm • MIS algorithm used in D. Kirovski and M. Potkonjak, DAC 1998 • Similar quality • Much faster: 50X using h1, 30X using h2 • Look-forward heuristic outperforms the simple version

  24. Experimental Results:Time-Constrained Scheduling (1) • Scheduling results comparison under critical-path time constraint

  25. Experimental Results:Time-Constrained Scheduling (2) • Scheduling results comparison under time constraint with 1.5x critical path length

  26. Experimental Results:Soft Real-Time Scheduling • Soft real-time scheduling results

  27. Conclusions • Development of gradual relaxation techniques • Most constrained principle • Minimal freedom reduction • Negative thinking • Compounding variables • Simultaneous step consideration • Calibration • Probabilistic modeling • Applications to: • Maximum independent set • Time-constrained scheduling • Soft real-time scheduling
