
Scheduling with Constraint Programming








  1. Scheduling with Constraint Programming February 24/25, 2000

  2. Today • Optimization • Scheduling • Assessment

  3. Review Constraint programming is a framework for integrating three families of algorithms • Propagation algorithms • Branching algorithms • Exploration algorithms

  4. S E N D + M O R E = M O N E Y (propagation and branching; search tree): Root: S ∈ {9}, E ∈ {4..7}, N ∈ {5..8}, D ∈ {2..8}, M ∈ {1}, O ∈ {0}, R ∈ {2..8}, Y ∈ {2..8}. Branch on E = 4 / E ≠ 4. Under E ≠ 4: S ∈ {9}, E ∈ {5..7}, N ∈ {6..8}, D ∈ {2..8}, M ∈ {1}, O ∈ {0}, R ∈ {2..8}, Y ∈ {2..8}. Branch on E = 5 / E ≠ 5. Under E = 5 (solution): S ∈ {9}, E ∈ {5}, N ∈ {6}, D ∈ {7}, M ∈ {1}, O ∈ {0}, R ∈ {8}, Y ∈ {2}. Under E ≠ 5: S ∈ {9}, E ∈ {6..7}, N ∈ {7..8}, D ∈ {2..8}, M ∈ {1}, O ∈ {0}, R ∈ {2..8}, Y ∈ {2..8}, with further branching on E = 6 / E ≠ 6.
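
The assignment reached at the E = 5 leaf can be checked quickly; a minimal Python sketch (the helper word_value is ours, purely for illustration):

# Check the assignment found at the E = 5 leaf of the tree above.
digits = {'S': 9, 'E': 5, 'N': 6, 'D': 7, 'M': 1, 'O': 0, 'R': 8, 'Y': 2}

def word_value(word, assignment):
    # Read a word as a decimal number under the given letter-to-digit assignment.
    value = 0
    for letter in word:
        value = 10 * value + assignment[letter]
    return value

assert len(set(digits.values())) == len(digits)        # all letters get distinct digits
assert word_value("SEND", digits) + word_value("MORE", digits) \
       == word_value("MONEY", digits)                  # 9567 + 1085 = 10652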

  5. Using OPL Syntax

enum Letter {S,E,N,D,M,O,R,Y};
var int l[Letter] in 0..9;
solve {
  alldifferent(l);
  l[S] <> 0;
  l[M] <> 0;
  l[S]*1000 + l[E]*100 + l[N]*10 + l[D] +
  l[M]*1000 + l[O]*100 + l[R]*10 + l[E] =
  l[M]*10000 + l[O]*1000 + l[N]*100 + l[E]*10 + l[Y]
};
search {
  forall(i in Letter ordered by increasing dsize(l[i]))
    tryall(v in 0..9)
      l[i] = v;
};

All Solutions; Execution → Run

  6. Today • Optimization • Scheduling • Assessment

  7. Optimization • Modeling: define optimization function • Propagation algorithms: identify propagation algorithms for optimization function • Branching algorithms: identify branching algorithms that lead to good solutions early • Exploration algorithms: extend existing exploration algorithms to achieve optimization

  8. Optimization: Example SEND + MOST = MONEY

  9. SEND + MOST = MONEY Assign distinct digits to the letters S, E, N, D, M, O, T, Y such that S E N D + M O S T = M O N E Y holds and M O N E Y is maximal.

  10. Modeling Formalize the problem as a constraint optimization problem: • Number of variables: n • Constraints: c1,…,cm ⊆ ℕⁿ • Optimization constraints: d1,…,dm : ℕⁿ → 2^(ℕⁿ). Given a solution a and an optimization constraint di, the constraint di(a) ⊆ ℕⁿ contains only those assignments b for which b is better than a.

  11. A Model for MONEY • Number of variables: 8 • Constraints: c1 = {(S,E,N,D,M,O,T,Y) ∈ ℕ⁸ | 0 ≤ S,…,Y ≤ 9} c2 = {(S,E,N,D,M,O,T,Y) ∈ ℕ⁸ | 1000*S + 100*E + 10*N + D + 1000*M + 100*O + 10*S + T = 10000*M + 1000*O + 100*N + 10*E + Y}

  12. A Model for MONEY (continued) • More constraints: c3 = {(S,E,N,D,M,O,T,Y) ∈ ℕ⁸ | S ≠ 0} c4 = {(S,E,N,D,M,O,T,Y) ∈ ℕ⁸ | M ≠ 0} c5 = {(S,E,N,D,M,O,T,Y) ∈ ℕ⁸ | S,…,Y all different} • Optimization constraint: d : (s,e,n,d,m,o,t,y) ↦ {(S,E,N,D,M,O,T,Y) ∈ ℕ⁸ | 10000*m + 1000*o + 100*n + 10*e + y < 10000*M + 1000*O + 100*N + 10*E + Y}
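
Operationally, d(a) is just a test for a strictly larger MONEY value than the one in a. A minimal Python sketch of that reading (function names are illustrative, not part of any library):

def money_value(a):
    # MONEY value of an assignment a, given as a dict over the letters.
    return 10000*a['M'] + 1000*a['O'] + 100*a['N'] + 10*a['E'] + a['Y']

def optimization_constraint(a):
    # d(a): a predicate accepting exactly the assignments b that improve on a.
    bound = money_value(a)
    return lambda b: money_value(b) > bound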

  13. Propagation Algorithms Identify a propagation algorithm to implement the optimization constraints. Example: SEND + MOST = MONEY d : (s,e,n,d,m,o,t,y) ↦ {(S,E,N,D,M,O,T,Y) ∈ ℕ⁸ | 10000*m + 1000*o + 100*n + 10*e + y < 10000*M + 1000*O + 100*N + 10*E + Y} Given a solution a, choose a propagation algorithm for d(a).

  14. Branching Algorithms Identify a branching algorithm that finds good solutions early. Example: SEND + MOST = MONEY Idea: Naïve enumeration in the order M, O, N, E, Y. Try highest values first.

  15. Exploration Algorithms Modify exploration such that for each solution a, corresponding optimization constraints are added. Two strategies: • branch-and-bound: continue as in original exploration • restart optimization: after each solution, start from the root.
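
The restart strategy is easy to state abstractly. Below is a minimal Python sketch, assuming a hypothetical solve(constraints) routine that returns one solution or None, and optimization_constraint(a) building d(a) as on the earlier slide; branch-and-bound differs only in that d(a) is added to the open branches of the current search tree instead of restarting from the root.

def restart_optimization(constraints, optimization_constraint, solve):
    # After each solution a, restart the search with d(a) added to the constraints.
    best = None
    while True:
        a = solve(constraints)                    # full search from the root
        if a is None:
            return best                           # no strictly better solution exists
        best = a
        constraints = constraints + [optimization_constraint(a)]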

  16. Using OPL Syntax

enum Letter {M,O,N,E,Y,S,D,T};
var int l[Letter] in 0..9;
maximize
  l[M]*10000 + l[O]*1000 + l[N]*100 + l[E]*10 + l[Y]
subject to {
  alldifferent(l);
  l[S] <> 0;
  l[M] <> 0;
  l[S]*1000 + l[E]*100 + l[N]*10 + l[D] +
  l[M]*1000 + l[O]*100 + l[S]*10 + l[T] =
  l[M]*10000 + l[O]*1000 + l[N]*100 + l[E]*10 + l[Y]
};
search {
  forall(i in Letter)
    tryall(v in 0..9 ordered by decreasing v)
      l[i] = v;
};

All Solutions; Execution → Run

  17. Experiments with MONEY Demos: MONEY Search, MONEY Optimization

  18. Today • Optimization • Scheduling • Assessment

  19. Scheduling • Scheduling Problems • Propagation Algorithms for Resource Constraints • Branching Algorithms • Other Constraints • Exploration Algorithms • Literature

  20. Scheduling Problems Assign starting times (and sometimes durations) to tasks, subject to • resource constraints, • precedence constraints, • idiosyncratic and other constraints, and • an optimization function, typically to minimize overall schedule duration.

  21. Example: Building a Bridge (Demos: Show Problem, Animate Solution, Gantt Chart) • Resource constraints. Example: a1 and a2 use the excavator and cannot overlap in time. • Precedence constraints. Example: p1 requires a3, so a3 must end before p1 starts.

  22. Modeling • Indices: Task = {begin,a1,a2,a3,a4,a5,a6,p1,...,end} • Constants: duration of tasks duration = #[begin:0, a1:4, a2:2,...]# • Variables: represent each task with a finite domain variable representing its starting time. Activity a[t in Task](duration[t]) a[t].start is a finite domain variable representing the starting time of a[t]
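
For the Python sketches accompanying the following slides, assume a task is a record holding its (fixed) duration and the current bounds on its start: its earliest starting time est and latest completion time lct. This mirrors a[t].start and a[t].duration in OPL, but the class below is purely illustrative.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    duration: int
    est: int        # earliest starting time (lower bound on start)
    lct: int        # latest completion time (upper bound on start + duration)

    @property
    def lst(self):  # latest starting time
        return self.lct - self.duration

    @property
    def ect(self):  # earliest completion time
        return self.est + self.duration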

  23. Precedence Constraints For each two tasks t1, t2, where t1 must precede t2, introduce a constraint a[t1].start + a[t1].duration ≤ a[t2].start In OPL, just write a[t1] precedes a[t2]
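
In terms of the illustrative Task record above, the bound propagation such a constraint performs might look like this sketch:

def propagate_precedence(t1, t2):
    # Enforce t1.start + t1.duration <= t2.start on the current bounds.
    t2.est = max(t2.est, t1.est + t1.duration)   # t2 cannot start before t1 can finish
    t1.lct = min(t1.lct, t2.lct - t2.duration)   # t1 must leave room for t2 to complete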

  24. Resource Constraints For each two tasks t1, t2 that require a unary resource r, we have the constraint a[t1].start + a[t1].duration ≤ a[t2].start \/ a[t2].start + a[t2].duration ≤ a[t1].start But: many constraints and weak propagation. Thus, introduce global resource constraints.

  25. Scheduling • Scheduling Problems • Propagation Algorithms for Resource Constraints • Branching Algorithms • Other Constraints • Exploration Algorithms • Literature

  26. Global Resource Constraints in OPL • Declare resource UnaryResource excavator; • Require resource a[a1] requires excavator; a[a2] requires excavator;... • All “requires” constraints on a resource together form a global resource constraint.

  27. Propagation: Disjunctive Resource For all tasks t1, t2 using the same resource: a[t1].start + a[t1].duration ≤ a[t2].start \/ a[t2].start + a[t2].duration ≤ a[t1].start Weakest but most efficient form of propagation for unary resources.
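
A sketch of this rule on the bounds of the illustrative Task record, reusing propagate_precedence from the precedence sketch: whenever one of the two orderings has become impossible, the other is imposed.

def propagate_disjunctive(t1, t2):
    # t1 before t2  \/  t2 before t1, for two tasks on the same unary resource.
    t1_first_possible = t1.est + t1.duration <= t2.lct - t2.duration
    t2_first_possible = t2.est + t2.duration <= t1.lct - t1.duration
    if not t1_first_possible and not t2_first_possible:
        raise ValueError("resource conflict: neither ordering is feasible")
    if not t2_first_possible:
        propagate_precedence(t1, t2)    # only "t1 before t2" remains
    elif not t1_first_possible:
        propagate_precedence(t2, t1)    # only "t2 before t1" remains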

  28. Propagation: Edge finding For a given task t and a set of tasks S, all sharing the same unary resource, find out whether t can occur • before all tasks in S, • after all tasks in S, • between two tasks in S, and infer corresponding basic constraints.
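
One sound (though deliberately weakened) instance of such a rule, stated on the illustrative Task record: if the tasks of S together with t cannot all finish by lct(S), then t must run after all of S, which on a unary resource lets us raise est(t). This is only a sketch of the idea, not the edge finder OPL actually uses.

def edge_find_after(t, S):
    # If S plus {t} cannot all complete by lct(S), t must run after all of S has finished.
    est_all = min([t.est] + [s.est for s in S])
    total_duration = t.duration + sum(s.duration for s in S)
    lct_S = max(s.lct for s in S)
    if est_all + total_duration > lct_S:
        est_S = min(s.est for s in S)
        t.est = max(t.est, est_S + sum(s.duration for s in S))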

  29. Propagation: Task Intervals Notation: est(t): earliest starting time lct(t): latest completion time For two tasks t1 and t2 where est(t1) ≤ est(t2) and lct(t1) ≤ lct(t2), the task interval I(t1,t2) is defined: I(t1,t2) = { t | est(t1) ≤ est(t) and lct(t) ≤ lct(t2) } Perform edge finding on all task intervals.
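
A sketch of the combination with the illustrative Task record: collect each task interval of one resource and apply the simplified edge-finding rule from the previous sketch to the tasks outside it.

def task_interval(t1, t2, tasks):
    # I(t1,t2): tasks t on the resource with est(t1) <= est(t) and lct(t) <= lct(t2).
    return [t for t in tasks if t1.est <= t.est and t.lct <= t2.lct]

def edge_find_on_task_intervals(tasks):
    # Apply the simplified edge-finding rule to every task interval of one resource.
    for t1 in tasks:
        for t2 in tasks:
            if t1.est <= t2.est and t1.lct <= t2.lct:
                interval = task_interval(t1, t2, tasks)
                for t in tasks:
                    if t.name not in {s.name for s in interval}:
                        edge_find_after(t, interval)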

  30. Task Intervals: Example (Figure: time windows of four tasks A, B, C, D.) I(A,B) = {A,B,C} I(C,D) = {B,C,D}

  31. Which Edge Finder? As usual, there is a trade-off between the run-time of the propagation algorithm and the strength of propagation. Edge finding can be expensive, depending on which tasks are considered. Recent edge finders have complexity O(n²) per propagation step, where n is the number of tasks using the resource. Edge finding based on task intervals can be stronger, but typically has complexity O(n³).

  32. Specifying Propagation Algorithms OPL fixes propagation for unary resources to an (undisclosed) edge-finding algorithm. For discrete (non-unary) resources, the user can choose between default, disjunctive and edgeFinder when declaring a resource: DiscreteResource crane(3) using edgeFinder;

  33. Demo: Propagation for Unary Resources Bridge

  34. Scheduling • Scheduling Problems • Propagation Algorithms for Resource Constraints • Branching Algorithms • Other Constraints • Exploration Algorithms • Literature

  35. Branching Algorithms: Serializers • Simple enumeration techniques such as first-fail are hopeless for scheduling. • Use unary resources to guide branching. • For two tasks t1, t2 sharing the same resource, use the constraints t1.start + t1.duration ≤ t2.start and t2.start + t2.duration ≤ t1.start for branching.
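
A sketch of the branching candidates with the illustrative Task record: a pair still needs a decision as long as neither ordering is already entailed by the bounds (t1 surely ends before t2 can start, or vice versa). A serializer repeatedly picks such a pair and branches on the two precedence constraints.

def unordered_pairs(tasks):
    # Pairs on one unary resource whose relative order is not yet decided;
    # t1 surely precedes t2 once t1.lct <= t2.est (and symmetrically).
    return [(t1, t2)
            for i, t1 in enumerate(tasks) for t2 in tasks[i + 1:]
            if not (t1.lct <= t2.est or t2.lct <= t1.est)]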

  36. Which Tasks To Serialize? • Resource-oriented serialization: Serialize all tasks of one resource completely, before tasks of another resource are serialized. • Most-used-resource serialization • Global slack • Local slack • Task-oriented serialization: Choose two suitable tasks at a time, regardless of resources. • Slack-based task serialization (see later)

  37. Most-used-resource Serialization “Most-used-resource” serialization: serialize first the resource that is used the most. Let T be a set of tasks running on resource r: demand(T) = Σ_{t ∈ T} t.duration. Let S be the set of all tasks using resource r: demand(r) := demand(S). Serialize the resource r with maximal demand(r) first.

  38. Slack-based Resource Serialization Let T be a set of tasks running on resource r: supply(T) = lct(T) - est(T) (largest lct minus smallest est over T), demand(T) = Σ_{t ∈ T} t.duration, slack(T) = supply(T) - demand(T). Let S be the set of all tasks using resource r: slack(r) := slack(S). Global slack serialization: serialize the resource with the smallest slack first.
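
With the illustrative Task record, demand, supply, and slack from this and the previous slide can be computed directly; the sketch below also shows the resource-selection step (the mapping tasks_by_resource is an assumed input, resource name to its task list).

def demand(tasks):
    # Total processing time requested by a set of tasks.
    return sum(t.duration for t in tasks)

def supply(tasks):
    # Width of the time window available to a set of tasks.
    return max(t.lct for t in tasks) - min(t.est for t in tasks)

def slack(tasks):
    return supply(tasks) - demand(tasks)

def resource_with_smallest_global_slack(tasks_by_resource):
    # Global slack serialization: pick the resource whose task set has minimal slack.
    return min(tasks_by_resource, key=lambda r: slack(tasks_by_resource[r]))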

  39. Local-Slack Resource Serialization Let Ir be the set of all task intervals on r. The local slack is defined as min { slack(I) | I ∈ Ir }. Local slack serialization: serialize the resource with the smallest local slack first. Use global slack for tie-breaking.

  40. Which Tasks Are Serialized? Ideas: • Use edge finding to look for possible first tasks (and last tasks) • Choose tasks according to their est, lst (lct, ect).

  41. Task-oriented Serialization Among all tasks on all resources, select a pair of tasks according to local/global slack criteria and other considerations, regardless of which resources have already been serialized. This provides more fine-grained control, at the expense of the runtime needed to find candidate task pairs among all tasks.

  42. Programming Branching Algorithms in OPL • Scheduling-specific try constructs: • tryRankFirst(u,a): activity a comes first • tryRankLast(u,a): activity a comes last • rank(u): serialize all tasks using u • Reflective functions for unary resources: • isRanked(Unary): 1 if the resource is ranked • nbPossibleFirst(Unary): number of activities that can come first • globalSlack(Unary): global slack of u • localSlack(Unary): local slack of u

  43. Example: Task-oriented Serialization in OPL

while not isRanked(tool) do
  select(r in Resources : not isRanked(tool[r]))
    select(t in tasks[r] : not isRanked(tool[r], a[t]))
      tryRankFirst(tool[r], a[t]);

  44. Demo: Serialization Algorithms Bridge

  45. Scheduling • Scheduling Problems • Propagation Algorithms for Resource Constraints • Branching Algorithms • Other Constraints • Exploration Algorithms • Literature

  46. Other Scheduling Constraints • Discrete resources DiscreteResource crane(3); a requires(2) crane; a consumes(1) crane; • Reservoirs Reservoir plumbing(3,1); a requires(2) plumbing; a consumes(2) plumbing; a provides(2) plumbing; a produces(2) plumbing;
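
The intended semantics of requires on a discrete resource is a capacity limit at every point in time. A minimal Python feasibility check of a fixed schedule, using the illustrative Task record (this only illustrates the semantics; it is not how OPL propagates these constraints):

def discrete_resource_feasible(requirements, capacity, start):
    # requirements: list of (task, units) pairs; start: dict task name -> start time.
    # Usage only increases when a task starts, so checking start times is sufficient.
    for time in {start[t.name] for t, _ in requirements}:
        used = sum(units for t, units in requirements
                   if start[t.name] <= time < start[t.name] + t.duration)
        if used > capacity:
            return False
    return True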

  47. Exploration Algorithms • Usually: branch-and-bound using minimization of the starting time of an “end” task minimize a[end].start subject to {...} • Lower-bounding: prove the non-existence of a solution with a[end].start ≤ n, add the constraint a[end].start > n; increase n by δ. • Upper-bounding: find a solution with a[end].start = n, add the constraint a[end].start < n; decrease n by δ.
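
A sketch of the upper-bounding loop in Python, assuming a hypothetical routine solve_with_bound(n) that returns a schedule whose end task starts strictly before n (as a dict exposing that start time under the illustrative key "end_start"), or None if no such schedule exists:

def upper_bounding(solve_with_bound):
    # Repeatedly demand a strictly smaller a[end].start until failure proves optimality.
    best = solve_with_bound(float("inf"))
    while best is not None:
        better = solve_with_bound(best["end_start"])
        if better is None:
            return best          # no schedule ends earlier: best is optimal
        best = better
    return None                  # the problem has no schedule at all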

  48. Literature • Van Hentenryck: The OPL optimization programming language, 1999 • Constraint Programming Tutorial of Mozart, 2000 (www.mozart-oz.org) • Applegate, Cook: A computational study of the job-shop scheduling problem, 1991 • Carlier, Pinson: An algorithm for solving the job-shop scheduling problem, 1989 • Various papers by Laburthe, Caseau, Baptiste, Le Pape, Nuijten, see “Overview” paper

  49. Today • Optimization • Scheduling • Assessment

  50. Assessment: Don’t Use It! Don’t use constraint programming for: • Problems for which there are known efficient algorithms or heuristics. Example: Traveling salesman. • Problems for which integer programming works well. Example: Many discrete assignment problems. • Problems with weak constraints and a complex optimization function. Example: Timetabling problems.
