
Presentation Transcript


  1. AMA522 SCHEDULING Set # 3 Dr. LEE Heung Wing Joseph Email : majlee@polyu.edu.hk Phone: 2766 6951 Office : HJ639

  2. Recall Dynamic Programming and 1 || ΣTj, Algorithm 3.4.4. Dynamic programming procedure: recursively, the optimal solution for some job set J starting at time t is determined from the optimal solutions to subproblems defined by job subsets S* ⊆ S with start times t* ≥ t. J(j, l, k) contains all the jobs in the set {j, j+1, ..., l} with processing time pj' ≤ pk. V(J(j, l, k), t) is the total tardiness of this subset under an optimal sequence if the subset starts at time t.

  3. Initial conditions: V(∅, t) = 0, V({j}, t) = max(0, t + pj - dj). Recursive conditions: V(J(j, l, k), t) = min over δ of [ V(J(j, k'+δ, k'), t) + max(0, Ck'(δ) - dk') + V(J(k'+δ+1, l, k'), Ck'(δ)) ], where k' is such that pk' = max( pj' | j' ∈ J(j, l, k) ) and Ck'(δ) = t + Σ{ pj' : j' ∈ J(j, k'+δ, k') } + pk'. The optimal value is obtained as V({1, ..., n}, 0).
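To make the recursion concrete, here is a small memoized Python sketch (not from the slides; the function name is made up, and the tie-breaking rule for jobs with equal longest processing time, namely taking the one with the latest due date, is an implementation choice):

```python
from functools import lru_cache

def total_tardiness_1_sumTj(p, d):
    """Memoized decomposition DP for 1 || sum T_j, in the spirit of
    Algorithm 3.4.4 (Lawler's pseudopolynomial algorithm).
    p, d: lists of processing times and due dates (illustrative input names)."""
    order = sorted(range(len(p)), key=lambda j: d[j])   # re-index jobs in EDD order
    p = [p[j] for j in order]
    d = [d[j] for j in order]

    @lru_cache(maxsize=None)
    def V(jobs, t):
        # jobs: sorted tuple of (EDD) job indices; t: start time of this subset
        if not jobs:
            return 0
        if len(jobs) == 1:
            j = jobs[0]
            return max(0, t + p[j] - d[j])
        # k: job with the largest processing time (ties: latest due date)
        k = max(jobs, key=lambda j: (p[j], j))
        best = float("inf")
        # k is preceded by all jobs with index <= split (except k itself)
        # and followed by all jobs with index > split, for some split >= k
        for split in [k] + [j for j in jobs if j > k]:
            before = tuple(j for j in jobs if j <= split and j != k)
            after = tuple(j for j in jobs if j > split)
            C_k = t + sum(p[j] for j in before) + p[k]   # completion time of k
            best = min(best,
                       V(before, t) + max(0, C_k - d[k]) + V(after, C_k))
        return best

    return V(tuple(range(len(p))), 0)

# Small made-up instance:
# print(total_tardiness_1_sumTj([4, 3, 5, 2], [5, 6, 7, 9]))
```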

  4. Recall Example

  5. V(J(1, 4, 3), 0) = 0, achieved with the sequences 1, 2, 4 and 2, 1, 4.

  6. Rough estimate of the worst-case computation. The worst-case computation time required by this algorithm can "roughly" be established as follows. There are at most O(n³) subsets J(j, l, k): choose the job "k" out of the n jobs first, then choose a pair (j, l) with j < l out of the remaining n - 1 jobs, which can be done in (n - 1)(n - 2)/2 ways; so there are at most n(n - 1)(n - 2)/2 subsets. There are at most Σpj points in time t. There are therefore at most O(n³ Σpj) recursive equations to be solved in the dynamic programming algorithm.

  7. As each recursive equation takes O(n) time, the overall running time of the algorithm is bounded by O(n⁴ Σpj). This bound is polynomial in n but also depends on Σpj, so the algorithm is pseudopolynomial: O(n⁴ Σpj). Suppose there are two scheduling problems 1 || ΣTj with the same number of jobs n and the same due dates. Does it make sense to say that the one with the larger completion times will have a larger upper bound on the computation time?
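As a back-of-the-envelope restatement of the counting argument above (not an exact operation count):

```latex
\[
\underbrace{\tfrac{n(n-1)(n-2)}{2}}_{\text{subsets } J(j,l,k)\,=\,O(n^3)}
\times
\underbrace{\textstyle\sum_{j=1}^{n} p_j}_{\text{possible start times } t}
= O\Big(n^3 \textstyle\sum_j p_j\Big)\ \text{recursive equations},
\qquad
O\Big(n^3 \textstyle\sum_j p_j\Big)\times
\underbrace{O(n)}_{\text{work per equation}}
= O\Big(n^4 \textstyle\sum_j p_j\Big).
\]
```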

  8. Lemma Consider the problem 1 || ΣTj with n jobs. The jobs can be scheduled with zero total tardiness if and only if the EDD schedule results in zero total tardiness. Proof Since the smallest possible value ΣTj can take is zero, if the EDD schedule results in zero total tardiness, then these n jobs can be scheduled with zero total tardiness. Conversely, suppose, without loss of generality, d1 ≤ d2 ≤ ... ≤ dn, and that the jobs can be scheduled so that ΣTj = 0, i.e. Tj = 0 for all j = 1, 2, ..., n. Let j < k, so dj ≤ dk.

  9. Suppose job k is scheduled before job j. The tardiness of jobs k and j: Tk = max(Ck - dk, 0) = 0 and Tj = max(Cj - dj, 0) = 0. So, Ck ≤ dk and Cj ≤ dj.

  10. But dj ≤ dk. Therefore, if we swap jobs j and k, the tardiness would still be 0.

  11. So, we can keep swapping any pair of jobs k and j with k > j and k before j in the schedule (so that job j is processed before k) until we have the EDD schedule, without any increase in the (zero) total tardiness. □ Alternatively, we can use Lawler's Algorithm for 1 || hmax and Theorem 3.2.2 to get more insight. Since hmax = max( h1(C1), ..., hn(Cn) ), where the hj are non-decreasing cost functions, let hj = Tj = max(Cj - dj, 0). Thus, hmax is the maximum tardiness Tmax. It can be shown that Lawler's Algorithm then results in the EDD rule.

  12. Recall Lawler's Algorithm for 1 || hmax. Step 1. J = ∅, Jc = {1, ..., n}, k = n. Step 2. Let j* be a job in Jc for which hj*( Σ{ pj' : j' ∈ Jc } ) = min{ hj( Σ{ pj' : j' ∈ Jc } ) : j ∈ Jc }. Place j* in J in the k-th position. Delete j* from Jc. Step 3. If Jc = ∅ then Stop, else set k = k - 1 and go to Step 2.
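A minimal Python sketch of these steps for the case without precedence constraints (function and variable names are illustrative, not from the slides):

```python
def lawler_hmax(p, h):
    """Backward (Lawler) algorithm for 1 || h_max without precedence constraints.
    p: processing times; h: list of non-decreasing cost functions h_j(C_j).
    Returns the job sequence (0-based indices)."""
    n = len(p)
    remaining = set(range(n))            # J^c
    schedule = [None] * n                # J, filled from the last position backwards
    t = sum(p[j] for j in remaining)     # completion time of whichever job goes last
    for k in range(n - 1, -1, -1):
        j_star = min(remaining, key=lambda j: h[j](t))   # cheapest job to finish at t
        schedule[k] = j_star
        remaining.remove(j_star)
        t -= p[j_star]
    return schedule

# With h_j(C) = max(C - d_j, 0), i.e. h_max = T_max, on this made-up instance:
p = [3, 2, 4]
d = [6, 4, 10]
h = [lambda C, dj=dj: max(C - dj, 0) for dj in d]
print(lawler_hmax(p, h))    # -> [1, 0, 2], the EDD order for these due dates
```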

  13. 1 || Tmax is the special case of 1 | prec | hmax where hj = Tj = max(Cj - dj, 0) and there are no precedence constraints. Lawler's algorithm then results in the schedule that orders jobs in increasing order of their due dates, i.e. the earliest due date first rule (EDD). Thus, the EDD rule minimizes the maximum tardiness Tmax. Therefore, if the jobs can be scheduled with zero total tardiness ΣTj = 0, then some schedule has zero maximum tardiness Tmax = 0, and since EDD minimizes Tmax, the EDD rule also results in zero tardiness.
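The lemma gives a simple feasibility test: to decide whether all jobs can be completed on time, it suffices to simulate the EDD sequence. A minimal sketch (function and variable names are illustrative):

```python
def zero_total_tardiness_possible(p, d):
    """By the lemma above, the jobs can be scheduled with zero total
    tardiness if and only if the EDD sequence has zero total tardiness."""
    t = 0
    for j in sorted(range(len(p)), key=lambda j: d[j]):   # EDD order
        t += p[j]
        if t > d[j]:            # job j would be tardy under EDD
            return False
    return True

# print(zero_total_tardiness_possible([2, 3, 1], [3, 9, 6]))   # made-up data
```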

  14. Let Tj(EDD) be the tardiness of job j under the EDD schedule, Tmax(EDD) = maxj{ Tj(EDD) }, and let Tj(opt) be the tardiness of job j under the optimal schedule in the sense of 1 || ΣTj. Lemma Suppose Proof Let k be the job for which, under (EDD), Tk(EDD) = Tmax(EDD). Let λ be the last job of the (opt) schedule.

  15. Three cases to consider: Case I: λ ≤ k. Case II: k < λ, and there exists δ (δ < k) such that, under (opt), job δ is scheduled after job k but before λ; let δ* be the last such job in the (opt) schedule, so that no job with a job number larger than k is scheduled after δ*. Case III: k < λ, and no job with a job number larger than k is scheduled after job k in (opt).

  16. Case I: If λ ≤ k, so dλ ≤ dk, then

  17. Case II: k < λ, and there exists δ (δ < k) such that, under (opt), job δ is scheduled after job k but before λ. Let δ* be the last such job in the (opt) schedule, so that no job with a job number larger than k is scheduled after δ*.

  18. Case III: k < λ, and no job with a job number larger than k is scheduled after job k in (opt).

  19. Let Tj(EDD) be the tardiness of job j under the EDD schedule, Tmax(EDD) = maxj{ Tj(EDD) }, and let Tj(opt) be the tardiness of job j under the optimal schedule in the sense of 1 || ΣTj. Lemma

  20. Lemma Suppose sequence S minimizes ΣTj for the problem 1 || ΣTj with processing times pj and due dates dj. Then sequence S also minimizes the rescaled scheduling problem with processing times Kpj and rescaled due dates Kdj, for some positive constant K. Proof Consider the total tardiness of the rescaled problem. Clearly, minimizing the original total tardiness and the rescaled total tardiness are equivalent.
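Spelling out the step the proof relies on: under a fixed sequence, multiplying every processing time by K multiplies every completion time Cj by K, so the rescaled total tardiness is

```latex
\[
\sum_{j=1}^{n} \max\bigl(0,\; K C_j - K d_j\bigr)
\;=\; K \sum_{j=1}^{n} \max\bigl(0,\; C_j - d_j\bigr),
\]
```

which is K times the original total tardiness; hence a sequence minimizes one if and only if it minimizes the other.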

  21. Lemma Consider two scheduling problems of minimizing total tardiness. The first problem has processing times pi and due dates di, whereas the second problem has processing times qi and the same due dates di. Let pi ≤ qi for all i. Then the optimal total tardiness of the first problem is less than or equal to the optimal total tardiness of the second problem. Proof Consider the total tardiness of an optimal solution to the second problem:

  22. Note that, since pi ≤ qi for all i, if the first problem is sequenced according to the sequence that is optimal for the second problem, every completion time in the first problem is at most the corresponding completion time in the second problem.
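In symbols, writing Cj(p) and Cj(q) for the completion times of job j under that fixed sequence with processing times p and q respectively (notation introduced here for clarity):

```latex
\[
\sum_j T_j^{\mathrm{opt}}(p)
\;\le\; \sum_j \max\bigl(0,\, C_j(p) - d_j\bigr)
\;\le\; \sum_j \max\bigl(0,\, C_j(q) - d_j\bigr)
\;=\; \sum_j T_j^{\mathrm{opt}}(q),
\]
```

where the first inequality holds because the optimum of the first problem is at most the cost of any particular sequence, and the second holds term by term since Cj(p) ≤ Cj(q).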

  23. Lemma (Exercise 4.11) Consider the problem 1 | dj=d | ΣEj + ΣTj, where Ej = max{dj - Cj, 0} is the earliness of job j. In an optimal schedule, the n jobs are processed without interruption (i.e. there is no unforced idleness in between the processing of the jobs). Proof Suppose sequence S is optimal but there is a processing gap between two jobs j and k, where job j is processed immediately before the time gap and job k immediately after it. Three cases to consider:

  24. Let J1 be the set of jobs processed before the time gap, and J2 the set of jobs processed after the time gap. Let t0 be the time at which the first job starts, and t1 be the length of the time gap. Case I: If the time gap is beyond d.

  25. Case II: If the time gap is before d.

  26. Case III: If the time gap covers (includes) d.

  27. Case I: If the time gap is beyond d. Compare the cost of the jobs processed before the time gap with the cost of the jobs processed after the time gap. Since the time gap is beyond d, the jobs after the gap still start at or after d when the gap is removed, so shifting them earlier cannot create earliness, while the cost of the jobs before the gap is unaffected. Therefore, reducing t1 to zero reduces the total cost. So the original sequence S is not optimal.

  28. Cost for the jobs processed after the time gap.

  29. Case II: If the time gap is before d. Compare the cost of the jobs processed before the time gap with the cost of the jobs processed after the time gap. Since the time gap is before d, the jobs before the gap all complete before d, so shifting them later by increasing t0 to (t0 + t1) only reduces their earliness, while the jobs after the gap are unaffected. Therefore, increasing t0 to (t0 + t1) and reducing t1 to zero reduces the total cost. So the original sequence S is not optimal.

  30. Cost for the jobs processed before the time gap: decreased. Cost for the jobs processed after the time gap: unchanged.

  31. Case III: If the time gap covers (includes) d. Let α be the length of the part of the gap before d, and β the length of the part after d (so t1 = α + β).

  32. Hence, increasing t0 to (t0 + α) would make the last job before the gap complete exactly at time d, which only reduces the earliness of the jobs before the gap, while reducing t1 to zero then shifts the jobs after the gap earlier without making any of them early. Therefore, increasing t0 to (t0 + α) and reducing t1 to zero reduces the total cost. So the original sequence S is not optimal. Since all three cases contradict the optimality of S, an optimal sequence has no unforced idleness. □

  33. Lemma Consider the problem 1 | dj=d | ΣEj + ΣTj. In an optimal schedule with n ≥ 3 jobs, there are two sets of jobs: the set of early jobs, J1 ≠ ∅, and the set of late jobs, J2 ≠ ∅. Proof Suppose that in an optimal sequence all the jobs are scheduled early. Let j be the earliest job. Total cost:

  34. Suppose d - t0 > 2pj. Thus, |t0 + pj - d| > |pj|. Now re-schedule job j to start at d, and leave the starting times of all other jobs unchanged. So, for the remaining jobs, J1 becomes J1\{j}, and t0 becomes t0 + pj.

  35. The new total cost is smaller, since the earliness |t0 + pj - d| of job j is replaced by a tardiness of only pj, and the cost of the other jobs is unchanged. So, by contradiction, the original sequence in which all jobs are early is not optimal. What if d - t0 < 2pj?

  36. Similarly, we can show by contradiction that a sequence in which all the jobs are late is not optimal. □

  37. Lemma 4.2.1 Consider the problem 1 | dj=d | ΣEj + ΣTj. In an optimal schedule, the early jobs, set J1, are scheduled according to LPT, and the late jobs, set J2, are scheduled according to SPT. Proof (Exercise 4.12) Observe that the jobs in J1 do not contribute to ΣTj and the jobs in J2 do not contribute to ΣEj. Let |J1| be the number of jobs in J1 and |J2| the number of jobs in J2. Also observe that, for J1, the only cost contribution is ΣEj = Σ max{dj - Cj, 0} = |J1|d - ΣCj (summing over the jobs in J1), since all these jobs are early.

  38. Similarly, for J2, the only cost contribution is ΣTj = Σ max{Cj - dj, 0} = ΣCj - |J2|d (summing over the jobs in J2), since all these jobs are late. Thus, for the late jobs (the jobs in J2) we try to minimize the total completion time ΣCj; hence, among the jobs in J2, we use SPT (Theorem 3.1.1). What is left is to show that, among the early jobs (the jobs in J1), the LPT rule minimizes -ΣCj, the negative total completion time, or equivalently that LPT maximizes the total completion time. This is left as an exercise to you.

  39. Lemma 4.2.2 Consider 1 | dj=d | ΣEj + ΣTj. There exists an optimal schedule in which one job is completed exactly at time d. Proof Suppose there is no such optimal schedule. Then in an optimal schedule there exists one job that starts its processing before d and completes its processing after d; call this job j*. Let α be the part of the processing time of j* that falls before d, and β the part that falls after d.

  40. Let |J1| be the number of early jobs and |J2| the number of late jobs. If |J1| < |J2|, then shift the entire schedule to the left by β, so that job j* completes its processing exactly at time d.

  41. The total tardiness is decreased by β[|J2| - 1] and the total earliness is increased by β|J1|. But |J1| < |J2|, i.e. |J1| ≤ |J2| - 1, thus β|J1| ≤ β[|J2| - 1], so the total cost is decreased. If |J1| > |J2|, then shift the entire schedule to the right by α, so that job j* starts its processing exactly at time d.

  42. The total tardiness is increased by α|J2| and the total earliness is decreased by α[|J1| - 1]. But |J1| > |J2|, i.e. |J1| - 1 ≥ |J2|, thus α[|J1| - 1] ≥ α|J2|, so the total cost is decreased. If |J1| = |J2|, then there are many optimal schedules, of which only two satisfy the property stated in the Lemma. Why? This is left as an exercise to you. Assuming

  43. Assuming
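Putting Lemma 4.2.1 and Lemma 4.2.2 together, a schedule for 1 | dj=d | ΣEj + ΣTj is determined by the split of the jobs into an early set and a late set: the early set is sequenced by LPT so that its last job completes exactly at d, and the late set follows in SPT order. The sketch below builds and costs such a schedule for a given split; the function and variable names are illustrative, the split itself (which the lemmas do not determine) is taken as input, and it is assumed that the early set fits before d, i.e. its total processing time is at most d.

```python
def v_shaped_schedule(p, d, early, late):
    """Build the schedule suggested by Lemmas 4.2.1 and 4.2.2 for a given
    split into an early set J1 and a late set J2:
    J1 in LPT order finishing exactly at the common due date d,
    followed by J2 in SPT order starting at d.
    Returns (list of (job, start, completion), total earliness + tardiness).
    Assumes sum of processing times of 'early' is at most d."""
    seq = sorted(early, key=lambda j: p[j], reverse=True) + \
          sorted(late, key=lambda j: p[j])
    t = d - sum(p[j] for j in early)            # last early job completes at d
    out, cost = [], 0
    for j in seq:
        c = t + p[j]
        out.append((j, t, c))
        cost += max(d - c, 0) + max(c - d, 0)   # E_j + T_j = |C_j - d|
        t = c
    return out, cost

# Made-up example: 4 jobs, common due date 10, early = {0, 1}, late = {2, 3}
# print(v_shaped_schedule([4, 3, 5, 2], 10, [0, 1], [2, 3]))
```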
