
Flow Shop Scheduling



  1. Flow Shop Scheduling

  2. Definitions • A flow shop contains m different machines. • Each job consists of m operations, one on each machine. • The flow of work is unidirectional. • The machines in a flow shop are numbered 1, 2, …, m. • The operations of job i are (i,1), (i,2), (i,3), …, (i,m). • If job i is not processed by machine k, then p(i,k) = 0.

  3. Flow Shop Scheduling (Baker p.136) In a flow shop, every job visits the machines in the same order 1, 2, …, m. If, in addition, the processing sequence on each machine is the same, the schedule is a permutation schedule, so a flow shop has n! permutation schedules to consider. In a job shop, each machine may process the jobs in its own order (a routing problem), giving up to n! · n! · … · n! = (n!)^m possible sequences.

  4. Workflow in a flow shop (Baker p.137) • Type 1: a pure flow shop, where all jobs enter at machine 1, pass through machines 1, 2, …, M in order, and exit after machine M. • Type 2: a general flow shop, where jobs may enter and leave the system at any machine, but the flow between machines is still unidirectional, from lower-numbered to higher-numbered machines.

  5. Johnson’s Rule (Baker p.142) Note: Johnson’s rule finds an optimal schedule for the two-machine flow shop makespan problem: job j precedes job k in an optimal sequence if min{t(j,1), t(k,2)} ≤ min{t(k,1), t(j,2)}. Equivalently, jobs with t(j,1) < t(j,2) come first in nondecreasing order of t(j,1), followed by the remaining jobs in nonincreasing order of t(j,2).
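The two-machine rule can be sketched in Python as follows (function and variable names are illustrative, not from Baker):

```python
def johnsons_rule(jobs):
    """Johnson's rule for the two-machine flow shop makespan problem.

    `jobs` maps a job id to (t1, t2), its processing times on machines
    1 and 2.  Jobs with t1 < t2 go first, sorted by ascending t1; the
    remaining jobs go last, sorted by descending t2.
    """
    first = sorted((j for j, (t1, t2) in jobs.items() if t1 < t2),
                   key=lambda j: jobs[j][0])
    last = sorted((j for j, (t1, t2) in jobs.items() if t1 >= t2),
                  key=lambda j: jobs[j][1], reverse=True)
    return first + last

def makespan2(jobs, seq):
    """Makespan of a job sequence on the two machines."""
    c1 = c2 = 0
    for j in seq:
        t1, t2 = jobs[j]
        c1 += t1                  # machine 1 is never idle
        c2 = max(c1, c2) + t2     # machine 2 waits for machine 1 if needed
    return c2
```

For example, `johnsons_rule({1: (3, 6), 2: (5, 2), 3: (1, 2)})` returns the sequence `[3, 1, 2]` with makespan 12.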

  6. Ex.

  7. Ex. Johnson’s rule gives the sequence 3-1-4-5-2; on the two-machine Gantt chart the last job completes on machine M2 at time 24, so the makespan is 24.

  8. The B&B for Makespan Problem: The Ignall-Schrage Algorithm (Baker p.149) • A lower bound on the makespan associated with any completion of a partial sequence σ is obtained by considering the work remaining on each machine. To illustrate the procedure for m = 3, let σ′ denote the set of unscheduled jobs, and for a given partial sequence σ let q1 = the latest completion time on machine 1 among jobs in σ, q2 = the latest completion time on machine 2 among jobs in σ, q3 = the latest completion time on machine 3 among jobs in σ. The amount of processing yet required of machine 1 is Σ_{j∈σ′} t(j,1).

  9. The Ignall-Schrage Algorithm In the most favorable situation, the last job • encounters no delay between the completion of one operation and the start of its direct successor, and • has the minimal sum t(j,2) + t(j,3) among jobs j ∈ σ′. Hence one lower bound on the makespan is b1 = q1 + Σ_{j∈σ′} t(j,1) + min_{j∈σ′} [t(j,2) + t(j,3)]. A second lower bound, based on machine 2, is b2 = q2 + Σ_{j∈σ′} t(j,2) + min_{j∈σ′} t(j,3). A lower bound based on machine 3 is b3 = q3 + Σ_{j∈σ′} t(j,3). The lower bound proposed by Ignall and Schrage is B = max{b1, b2, b3}.

  10. The Ignall-Schrage Algorithm (Gantt-chart illustration: the jobs of σ occupy machines 1-3 through times q1, q2, and q3; packing the remaining work of σ′ with no idle time after q1 illustrates bound b1, and packing it after q2 illustrates bound b2.)

  11. Ex. B&B, m = 3. For the first node: σ = (1).

  12. Ex. Gantt chart for σ = (1, 2): on machine 1, job 1 completes at 3 and job 2 at 14; on machine 2, job 1 completes at 7 and job 2 at 15; on machine 3, job 1 completes at 17 and job 2 at 22.

  13. Ex. B&B tree: from the root P0, branch on the first job: 1xxx (B=37), 2xxx (B=45), 3xxx (B=46), 4xxx (B=52). Expanding 1xxx: 12xx (B=45), 13xx (B=39), 14xx (B=45). Expanding 13xx: 132x (B=45), 134x (B=39).
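The full branch-and-bound search illustrated by this tree can be put together as a self-contained sketch for m = 3 (the depth-first control flow and all names here are mine, not Baker's; the bound is the B = max{b1, b2, b3} described above):

```python
def solve(times):
    """Branch and bound over permutation schedules, 3-machine flow shop.

    `times[j]` = (t(j,1), t(j,2), t(j,3)).  Returns (makespan, sequence).
    """
    jobs = list(times)

    def completion(seq):
        # machine completion times q1, q2, q3 of a (partial) sequence
        q = [0, 0, 0]
        for j in seq:
            q[0] += times[j][0]
            q[1] = max(q[0], q[1]) + times[j][1]
            q[2] = max(q[1], q[2]) + times[j][2]
        return q

    def bound(seq):
        # Ignall-Schrage lower bound B = max(b1, b2, b3)
        q = completion(seq)
        rest = [j for j in jobs if j not in seq]
        if not rest:
            return q[2]
        b1 = q[0] + sum(times[j][0] for j in rest) \
                  + min(times[j][1] + times[j][2] for j in rest)
        b2 = q[1] + sum(times[j][1] for j in rest) \
                  + min(times[j][2] for j in rest)
        b3 = q[2] + sum(times[j][2] for j in rest)
        return max(b1, b2, b3)

    best = (float("inf"), None)

    def dfs(seq):
        nonlocal best
        if len(seq) == len(jobs):
            ms = completion(seq)[2]
            if ms < best[0]:
                best = (ms, seq)
            return
        for j in jobs:
            # prune any branch whose bound cannot beat the incumbent
            if j not in seq and bound(seq + [j]) < best[0]:
                dfs(seq + [j])

    dfs([])
    return best
```

On a small hand-checkable instance such as `{1: (3, 2, 4), 2: (1, 4, 2), 3: (2, 2, 3)}` this returns the optimal sequence 3-1-2 with makespan 13.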

  14. Refined Bounds The use of q2 in the calculation of b2 ignores the possibility that the start time of a job on machine 2 may be constrained by commitments on machine 1. Hence: Modification 1: take this induced idle time on machine 2 into account when computing the bound.

  15. Refined Bounds Modification 2 (McMahon and Burton): the previous bounds are machine-based bounds; Modification 2 adds a job-based bound.

  16. Refined Bounds Obviously B′ ≥ B. This means that the combination of machine-based and job-based bounds represented by B′ leads to a more efficient search of the branching tree, in the sense that fewer nodes are created.

  17. Hw. Consider the following four-job, three-machine problem. • Find the minimum makespan using the basic Ignall-Schrage algorithm, and count the nodes generated by the branching process. • Find the minimum makespan using the modified algorithm.

  18. Heuristic Approaches Traditional B&B: • The computational requirements become severe for large problems. • Even for relatively small problems, there is no guarantee that the solution can be obtained quickly. Heuristic approaches: • They can obtain solutions to large problems with limited computational effort. • Their computational requirements are predictable for a problem of a given size.

  19. Palmer Palmer proposed the calculation of a slope index s_j for each job; one common form is s_j = Σ_{k=1}^{m} (2k − m − 1) t(j,k). A permutation schedule is then constructed by sequencing the jobs in nonincreasing order of s_j, so that jobs whose processing times tend to increase from machine to machine come first.
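A minimal sketch of the slope-index heuristic (the index appears in slightly different but order-equivalent forms across textbooks; the weight 2k − m − 1 used here is one such form):

```python
def palmer_sequence(times, m):
    """Palmer's slope-index heuristic.

    `times[j]` is the tuple (t(j,1), ..., t(j,m)).  Jobs are sequenced
    in nonincreasing order of the slope index s_j.
    """
    def slope(j):
        # s_j = sum over k of (2k - m - 1) * t(j,k), k = 1..m
        return sum((2 * k - m - 1) * times[j][k - 1] for k in range(1, m + 1))
    return sorted(times, key=slope, reverse=True)
```

For m = 2 the index reduces to s_j = t(j,2) − t(j,1), so jobs with short first operations and long second operations are placed first, in the spirit of Johnson's rule.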

  20. Gupta Gupta reasoned that a transitive job ordering of the following form would produce good schedules: job j precedes job k if s_j ≥ s_k, where, for m = 3, s_j = e_j / min{t(j,1) + t(j,2), t(j,2) + t(j,3)}, and e_j = 1 if t(j,1) < t(j,3), e_j = −1 otherwise.

  21. Gupta Generalizing from this structure, Gupta proposed that for m > 3 the job index be calculated as s_j = e_j / min_{1≤k≤m−1} [t(j,k) + t(j,k+1)], where e_j = 1 if t(j,1) < t(j,m) and e_j = −1 otherwise.
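A sketch of Gupta's index for general m. Note that the sign and tie-breaking conventions for e_j vary between sources; the version assumed here uses e_j = +1 when t(j,1) < t(j,m) and sequences jobs in nonincreasing order of s_j:

```python
def gupta_sequence(times, m):
    """Gupta's index heuristic (one common convention).

    `times[j]` is the tuple (t(j,1), ..., t(j,m)).  Jobs are sequenced
    in nonincreasing order of s_j = e_j / min_k [t(j,k) + t(j,k+1)].
    """
    def index(j):
        # e_j = +1 if the first operation is shorter than the last
        e = 1 if times[j][0] < times[j][m - 1] else -1
        # denominator: minimum sum of adjacent operation times
        denom = min(times[j][k] + times[j][k + 1] for k in range(m - 1))
        return e / denom
    return sorted(times, key=index, reverse=True)
```

For m = 2 this ordering mimics Johnson's rule: jobs with t(j,1) < t(j,2) receive positive indices and come first, roughly in order of increasing total processing time.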

  22. CDS Its strength lies in two properties: • It uses Johnson’s rule in a heuristic fashion. • It generally creates several schedules from which a “best” schedule can be chosen. The CDS algorithm corresponds to a multistage use of Johnson’s rule applied to a new two-machine problem, derived from the original, with processing times t′(j,1) and t′(j,2). At stage 1, t′(j,1) = t(j,1) and t′(j,2) = t(j,m).

  23. CDS In other words, Johnson’s rule is applied to the first and mth operations, and intermediate operations are ignored. At stage 2, t′(j,1) = t(j,1) + t(j,2) and t′(j,2) = t(j,m−1) + t(j,m). That is, Johnson’s rule is applied to the sums of the first two and last two operation processing times. In general, at stage i (i = 1, …, m − 1), t′(j,1) = Σ_{k=1}^{i} t(j,k) and t′(j,2) = Σ_{k=m−i+1}^{m} t(j,k); the best of the m − 1 resulting sequences, by true m-machine makespan, is retained.
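The stage construction can be sketched as follows (a self-contained illustration with the inner Johnson and makespan routines inlined; all names are mine):

```python
def cds_sequence(times, m):
    """Campbell-Dudek-Smith heuristic.

    `times[j]` is the tuple (t(j,1), ..., t(j,m)).  At stage i a
    surrogate two-machine problem is formed from the first i and last
    i operation times, solved with Johnson's rule, and the sequence
    with the best true m-machine makespan is kept.
    """
    def johnson(two):
        # Johnson's rule on a {job: (t1, t2)} surrogate problem
        first = sorted((j for j in two if two[j][0] < two[j][1]),
                       key=lambda j: two[j][0])
        last = sorted((j for j in two if two[j][0] >= two[j][1]),
                      key=lambda j: two[j][1], reverse=True)
        return first + last

    def makespan(seq):
        # true makespan of `seq` on all m machines
        q = [0] * m
        for j in seq:
            q[0] += times[j][0]
            for k in range(1, m):
                q[k] = max(q[k - 1], q[k]) + times[j][k]
        return q[-1]

    best_seq, best_ms = None, float("inf")
    for i in range(1, m):
        # stage i: t'(j,1) = sum of first i times, t'(j,2) = sum of last i
        two = {j: (sum(times[j][:i]), sum(times[j][m - i:])) for j in times}
        seq = johnson(two)
        ms = makespan(seq)
        if ms < best_ms:
            best_seq, best_ms = seq, ms
    return best_seq, best_ms
```

On the small instance `{1: (3, 2, 4), 2: (1, 4, 2), 3: (2, 2, 3)}`, stage 2 produces the sequence 3-1-2 with makespan 13, which happens to be optimal for that instance; in general CDS carries no optimality guarantee.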

  24. Ex. Comparison of the Palmer, Gupta, and CDS heuristics on the slide’s data; of the results, only the CDS sequence survives in the transcript: 3-5-4-1-2, with makespan M = 35.

  25. HW. For the given problem data: • Use the Ignall-Schrage and McMahon-Burton bounds to solve it. • Use the Palmer, Gupta, and CDS heuristics to solve this problem.
