
A Random Parallel Approximation Algorithm for the 0-1 Knapsack Problem



  1. A Random Parallel Approximation Algorithm for the 0-1 Knapsack Problem Michael Adams

  2. Agenda Problem Description Related Work Exact Solution Algorithm Design Heuristic and Random Approximation Algorithm design Results and Analysis Research Answers Future Work Lessons Learned

  3. Problem Description • Items with weights and values • Knapsack with a weight capacity • Select the highest-value set of items under the capacity • NP-hard • Combinatorial optimization • Reference [1]

  4. Quality-Up • A low-value answer lands in the lowest Q% of outcomes • Run many copies of the random algorithm • Keep the answer with the highest quality Q
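The quality-up idea can be made concrete: if a single random run lands in the low-quality region with probability q, then K independent runs all land there with probability q^K, so keeping the best of K copies drives the chance of a low answer toward zero. A minimal sketch (the class and method names are illustrative, not from the presentation):

```java
public class QualityUp {
    // Probability that the best of k independent runs still falls in the
    // low-quality region, when a single run does so with probability q:
    // all k runs must land there independently.
    static double allLowProbability(double q, int k) {
        return Math.pow(q, k);
    }
}
```

For example, even if half of all single runs are poor (q = 0.5), the best of 10 runs is poor with probability only 1/1024.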

  5. Related Work • Exact algorithm • Lists and dominance [2] • Essentially a brute force with trimming • Would require complex bookkeeping • Heuristic • Dual-population co-evolution genetic algorithm [3] • Greedy decreasing profit density [4] • Random approximation is new research • Quality-up is new research

  6. Item Generation • Given N, LW, UW, LV, UV, seed • for each id in range 0..N-1 • weight = random in [LW, UW] • value = random in [LV, UV] • add new item to list with id • Can be tightly or loosely correlated
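The generation loop above could be sketched in Java as follows, assuming a uniform integer draw for each field; the class and method names here are hypothetical, not the presentation's actual code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class ItemGenerator {
    static class Item {
        final int id, weight, value;
        Item(int id, int weight, int value) {
            this.id = id; this.weight = weight; this.value = value;
        }
    }

    // Generate N items with weights uniform in [lw, uw] and values uniform
    // in [lv, uv], reproducibly from the given seed.
    static List<Item> generate(int n, int lw, int uw, int lv, int uv, long seed) {
        Random prng = new Random(seed);
        List<Item> items = new ArrayList<>();
        for (int id = 0; id < n; id++) {
            int weight = lw + prng.nextInt(uw - lw + 1);
            int value  = lv + prng.nextInt(lv <= uv ? uv - lv + 1 : 1);
            items.add(new Item(id, weight, value));
        }
        return items;
    }
}
```

Tight versus loose correlation would come from how the value range relates to each weight; the sketch above shows only the uncorrelated draw.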

  7. Sequential Exact Algorithm Design • Exhaustive brute-force search • for every possible combination of items • evaluate the combination • if(total weight > capacity) • go to next combination • if(total value > current best) • set combination as best • Must perform 2^N evaluations
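The exhaustive search maps naturally onto a bitmask loop, where bit i of a counter marks whether item i is included. A minimal Java sketch (illustrative names; the presentation's own code is not shown):

```java
public class KnapsackBruteForce {
    // Evaluate all 2^n subsets; bit i of 'combo' means item i is included.
    // Returns the best total value achievable under 'capacity'.
    static int solve(int[] weights, int[] values, int capacity) {
        int n = weights.length, best = 0;
        for (long combo = 0; combo < (1L << n); combo++) {
            int w = 0, v = 0;
            for (int i = 0; i < n; i++) {
                if ((combo & (1L << i)) != 0) { w += weights[i]; v += values[i]; }
            }
            if (w <= capacity && v > best) best = v; // feasible and better
        }
        return best;
    }
}
```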

  8. Parallel Exact Algorithm Design • Divvy up evaluations to the K available CPUs • Theoretically K times faster • Reduce the answers and keep the highest (still exact)
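One way to divvy up the 2^N evaluations, sketched here with Java parallel streams rather than the presentation's actual threading code: each subset is evaluated independently, so the search is embarrassingly parallel and a max-reduction recovers the exact answer.

```java
import java.util.stream.LongStream;

public class ParallelKnapsack {
    // Partition the 2^n subset indices across available cores and reduce
    // to the single best feasible value.
    static int solve(int[] weights, int[] values, int capacity) {
        int n = weights.length;
        return (int) LongStream.range(0, 1L << n).parallel()
            .map(combo -> {
                int w = 0, v = 0;
                for (int i = 0; i < n; i++) {
                    if ((combo & (1L << i)) != 0) { w += weights[i]; v += values[i]; }
                }
                return w <= capacity ? v : 0; // infeasible subsets contribute nothing
            })
            .max().orElse(0);
    }
}
```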

  9. Decreasing Profit Density Heuristic • sort items by decreasing profit density • start with empty assignment • for each item in order • if(total weight plus item > capacity) • stop • add item weight to total weight • add item value to total value • make note of item in assignment
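A Java sketch of the heuristic exactly as described, including the slide's choice to stop (rather than skip) at the first item that would overflow the capacity; class and method names are illustrative:

```java
import java.util.Arrays;
import java.util.Comparator;

public class ProfitDensityHeuristic {
    // Greedy: take items in order of decreasing value/weight ratio,
    // stopping at the first item that would overflow the capacity.
    static int solve(int[] weights, int[] values, int capacity) {
        Integer[] order = new Integer[weights.length];
        for (int i = 0; i < order.length; i++) order[i] = i;
        // Sort indices by decreasing profit density value/weight.
        Arrays.sort(order, Comparator.comparingDouble(
            (Integer i) -> -(double) values[i] / weights[i]));
        int totalWeight = 0, totalValue = 0;
        for (int i : order) {
            if (totalWeight + weights[i] > capacity) break; // stop, per the slide
            totalWeight += weights[i];
            totalValue  += values[i];
        }
        return totalValue;
    }
}
```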

  10. Random Approximation Algorithm • Start with a random combination of items • Make slight adjustments to improve value • Stop when improvement ends • Hill-climbing • One iteration = one run of the algorithm • Reference [5]

  11. Random Approximation Algorithm Design • use unique seed to get a pseudo random number generator • for(this thread’s iterations) • use prng to get a random combination of items • perform addRemove on assignment OR • perform bitFlip on assignment OR • perform both on assignment
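The per-thread driver might look like the following Java sketch. The seeding scheme and names are stand-ins for the presentation's actual code, and the addRemove/bitFlip adjustment steps are elided (marked by a comment) so the sketch stays self-contained:

```java
import java.util.BitSet;
import java.util.Random;

public class RandomApproxDriver {
    // One thread's share of the work: seed a private PRNG, and for each
    // iteration draw a random item combination and remember the best
    // feasible value seen.
    static int run(long seed, int iterations, int[] weights, int[] values, int capacity) {
        Random prng = new Random(seed); // unique seed per thread
        int n = weights.length, best = 0;
        for (int iter = 0; iter < iterations; iter++) {
            BitSet combo = new BitSet(n);
            for (int i = 0; i < n; i++) {
                if (prng.nextBoolean()) combo.set(i); // random combination
            }
            // Here the real algorithm would apply addRemove and/or bitFlip
            // to improve the combination before evaluating it.
            int w = 0, v = 0;
            for (int i = combo.nextSetBit(0); i >= 0; i = combo.nextSetBit(i + 1)) {
                w += weights[i]; v += values[i];
            }
            if (w <= capacity && v > best) best = v;
        }
        return best;
    }
}
```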

  12. Add/Remove Algorithm • while item combination weight > capacity • Remove random included item from combination • while item combination weight < capacity • Get random item that was excluded • if adding it would overflow the capacity • stop • add that item to the combination
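A Java sketch of the add/remove move over a BitSet assignment. Names are illustrative, and the guard against adding when every item is already included is an assumption added here to keep the loop terminating; the slide does not state it:

```java
import java.util.BitSet;
import java.util.Random;

public class AddRemoveMove {
    // Repair a random assignment: drop random included items until the
    // combination fits, then add random excluded items until the next
    // pick would overflow the capacity (then stop, as on the slide).
    static void addRemove(BitSet combo, int[] weights, int capacity, Random prng) {
        int n = weights.length;
        int total = 0;
        for (int i = combo.nextSetBit(0); i >= 0; i = combo.nextSetBit(i + 1)) {
            total += weights[i];
        }
        while (total > capacity) {
            int i = prng.nextInt(n);
            if (combo.get(i)) { combo.clear(i); total -= weights[i]; }
        }
        while (total < capacity && combo.cardinality() < n) {
            int i = prng.nextInt(n);
            if (combo.get(i)) continue;               // already included, pick again
            if (total + weights[i] > capacity) break; // adding would overflow: stop
            combo.set(i);
            total += weights[i];
        }
    }
}
```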

  13. BitFlip Algorithm • while the assignment evaluation improves • start with the current best assignment • for each i in range 0..N-1 • if item i is included, remove it; else add it • if this is better than the current best • set the flipped assignment as best
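A Java sketch of the hill-climbing loop: each pass tries flipping every single bit of the current best assignment and keeps any flip that improves the evaluation, stopping once a full pass yields no improvement. Treating overweight assignments as scoring below any feasible one is an assumption here, not stated on the slide:

```java
import java.util.BitSet;

public class BitFlipMove {
    // Score an assignment: total value if feasible, -1 if overweight.
    static int evaluate(BitSet combo, int[] weights, int[] values, int capacity) {
        int w = 0, v = 0;
        for (int i = combo.nextSetBit(0); i >= 0; i = combo.nextSetBit(i + 1)) {
            w += weights[i]; v += values[i];
        }
        return w <= capacity ? v : -1;
    }

    // Hill climb by single-bit flips until no flip improves the evaluation.
    static BitSet bitFlip(BitSet start, int[] weights, int[] values, int capacity) {
        BitSet best = (BitSet) start.clone();
        int bestScore = evaluate(best, weights, values, capacity);
        boolean improved = true;
        while (improved) {
            improved = false;
            for (int i = 0; i < weights.length; i++) {
                BitSet candidate = (BitSet) best.clone();
                candidate.flip(i); // include if excluded, exclude if included
                int score = evaluate(candidate, weights, values, capacity);
                if (score > bestScore) {
                    best = candidate;
                    bestScore = score;
                    improved = true;
                }
            }
        }
        return best;
    }
}
```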

  14. Data – Loosely Correlated
  java KnapsackExact 27 50 500 50 500 4250 2314156

  N   R            A                 V     T       Q      S
  27  exact        0000000007B0E7AC  6043  29524
  27  parallel     0000000007B0E7AC  6043  937            31.5
  27  heuristic    0000000007B0F7A8  5975  0       0.989
  27  BF40         0000000007B0EBAC  5900  127     0.976  232.5
  27  BF400        0000000007B0EBAC  5900  160     0.976  184.5
  27  BF4000       0000000007B0EBAC  5900  169     0.976  174.7
  27  BF40000      0000000007B0EBAC  5900  271     0.976  108.9
  27  BF400000     000000000730F3BC  5947  282     0.984  104.7
  27  BF4000000    000000000730F3BC  5947  872     0.984  33.9
  27  AR40         0000000005A975B2  5137  121     0.850  244.0
  27  AR400        0000000005AAE78C  5360  136     0.887  217.1
  27  AR4000       000000000338E7AD  5682  212     0.940  139.3
  27  AR40000      0000000007F8EFA0  5853  230     0.969  128.4
  27  AR400000     00000000073CF7A8  5918  291     0.979  101.5
  27  AR4000000    0000000007B0E7B8  6018  505     0.996  58.5
  27  BOTH40       0000000007B06B2E  5325  124     0.881  238.1
  27  BOTH400      0000000005BAE78C  5617  157     0.930  188.1
  27  BOTH4000     0000000005B8F3F0  5697  177     0.943  166.8
  27  BOTH40000    0000000007F8EFA0  5853  242     0.969  122.0
  27  BOTH400000   000000000778F7A4  5925  323     0.980  91.4
  27  BOTH4000000  0000000007B0E7AC  6043  1125    1.000  26.2

  15. Quality versus Iterations

  16. Speedup versus Quality

  17. Data – Tightly Correlated
  java KnapsackExact 31 50 500 5500 9422134

  N   R            A                 V     T       Q      S
  31  exact        000000002B63AF9E  5542  538206
  31  parallel     000000002B63AF9E  5542  14613          36.8
  31  heuristic    000000002B63ADDC  5186  0       0.936
  31  BF40         000000004BC77EB0  5516  142     0.995  3790.2
  31  BF400        000000004BC77EB0  5516  154     0.995  3494.8
  31  BF4000       000000002FE1B364  5526  158     0.997  3406.4
  31  BF40000      000000002FE1B364  5526  191     0.997  2817.8
  31  BF400000     0000000023C7B7C6  5530  311     0.998  1730.6
  31  BF4000000    000000002B436FFC  5536  1048    0.999  513.6
  31  AR40         000000005E6CBE5C  5510  135     0.994  3986.7
  31  AR400        000000001E23FFEA  5518  134     0.996  4016.5
  31  AR4000       000000006B61C757  5523  150     0.997  3588.0
  31  AR40000      000000002F63A775  5533  223     0.998  2413.5
  31  AR400000     000000003363BFF2  5534  321     0.999  1676.7
  31  AR4000000    000000002E60EDFC  5537  573     0.999  939.3
  31  BOTH40       000000006D419FED  5513  125     0.995  4305.6
  31  BOTH400      000000001E23FFEA  5518  160     0.996  3363.8
  31  BOTH4000     000000006B61C757  5523  182     0.997  2957.2
  31  BOTH40000    000000002F63A775  5533  229     0.998  2350.2
  31  BOTH400000   000000003363BFF2  5534  379     0.999  1420.1
  31  BOTH4000000  000000002E60EDFC  5537  1311    0.999  410.5

  18. Quality versus Iterations

  19. Speedup versus Quality

  20. Analysis • For small problems, use the exact algorithm • Heuristic achieves > 90% quality • Random approximation can reach 100% • Loosely correlated – use the heuristic • Tightly correlated – use the random approximation

  21. Research Answers • How does the effort of writing a parallel approximate algorithm for the problem compare to the effort of writing an exact sequential algorithm for the problem? • Exact algorithm was a simple concept • Easier to code - ~3 hours • Random algorithm requires more thought • More to code, debug - ~5 hours

  22. Research Answers • For various sizes of the problem, what happens to the parallel approximate algorithm's solution quality as the number of repetitions increases? • More iterations → higher quality solutions • More random starting combinations • Increases the chance one hits a higher maximum • Occasionally increasing iterations does not increase quality

  23. Research Answers • For various sizes of the problem, how much faster than the exact sequential algorithm is the parallel approximate algorithm as a function of solution quality? • Higher quality solutions → lower speedup • More iterations are being executed • Decrease in speedup is slight • The exact solution can be found in a fraction of the time

  24. Future Work • Allow more variables • Compare heuristic versus random • GPU CUDA implementation • Add/remove would not be a prime candidate • Multi-constraint • Weight capacity and volume

  25. Lessons Learned • Getting ahead of schedule is good • Reduces pressure of deadlines • Allows time for exploring other areas • New bitwise and serializing techniques • Use bit pattern as included/excluded • Externalizable • Enjoyable work feels more like play • Actually wanted to sit down and code • Helped me choose my cluster and thesis

  26. References [1] Dake. “Knapsack Problem.” Wikipedia, The Free Encyclopedia. Wikimedia Foundation, Inc. 7 August 2006. Web. 14 May 2012. [2] El Baz, D.; Elkihel, M., "Load balancing in a parallel dynamic programming multi-method applied to the 0-1 knapsack problem," 2006 14th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing. Feb. 2006. [3] Wei Shen; Beibei Xu; Jiang-ping Huang, "An Improved Genetic Algorithm for 0-1 Knapsack Problems," 2011 Second International Conference on Networking and Distributed Computing (ICNDC). Sept. 2011. [4] Sartaj Sahni. “Approximate Algorithms for the 0/1 Knapsack Problem.” J. ACM 22, 1 (January 1975). [5] Montgomery, John. “Tackling the Travelling Salesman Problem: Simulated Annealing.” Psychic Origami. 28 June 2007. Web. 14 May 2012.

  27. Questions? Thank you
