
Boosted Sampling: Approximation Algorithms for Stochastic Problems



  1. Boosted Sampling: Approximation Algorithms for Stochastic Problems. Martin Pál. Joint work with Anupam Gupta, R. Ravi, and Amitabh Sinha.

  2. Infrastructure Design Problems. Build a solution Sol of minimal cost, so that every user is satisfied: minimize cost(Sol) subject to satisfied(j, Sol) for j = 1, 2, …, n. For example, Steiner tree: Sol is a set of edges to build, satisfied(j, Sol) iff there is a path from terminal j to the root, and cost(Sol) = Σe∈Sol ce.

  3. Infrastructure Design Problems. Assumption: Sol is a set of elements and cost(Sol) = Σelem∈Sol cost(elem). Facility location: satisfied(j) iff j is connected to an open facility. Vertex Cover: satisfied(e = {u,v}) iff u or v is in the cover. Steiner network: satisfied(j) iff j’s terminals are connected by a path. Cut problems: satisfied(j) iff j’s terminals are disconnected.

  4. Dealing with uncertainty. Often, we do not know the exact requirements of users. Building in advance reduces cost, but we do not have enough information. As time progresses, we gain more information about the demands, but building under time pressure is costly. There is a tradeoff between information and cost.

  5. The model. Two-stage stochastic model with recourse: On Monday, elements are cheap, but we do not know how many or which clients will show up; we can buy some elements. On Tuesday, clients show up, drawn from a known distribution π. Elements are now more expensive (by an inflation factor σ). We have to buy more elements to satisfy all clients.

  6. The model. Two-stage stochastic model with recourse: find Sol1 ⊆ Elems and Sol2 : 2^Users → 2^Elems to minimize cost(Sol1) + σ·Eπ(T)[cost(Sol2(T))] subject to satisfied(j, Sol1 ∪ Sol2(T)) for all sets T ⊆ Users and all j ∈ T. We want a compact representation of Sol2, given by an algorithm.

  7. Related work • Stochastic linear programming dates back to the works of Dantzig and Beale in the mid-50s. • Only moderate progress on stochastic IP/MIP. • Scheduling literature considers various distributions of job lengths. • Single-stage stochastic: maybecast [Karger & Minkoff 00], bursty connections [Kleinberg, Rabani & Tardos 00], … • Stochastic versions of NP-hard problems (restricted π): [Ravi & Sinha 03], [Immorlica, Karger, Minkoff & Mirrokni 04]. • Extensive literature on each deterministic problem.

  8. Our work • We propose a simple but powerful framework for finding approximate solutions to two-stage stochastic problems using approximation algorithms for their deterministic counterparts. • For a number of problems, including Steiner Tree, Facility Location, Single-Sink Rent-or-Buy and Steiner Forest (in a weaker model), our framework gives a constant-factor approximation. • The analysis is based on strict cost sharing, developed in [Gupta, Kumar, P. & Roughgarden 03].

  9. No restriction on distributions • Previous works often assume special distributions: • Scenario model: there are k sets of users (scenarios); each scenario Ti has probability pi [Ravi & Sinha 03]. • Independent decisions model: each client j appears with probability pj independently of others [Immorlica et al. 04]. • In contrast, our scheme works for arbitrary distributions (although the independent coinflips model sometimes allows us to prove improved guarantees).

  10. The Framework. (Running example: Steiner Tree.) Given an approximation algorithm Alg for a deterministic problem: 1. Boosted Sampling: draw σ samples of clients S1, S2, …, Sσ from the distribution π. 2. Build the first-stage solution Sol1: use Alg to build a solution for the client set S = S1 ∪ S2 ∪ … ∪ Sσ. 3. The actual set T of clients appears. To build the second-stage solution Sol2, use Alg to augment Sol1 to a feasible solution for T.
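A minimal Python sketch of this framework, assuming the caller supplies a deterministic subroutine alg(clients), an augmentation routine augment(sol, clients), a sampler sample_pi() for π, and an integer inflation factor sigma; all names here are illustrative, not from the paper.

```python
def boosted_sampling(sample_pi, alg, augment, sigma):
    """Two-stage boosted sampling (sketch).

    sample_pi()     -> a random set of clients drawn from pi
    alg(C)          -> a set of elements feasible for client set C
    augment(sol, C) -> extra elements that make `sol` feasible for C
    sigma           -> inflation factor (assumed integral here)
    """
    # Stage 1: draw sigma independent samples and serve their union.
    samples = [set(sample_pi()) for _ in range(int(sigma))]
    boosted = set().union(*samples) if samples else set()
    sol1 = alg(boosted)

    # Stage 2: once the actual client set T is revealed, augment sol1.
    def second_stage(T):
        return augment(sol1, T)

    return sol1, second_stage
```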

  11. Performance Guarantee. Theorem: Let P be a sub-additive problem with an α-approximation algorithm that admits β-strict cost sharing. Then Stochastic(P) has an (α+β)-approximation. Corollary: Stochastic Steiner Tree, Facility Location, Vertex Cover, Steiner Network (restricted model), … have constant-factor approximation algorithms. Corollary: Deterministic and stochastic Rent-or-Buy versions of these problems have constant approximations.

  12. First Stage Cost. Optimal cost: Z* = cost(Opt1) + σ·Eπ[cost(Opt2(T))]. Recall: we sample S1, S2, …, Sσ from π and use Alg to build a solution Sol1 feasible for S = ∪i Si. Lemma: E[cost(Sol1)] ≤ α·Z*. Proof: Let Sol = Opt1 ∪ [Opt2(S1) ∪ … ∪ Opt2(Sσ)]; by subadditivity, Sol is feasible for S. Then E[cost(Sol)] ≤ cost(Opt1) + Σi Eπ[cost(Opt2(Si))] = Z*, and E[cost(Sol1)] ≤ α·E[cost(Sol)] ≤ α·Z* (α-approximation).
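The lemma's argument as a displayed derivation (LaTeX rendering of the same quantities as on the slide; the union Sol is feasible for S by subadditivity):

```latex
\begin{align*}
Z^* &= \operatorname{cost}(\mathrm{Opt}_1) + \sigma\,\mathbb{E}_{\pi}[\operatorname{cost}(\mathrm{Opt}_2(T))],\\
\mathrm{Sol} &= \mathrm{Opt}_1 \cup \mathrm{Opt}_2(S_1) \cup \dots \cup \mathrm{Opt}_2(S_\sigma),\\
\mathbb{E}[\operatorname{cost}(\mathrm{Sol})] &\le \operatorname{cost}(\mathrm{Opt}_1)
  + \sum_{i=1}^{\sigma}\mathbb{E}_{\pi}[\operatorname{cost}(\mathrm{Opt}_2(S_i))] \;=\; Z^*,\\
\mathbb{E}[\operatorname{cost}(\mathrm{Sol}_1)] &\le \alpha\,\mathbb{E}[\operatorname{cost}(\mathrm{Sol})] \;\le\; \alpha Z^*.
\end{align*}
```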

  13. Second Stage Cost. After Stage 2, we have a solution for S’ = S1 ∪ … ∪ Sσ ∪ T. Let Sol’ = Opt1 ∪ [Opt2(S1) ∪ … ∪ Opt2(Sσ) ∪ Opt2(T)]. Then E[cost(Sol’)] ≤ cost(Opt1) + (σ+1)·Eπ[cost(Opt2(T))] ≤ (σ+1)/σ · Z*. By symmetry, T is “responsible” for a 1/(σ+1) fraction of Sol’. If built in Stage 1, that part would cost Z*/σ; since we must build it in Stage 2 at inflated prices, we pay Z*. Problem: we do not know T when building the solution for S1 ∪ … ∪ Sσ.

  14. Idea: cost sharing. Scenario 1: Pretend to build a solution for S’ = S ∪ T; charge each j ∈ S’ some amount ξ(S’, j). Scenario 2: Build a solution Alg(S) for S, then augment Alg(S) to a valid solution for S’ = S ∪ T. Assume: Σj∈S’ ξ(S’, j) ≤ Opt(S’). We argued: E[Σj∈T ξ(S’, j)] ≤ Z*/σ (by symmetry). Want to prove: the augmenting cost in Scenario 2 ≤ β · Σj∈T ξ(S’, j).

  15. Cost sharing function. Input: an instance of P and a set of users S’. Output: a cost share ξ(S’, j) for each user j ∈ S’. Example (Steiner tree): build a spanning tree on S’ ∪ {root} and let ξ(S’, j) = (cost of j’s parental edge)/2. Note: 2·Σj∈S’ ξ(S’, j) = cost of MST(S’), and Σj∈S’ ξ(S’, j) ≤ cost of Steiner(S’).

  16. What properties of ξ(·,·) do we need? (P1) Good approximation: cost(Alg(S)) ≤ α·Opt(S). (P2) Cost shares do not overpay: Σj∈S ξ(S, j) ≤ Opt(S). (P3) Strictness: for any S, T ⊆ Users, cost of Augment(Alg(S), T) ≤ β·Σj∈T ξ(S ∪ T, j). Then: second stage cost = σ·cost(Augment(Alg(∪i Si), T)) ≤ σβ·Σj∈T ξ(∪i Si ∪ T, j), and E[Σj∈T ξ(∪i Si ∪ T, j)] ≤ Z*/σ. Hence, E[second stage cost] ≤ σβ·Z*/σ = β·Z*.
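The same chain of inequalities, displayed in LaTeX for readability:

```latex
\begin{align*}
\mathbb{E}[\text{second stage cost}]
  &= \sigma\,\mathbb{E}\big[\operatorname{cost}\big(\mathrm{Augment}(\mathrm{Alg}(\textstyle\bigcup_i S_i),\,T)\big)\big] \\
  &\le \sigma\beta\,\mathbb{E}\Big[\sum_{j\in T}\xi\big(\textstyle\bigcup_i S_i\cup T,\;j\big)\Big]
   \;\le\; \sigma\beta\cdot\frac{Z^*}{\sigma} \;=\; \beta Z^*.
\end{align*}
```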

  17. Strictness for Steiner Tree. Alg(S) = min-cost spanning tree MST(S). ξ(S, j) = (cost of j’s parental edge in MST(S))/2. Augment(Alg(S), T): for all j ∈ T, build j’s parental edge in MST(S ∪ T). Alg is a 2-approximation for Steiner Tree, and ξ is a 2-strict cost sharing function for Alg. Theorem: We have a 4-approximation for Stochastic Steiner Tree.
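A sketch of these cost shares and the augmentation step in Python, using networkx (an assumed dependency) and illustrative function names; the spanning tree is computed over the metric closure of the terminals, which is what makes Alg a 2-approximation for Steiner Tree.

```python
import networkx as nx

def mst_cost_shares(G, root, clients):
    """MST-based cost shares (sketch of slides 15 and 17).

    G: undirected graph with 'weight' edge attributes; root: root terminal;
    clients: iterable of client nodes.  Each client pays half the weight of
    its parental edge in an MST over the metric closure of clients + root.
    """
    clients = set(clients)
    if not clients:
        return {}
    # Metric closure on the terminals: shortest-path distances in G.
    terminals = clients | {root}
    H = nx.Graph()
    for u in terminals:
        dist = nx.single_source_dijkstra_path_length(G, u, weight="weight")
        for v in terminals:
            if u != v:
                H.add_edge(u, v, weight=dist[v])
    mst = nx.minimum_spanning_tree(H, weight="weight")
    # Orient the MST away from the root; each node pays half its parent edge.
    return {child: mst[child][parent]["weight"] / 2.0
            for child, parent in nx.bfs_predecessors(mst, root)}

def augment_cost_for_new_clients(G, root, old_clients, new_clients):
    """Augmentation sketch: each new client buys its parental edge in the
    MST on old_clients + new_clients + root; returns that cost per client."""
    shares = mst_cost_shares(G, root, set(old_clients) | set(new_clients))
    return {j: 2.0 * shares[j] for j in new_clients if j in shares}
```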

  18. Vertex Cover. [Figure: an example graph with vertex costs and edge contributions.] Users: edges. Solution: a set of vertices that covers all edges; edge {u,v} is covered if at least one of u, v is picked. Alg: edges uniformly raise their contributions; once a vertex can be paid for by its neighboring edges, freeze all edges adjacent to it and buy the vertex. Edges may be paying for both endpoints, so this is a 2-approximation. Natural cost shares: ξ(S, e) = contribution of e.
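A sketch of this primal-dual scheme in Python (hypothetical names; the threshold parameter anticipates the Alg’ variant of slide 20, which also buys every vertex that is at least 50% paid for):

```python
def primal_dual_vertex_cover(vertices, edges, cost, threshold=1.0):
    """Primal-dual vertex cover sketch: edges raise contributions uniformly.

    vertices : iterable of vertex ids
    edges    : iterable of 2-element tuples/sets {u, v}
    cost     : dict vertex -> cost
    threshold: buy every vertex that is at least `threshold`-fraction paid
               (1.0 = the basic Alg; 0.5 = the Alg' variant of slide 20)
    Returns (cover, contrib) where contrib[frozenset(e)] is e's cost share.
    """
    edges = [frozenset(e) for e in edges]
    contrib = {e: 0.0 for e in edges}      # xi(S, e) in the talk
    paid = {v: 0.0 for v in vertices}      # total contribution toward v
    active = set(edges)                    # edges still raising
    tight = set()                          # fully paid ("frozen") vertices

    while active:
        # Largest uniform raise before some vertex becomes fully paid.
        deltas = []
        for v in vertices:
            if v in tight:
                continue
            deg = sum(1 for e in active if v in e)
            if deg > 0:
                deltas.append(((cost[v] - paid[v]) / deg, v))
        if not deltas:
            break
        delta, _ = min(deltas)
        for e in active:
            contrib[e] += delta
            for v in e:
                paid[v] += delta
        # Freeze all edges adjacent to any newly tight vertex.
        newly_tight = {v for v in vertices
                       if v not in tight and paid[v] >= cost[v] - 1e-12}
        tight |= newly_tight
        active = {e for e in active if not (e & newly_tight)}

    cover = {v for v in vertices if paid[v] >= threshold * cost[v] - 1e-12}
    return cover, contrib
```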

  19. Strictness for Vertex Cover. [Figure: an example graph with many cost-1 vertices and center vertices of cost n and n+1; S = the blue edges, T = the red edge.] • Alg(S) = blue vertices. • Augment(Alg(S), T) costs (n+1), yet ξ(S ∪ T, T) = 1, so the gap is Ω(n)! • Find a better ξ? We do not know how. Instead, make Alg(S) buy a center vertex.

  20. Making Alg strict. Alg’: run Alg on the same input, then buy all vertices that are at least 50% paid for. Augmentation (at least in our example) is now free. [Figure: the same example graph as before.] Since each bought vertex is at least half paid for and each edge pays toward at most two vertices, this is still a 4-approximation.
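In terms of the sketch after slide 18 (same hypothetical names), Alg’ is the same routine run with a lower buying threshold; a tiny illustrative call:

```python
# Illustrative data: path a-b-c with unit vertex costs.
V = ["a", "b", "c"]
E = [("a", "b"), ("b", "c")]
cost = {"a": 1.0, "b": 1.0, "c": 1.0}

cover_alg,  shares = primal_dual_vertex_cover(V, E, cost, threshold=1.0)  # Alg
cover_algp, _      = primal_dual_vertex_cover(V, E, cost, threshold=0.5)  # Alg'
```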

  21. Why should strictness hold? Alg’: run Alg on the same input and buy all vertices that are at least 50% paid for. [Figure: a vertex v in Alg(S ∪ T) receiving contributions α1, α2, α3 from S-edges (blue) and α1’, α2’ from T-edges (red).] • Suppose vertex v is fully paid for in Alg(S ∪ T). • If Σj∈T αj’ ≥ ½ cost(v), then T can pay for ¼ of v in the augmentation step. • If Σj∈S αj ≥ ½ cost(v), then v would be open in Alg(S). • (Almost: we need to worry that Alg(S ∪ T) and Alg(S) behave differently.)

  22. Metric facility location. Input: a set of facilities and a set of cities living in a metric space. Solution: a set of open facilities and a path from each city to an open facility. “Off the shelf” components: a 3-approximation algorithm [Mettu & Plaxton 00]. It turns out that the cost sharing function of [P. & Tardos 03] is 5.45-strict. Theorem: There is an 8.45-approximation for stochastic Facility Location.

  23. Steiner Network. Client j = a pair of terminals sj, tj; satisfied(j): sj and tj are connected by a path. 2-approximation algorithms are known ([Agrawal, Klein & Ravi 91], [Goemans & Williamson 95]), but they do not admit strict cost sharing. [Gupta, Kumar, P. & Roughgarden 03]: a 4-approximation algorithm that admits 4-uni-strict cost sharing. Theorem: an 8-approximation for Stochastic Steiner Network in the “independent coinflips” model.

  24. The Buy at Bulk problem. Client j = a pair of terminals sj, tj. Solution: an sj-tj path for j = 1, …, n; cost(e) = ce · f(# paths using e). [Figure: plots of edge cost versus the number of paths using e.] Rent or Buy is the special case with two pipes: rent at $1 per path, or buy for $M with an unlimited number of paths.
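As a small illustration (a sketch with hypothetical names), the rent-or-buy edge cost from this slide as a Python helper, where M is the buy price:

```python
def rent_or_buy_edge_cost(c_e, num_paths, M):
    """Rent-or-buy cost function f(x) = min(x, M), scaled by edge length c_e:
    rent at $1 per path, or buy once for $M and route unlimited paths."""
    return c_e * min(num_paths, M)
```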

  25. Special distributions: Rent or Buy. Stochastic Steiner Network: client j = a pair of terminals sj, tj; satisfied(j): sj and tj are connected by a path. Suppose π({j}) = 1/n and π(S) = 0 if |S| ≠ 1. Then Sol2({j}) is just a path, and we obtain a Rent or Buy instance with cost(e) = ce · min(1, (σ/n) · #paths using e). [Figure: plot of this edge cost, which flattens once n/σ paths use e.]

  26. Rent or Buy. The trick works for any problem P (e.g., we can solve Rent-or-Buy Vertex Cover). These techniques give the best known approximations for Single-Sink Rent-or-Buy (3.55-approx [Gupta, Kumar & Roughgarden 03]) and Multicommodity Rent-or-Buy (8-approx [Gupta, Kumar, P. & Roughgarden 03], 6.83-approx [Becchetti, Könemann, Leonardi & P. 04]). “Bootstrap” to stochastic Rent-or-Buy: a 6-approximation for Stochastic Single-Sink RoB and a 12-approximation for Stochastic Multicommodity RoB (independent coinflips).

  27. What if σ is also stochastic? Suppose σ is also a random variable, and let π(S, σ) be the joint distribution. For i = 1, 2, …, σmax: sample (Si, σi) from π, and with probability σi/σmax accept Si. Let S be the union of the accepted Si’s and output Alg(S) as the first-stage solution.
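A sketch of this sampling loop in Python, assuming sample_joint() draws a pair (S_i, sigma_i) from the joint distribution π, sigma_max is an integral upper bound on σ, and alg is the deterministic subroutine (all names illustrative):

```python
import random

def first_stage_random_sigma(sample_joint, sigma_max, alg):
    """First-stage solution when the inflation factor sigma is also random.

    sample_joint() -> (S_i, sigma_i): a client set and inflation factor
                      drawn from the joint distribution pi.
    sigma_max      -> integral upper bound on sigma.
    alg(C)         -> deterministic approximation algorithm on client set C.
    """
    accepted = set()
    for _ in range(int(sigma_max)):
        S_i, sigma_i = sample_joint()
        # Accept this sample with probability sigma_i / sigma_max.
        if random.random() < sigma_i / sigma_max:
            accepted |= set(S_i)
    return alg(accepted)
```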

  28. Multistage problems • Three-stage stochastic Steiner Tree: • On Monday, edges cost 1. We only know the probability distribution π. • On Tuesday, the results of a market survey come in. We gain some information I and update π to the conditional distribution π|I. Edges cost σ1. • On Wednesday, clients finally show up. Edges now cost σ2 (σ2 > σ1), and we must buy enough to connect all clients. • Theorem: There is a 6-approximation for three-stage stochastic Steiner Tree (in general, a 2k-approximation for the k-stage problem).

  29. Conclusions • We have seen a randomized algorithm for a stochastic problem: using sampling to solve problems involving randomness. • Do we need strict cost sharing? Our proof requires strictness; maybe there is a weaker property? Maybe we can prove guarantees for arbitrary subadditive problems? • Prove strictness for Steiner Forest; so far we have only uni-strictness. • Cut problems: can we say anything about Multicut? Single-source multicut?

  30. Recap: find Sol1 ⊆ Elems and Sol2 : 2^Users → 2^Elems to minimize cost(Sol1) + σ·Eπ(T)[cost(Sol2(T))] subject to satisfied(j, Sol1 ∪ Sol2(T)) for all sets T ⊆ Users and all j ∈ T. Note that if π consists of a small number of scenarios, this can be transformed into a deterministic problem. THE END
