
Novel algorithms for peer-to-peer optimization in networked systems

Novel algorithms for peer-to-peer optimization in networked systems. Björn Johansson and Mikael Johansson, Automatic Control Lab, KTH, Stockholm, Sweden. Joint work with M. Rabi, C. Caretti, T. Keviczky and K.-H. Johansson.


Presentation Transcript


  1. ACCESS Group meeting Mikael Johansson mikaelj@ee.kth.se Novel algorithms for peer-to-peer optimization in networked systems Björn Johansson and Mikael Johansson, Automatic Control Lab, KTH, Stockholm, Sweden Joint work with M. Rabi, C. Caretti, T. Keviczky and K.-H. Johansson

  2. Content • Motivation • Decomposition review • A framework for peer-to-peer optimization • Markov-randomized incremental subgradient method • Combined consensus-subgradient method • Experiences from implementation • Conclusions

  3. Motivation A large-scale optimization problem… decomposed into several small subproblems • Potentially large computational savings • Foundation for distributed decision-making • fi = performance of agent i, depends on the actions of others • Challenge: avoid a coordinator, obey communication constraints

  4. Application: multi-agent coordination Find jointly optimal controls and a rendezvous point. ”DMPC” – distributed model-predictive consensus.

  5. Application: distributed estimation Node v measures yv and cooperates to find a network-wide estimate. The solution is the average; the algorithm solves a ”consensus” problem • Directly extends to Huber’s M-function (robust estimation)

  6. Application: resource allocation Throughput maximization under a global bandwidth constraint. A global constraint, rather than a global variable, complicates the problem.

  7. Content • Motivation • Decomposition review • A framework for peer-to-peer optimization • Markov-randomized incremental subgradient method • Combined consensus-subgradient method • Experiences from implementation • Conclusions

  8. Decomposition review Techniques for decomposing a large-scale problem into many small ones

  9. Trivial case: separable problems Each node v can find xv by itself; no coordinator needed • Reality is often more complex (and more interesting!)

  10. Complicating variables Consider the unconstrained problem in variables (x1, x2, θ): minimize f1(x1, θ) + f2(x2, θ). Here, θ is the complicating (or coupling) variable. Observation: with θ fixed, the problem is separable in (x1, x2) • How can this be exploited?

  11. Primal decomposition Fix the complicating variable θ and define νi(θ) = min over xi of fi(xi, θ). To evaluate the functions νi we need to solve the associated subproblems. The original problem is equivalent to the master problem in the variable θ: minimize ν1(θ) + ν2(θ). Convex when the original problem is; possibly non-smooth. Called primal decomposition • the master problem (coordinator) optimizes the primal variable.
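The primal decomposition scheme above can be sketched on a toy two-subproblem instance. The quadratic losses, step size, and variable names below are made up for illustration; they are not from the talk.

```python
def primal_decomposition(iters=200, step=0.1):
    """Primal decomposition sketch with made-up losses
    f_i(xi, theta) = (xi - theta)^2 + (xi - ai)^2 and coupling variable theta.
    For fixed theta the problem separates; the master steps theta along a
    subgradient of nu_1(theta) + nu_2(theta)."""
    a = (1.0, 5.0)
    theta = 0.0
    for _ in range(iters):
        # local subproblem optima for the current, fixed theta
        xs = [(theta + ai) / 2.0 for ai in a]
        # subgradient of the master objective: sum of d/dtheta f_i at the optima
        grad = sum(-2.0 * (x - theta) for x in xs)
        theta -= step * grad
    xs = [(theta + ai) / 2.0 for ai in a]
    return theta, xs
```

For these losses the master iterate contracts toward θ = 3 with local solutions x1 = 2 and x2 = 4.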

  12. Dual decomposition Introduce new variables θ1, θ2 and consider: minimize f1(x1, θ1) + f2(x2, θ2) subject to θ1 = θ2. Here, θ1 and θ2 are local versions of the complicating variable θ. The constraint θ1 = θ2 enforces consistency. Key observation: the Lagrangian L = f1(x1, θ1) + f2(x2, θ2) + λ(θ1 − θ2) is separable (can be minimized over the local variables separately)

  13. Dual decomposition Hence the dual function has the form d(λ) = d1(λ) + d2(λ), where each part of the dual can be evaluated locally (evaluation requires solving the dual subproblems). The dual problem is convex, but not necessarily differentiable.
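A minimal sketch of dual decomposition on a toy instance. The scalar losses f1(t) = (t − 1)²/2 and f2(t) = (t − 5)²/2, the step size, and the variable names are invented for illustration:

```python
def dual_decomposition(iters=200, step=0.1):
    """Dual decomposition sketch: minimize f1(t1) + f2(t2) subject to
    t1 = t2. Each subproblem is minimized locally for the current
    multiplier; a dual (sub)gradient step on lam drives the
    consistency violation t1 - t2 to zero."""
    lam = 0.0
    for _ in range(iters):
        t1 = 1.0 - lam            # argmin over t1 of f1(t1) + lam * t1
        t2 = 5.0 + lam            # argmin over t2 of f2(t2) - lam * t2
        lam += step * (t1 - t2)   # dual subgradient = consistency violation
    return 1.0 - lam, 5.0 + lam
```

Both local copies converge to the consensus optimum t1 = t2 = 3, with the multiplier settling at λ = −2.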

  14. Subgradient methods A subgradient of a convex function f at x is any g that satisfies f(y) ≥ f(x) + gᵀ(y − x) for all y • affine global underestimators • coincides with the gradient if f is smooth. Projected subgradient method: x(k+1) = P[x(k) − αk g(k)]. Converges if the subgradients are bounded and the step sizes αk are diminishing (e.g. Σ αk = ∞, Σ αk² < ∞)
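The projected subgradient method can be sketched on a made-up scalar problem: minimize f(x) = |x − 3| over the interval [0, 2] (the constrained optimum is x = 2). The function names and step-size rule are illustrative choices.

```python
def project(x, lo, hi):
    """Euclidean projection onto the interval [lo, hi]."""
    return max(lo, min(hi, x))

def subgradient(x):
    """A subgradient of f(x) = |x - 3|: the sign of x - 3
    (any value in [-1, 1] would do at the kink x = 3)."""
    return 1.0 if x > 3.0 else -1.0

def projected_subgradient(x0, lo, hi, iters=200):
    """Projected subgradient method with diminishing steps 1/(k+1):
    the steps are square-summable but not summable."""
    x = x0
    for k in range(iters):
        step = 1.0 / (k + 1)
        x = project(x - step * subgradient(x), lo, hi)
    return x
```

Starting from x = 0, the iterates climb toward 3 but the projection pins them at the boundary x = 2.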

  15. Incremental subgradient methods Apply to problems of the form: minimize f1(θ) + … + fV(θ) (e.g. our general form). Algorithm: θ(k+1) = P[θ(k) − αk gv,k], where gv,k is a subgradient of fv at θ(k). Update θ by cyclic componentwise (negative) subgradient steps • can use a fixed (e.g. 1…V) or random update order
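A sketch of the cyclic variant on an invented instance: fv(θ) = |θ − av|, whose sum is minimized at the median of the av. The loss values and step-size rule are assumptions for illustration.

```python
def incremental_subgradient(targets, theta0=0.0, sweeps=500):
    """Cyclic incremental subgradient for f(theta) = sum_v |theta - a_v|.
    One sweep takes a subgradient step for each component in turn,
    sharing a diminishing step size 1/k within sweep k."""
    theta = theta0
    for k in range(1, sweeps + 1):
        step = 1.0 / k
        for a in targets:
            g = 1.0 if theta > a else -1.0  # subgradient of |theta - a|
            theta -= step * g
    return theta
```

For targets (1, 2, 9) the iterates oscillate around the median 2 with shrinking amplitude.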

  16. Content • Motivation • Decomposition review • A framework for peer-to-peer optimization • Markov-randomized incremental subgradient method • Combined consensus-subgradient method • Experiences from implementation • Conclusions

  17. Our framework A convex (possibly non-smooth) optimization problem over a connected communication graph • local variables xv at each node v • global variables θ • per-node loss function fv(xv, θ) Peer-to-peer: • Nodes can only communicate with neighbors

  18. Quiz and challenge Quiz: which of the techniques we described are peer-to-peer? • Primal decomposition? • Dual decomposition? • Incremental subgradient methods? Challenge: develop simple and efficient p2p optimization techniques!

  19. Content • Motivation • Decomposition review • A framework for peer-to-peer optimization • Markov-randomized incremental subgradient method • Combined consensus-subgradient method • Experiences from implementation • Conclusions

  20. Peer-to-peer incremental subgradients? Incremental subgradient methods are not peer-to-peer • The estimate of the optimizer is forwarded in a ring, or to an arbitrary node. Is it possible to develop a method that only forwards to neighbors?

  21. Unbiased random walk on a graph Need to construct an ”unbiased” random walk • Visits every node with equal probability (has a uniform stationary distribution) • The transition matrix can be computed via Metropolis-Hastings: Pv,w = min(1/dv, 1/dw) for neighbors v and w (dv is the degree of node v, i.e. its number of links) • Can be computed using local information only!
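The Metropolis-Hastings construction can be sketched as follows; the adjacency-dict representation is an assumed encoding for illustration. Each node needs only its own degree and those of its neighbors.

```python
def metropolis_hastings(adj):
    """Transition matrix for an unbiased random walk on an undirected
    graph. adj maps each node to its set of neighbours. Off-diagonal
    weights are min(1/d_v, 1/d_w); a self-loop soaks up the remainder,
    making P symmetric and doubly stochastic, so the uniform
    distribution is stationary."""
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    P = {v: {} for v in adj}
    for v in adj:
        for w in adj[v]:
            P[v][w] = min(1.0 / deg[v], 1.0 / deg[w])
        P[v][v] = 1.0 - sum(P[v][w] for w in adj[v])
    return P
```

On the 3-node path 1–2–3, for example, node 1 moves to node 2 with probability 1/2 and otherwise stays put.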

  22. Markov-randomized algorithm Repeat: • Update the estimate: θ(k+1) = θ(k) − αk gvk (vk is the state of the Markov chain, gvk a subgradient of fvk at θ(k)) • Pass the estimate to a random neighbor using the Markov chain P = [Pv,w] computed via Metropolis-Hastings. A conceptually simple idea. What can we say about its properties?
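The two steps above can be sketched together. The quadratic local losses, the 3-node path graph, and the step-size rule below are made up for illustration; this is not the authors' implementation.

```python
import random

def metropolis_step(adj, v):
    """One Metropolis-Hastings step: propose a uniform neighbour w,
    accept with probability min(1, d_v / d_w), else stay at v.
    This realizes the transition probability min(1/d_v, 1/d_w)."""
    w = random.choice(sorted(adj[v]))
    if random.random() < min(1.0, len(adj[v]) / len(adj[w])):
        return w
    return v

def markov_incremental(adj, a, theta0=0.0, iters=20000, seed=0):
    """Markov-randomized incremental subgradient sketch for
    f(theta) = sum_v (theta - a_v)^2 / 2. The estimate follows the
    random walk; each visited node takes one gradient step on its
    own loss with diminishing step size 1/k."""
    random.seed(seed)
    theta, v = theta0, next(iter(adj))
    for k in range(1, iters + 1):
        g = theta - a[v]             # gradient of the local loss at theta
        theta -= (1.0 / k) * g
        v = metropolis_step(adj, v)  # pass the estimate to a neighbour
    return theta
```

Because the walk visits all nodes with equal long-run frequency, the estimate approaches the minimizer of the sum (here the mean of the av).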

  23. Main result Proof highlights: • Sample the sequence when the chain is in state v • Establish: all nodes are visited with equal probability during a return time • Use conditional expectations • Invoke the supermartingale convergence theorem

  24. Example: robust estimation

  25. Content • Motivation • Decomposition review • A framework for peer-to-peer optimization • Markov-randomized incremental subgradient method • Combined consensus-subgradient method • Experiences from implementation • Conclusions

  26. Consensus-subgradient method The key trick so far for distributing was dual decomposition: relax the consistency requirements. Alternative idea: ”neglect and project” • Each node has a local view of the global decision variables • Updates in the direction of the (negative) subgradient • Coordinates with neighbors to achieve consistency • Will apply consensus iterations

  27. Basic algorithm Repeat: • Predict the next iterate using a subgradient step (gv a subgradient of fv at θv(k)) • Execute I consensus iterations to approach consistency • Project (locally) onto the constraint set
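A sketch of the unconstrained predict-then-coordinate loop. The quadratic local losses and the step-size rule are invented; the slides additionally include a local projection step for constrained problems.

```python
def consensus_subgradient(a, W, iters=2000, n_consensus=1):
    """Consensus-subgradient sketch for f(theta) = sum_v (theta - a_v)^2 / 2.
    Each node keeps a local copy of the global variable, takes one
    gradient step on its own loss (diminishing step 1/k), then runs
    n_consensus consensus iterations with doubly stochastic weights W."""
    n = len(a)
    theta = [0.0] * n
    for k in range(1, iters + 1):
        # predict: local (sub)gradient step
        theta = [theta[v] - (1.0 / k) * (theta[v] - a[v]) for v in range(n)]
        # coordinate: I consensus iterations with neighbour weights
        for _ in range(n_consensus):
            theta = [sum(W[v][w] * theta[w] for w in range(n)) for v in range(n)]
    return theta
```

On a 3-node path with Metropolis weights, all local copies settle near the network-wide optimum (the mean of the av), and the residual disagreement shrinks with the step size.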

  28. Main result (unconstrained case) Proof: based on results for approximate subgradient methods. Similar, somewhat more complex, results hold for the constrained case.

  29. Example Simple 5-node network (left), non-smooth functions fv (right)

  30. Example Iterates for one (left) and 11 (right) consensus iterations per step

  31. To think about… What is the right aggregation primitive in the network? • Sampling via an unbiased random walk? • Consensus/gossiping? • Spanning trees? This has implications for • Implementation complexity/accuracy • Privacy (are internal models and objectives private or shared?) • Information dissemination (who knows what in the end?)

  32. Content • Motivation • Decomposition review • A framework for peer-to-peer optimization • Markov-randomized incremental subgradient method • Combined consensus-subgradient method • Experiences from implementation • Conclusions

  33. Implementation experiences Wireless sensor network testbed at KTH. The ultimate test: • can we make these algorithms run on our WSN nodes?

  34. Wireless communication Sensors communicate using 802.15.4-compliant radios. Basic primitives: • Unicast: a node addresses a single neighbor at a time • Broadcast: communication with (possibly) all neighbors. Both exist in reliable and unreliable versions

  35. Problem and solution candidates We considered quadratic loss functions in the nodes • consensus iterations are one way to find the optimum. Implemented three alternatives: • P2P incremental subgradient, using reliable unicast • Dual decomposition, using unreliable broadcast • The gossiping algorithm by Boyd et al., using reliable broadcast

  36. Algorithm I: dual Nodes maintain a local estimate of the optimizer • Broadcast the current iterate to neighbors • Update the Lagrange multipliers for some links (based on disagreement with neighbors) • Update the local estimate. Unreliable broadcast, since the algorithm can tolerate some packet losses [Rabbat et al., IEEE SPAWC 2005]

  37. Algorithm II: consensus iteration The classical consensus iteration • Broadcast the current iterate to neighbors • Update the local estimate. Reliable broadcast for consistency [Xiao et al., IPSN 2005]
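The classical consensus iteration amounts to repeated local averaging. A minimal sketch, assuming a doubly stochastic weight matrix W (e.g. built from Metropolis weights):

```python
def consensus(values, W, iters=100):
    """Classical consensus iteration: each node repeatedly replaces its
    value with a weighted average of its own and its neighbours' values.
    With W doubly stochastic and the graph connected, all values
    converge to the network-wide average, which the iteration preserves
    at every step."""
    x = list(values)
    n = len(x)
    for _ in range(iters):
        x = [sum(W[v][w] * x[w] for w in range(n)) for v in range(n)]
    return x
```

For quadratic losses this average is exactly the network-wide least-squares estimate, which is why slide 35 lists consensus as one way to find the optimum.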

  38. Algorithm III: p2p incremental Our peer-to-peer incremental subgradient method • Update the estimate using a subgradient with respect to the local loss function • Pass the estimate to a random neighbor (forwarding decision based on Metropolis-Hastings). Reliable unicast (important not to lose the token)

  39. ns-2 simulations fv quadratic (consensus); ns-2 evaluation of the three schemes: dual, Markov-incremental subgradient, and Xiao-Boyd.

  40. Real implementation

  41. Experiences • Works surprisingly well • Basic primitives are not so basic • Reliable broadcast • Neighbor discovery • Challenging the model • Link asymmetry! • Packet loss • Time/energy efficiency. Need to go back and revise the theory (and the implementation!)

  42. Conclusions Distributed optimization in networked systems • Important and useful • Many challenges remain! Novel peer-to-peer optimization algorithms • Markov-randomized incremental subgradient method • Consensus-subgradient method. Practical implementation in a WSN testbed. Implementation and application challenges drive the next iteration!
