Introduction to Network Mathematics (1) - Optimization Techniques Yuedong Xu 10/08/2012
Purpose • Many networking/system problems boil down to optimization problems • Bandwidth allocation • ISP traffic engineering • Route selection • Cache placement • Server sleep/wakeup • Wireless channel assignment • … and many more
Purpose • Optimization as a tool • sheds light on the best performance we can achieve • guides the design of distributed algorithms (rather than pure heuristics) • provides a bottom-up approach for reverse-engineering existing systems
Outline • Toy Examples • Overview • Convex Optimization • Linear Programming • Linear Integer Programming • Summary
Toys • Toy 1: Find x to minimize f(x) := x². x* = 0 if there is no restriction on x; x* = 2 if 2 ≤ x ≤ 4.
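To make Toy 1 concrete, here is a minimal sketch (not part of the original slides) that solves both versions numerically with SciPy; the objective and the bounds are exactly those of the toy.

```python
# Minimal sketch of Toy 1: minimize f(x) = x^2, first without restriction,
# then restricted to 2 <= x <= 4.
from scipy.optimize import minimize_scalar

f = lambda x: x ** 2

unconstrained = minimize_scalar(f)                                  # x* ~ 0
constrained = minimize_scalar(f, bounds=(2, 4), method="bounded")   # x* ~ 2

print(unconstrained.x, constrained.x)   # approximately 0.0 and 2.0
```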
Toys • Toy 2: Find the minimum of a constrained problem (minimize an objective subject to constraints, given on the slide). Optimal solution?
Toys • Toy 3: Find the global minimum (the plotted function has a local minimum that is not the global minimum).
Outline • Toy Examples • Overview • Convex Optimization • Linear Programming • Linear Integer Programming
Overview Ingredients! • Objective Function • A function to be minimized or maximized • Unknowns or Variables • Their values determine the value of the objective function • Constraints • Restrict the unknowns to certain allowed values and exclude others
Overview • Formulation: Minimize f0(x) (objective function) subject to fi(x) ≤ 0, i = 1,…,m (inequality constraints) and hi(x) = 0, i = 1,…,p (equality constraints); x: decision variables
Overview • Optimization tree
Overview • Our coverage (from the optimization tree): Nonlinear Programs, Convex Programs, Linear Programs (polynomial-time solvable), Integer Programs (NP-hard), Flow and Matching
Outline • Toy Examples • Overview • Convex Optimization • Linear Programming • Linear Integer Programming
Convex Optimization • Concepts • Convex combination: for x, y ∈ Rⁿ, z = λx + (1−λ)y with 0 ≤ λ ≤ 1 is a convex combination of x, y; it is a strict convex combination of x, y if 0 < λ < 1.
Convex Optimization • Concepts • Convex set: S ⊆ Rⁿ is convex if it contains all convex combinations of pairs x, y ∈ S. (Figure: a convex set vs. a nonconvex set.)
Convex Optimization • Concepts • Convex set, a more complicated case: the intersection of any number of convex sets is convex.
Convex Optimization • Concepts • Convex function (note: some mainland-China textbooks define convex/concave the opposite way from the international convention!): for a convex set S ⊆ Rⁿ and c: S → R, c is a convex function if c(λx + (1−λ)y) ≤ λc(x) + (1−λ)c(y) for all x, y ∈ S and 0 ≤ λ ≤ 1; i.e., the function value at λx + (1−λ)y lies below the chord value λc(x) + (1−λ)c(y).
Convex Optimization • Concepts – convex functions • Exponential: e^(ax) is convex on R • Powers: x^a is convex on R+ when a ≥ 1 or a ≤ 0, and concave for 0 ≤ a ≤ 1 • Logarithm: log x is concave on R+ • Jensen's inequality: if f(·) is convex, then f(E[x]) ≤ E[f(x)] • You can also check convexity by taking the second-order derivative. Fortunately, many networking problems have convex objectives.
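As a quick numerical illustration of Jensen's inequality from the bullet above, the following sketch (not from the slides) samples a random variable and checks f(E[x]) ≤ E[f(x)] for the convex function e^x; the distribution and sample size are arbitrary choices.

```python
# Numerically checking Jensen's inequality for the convex function exp(x).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100_000)   # samples of a random variable x

lhs = np.exp(x.mean())      # f(E[x])
rhs = np.exp(x).mean()      # E[f(x)]
assert lhs <= rhs           # Jensen's inequality holds for convex f
print(lhs, rhs)             # ~1.0 vs ~1.175
```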
Convex Optimization • Concepts • Convex optimization: Minimize f0(x) (f0 convex) subject to fi(x) ≤ 0, i = 1,…,m (fi convex) and hi(x) = 0, i = 1,…,p (hi linear/affine) • A fundamental property: local optimum = global optimum
Convex Optimization • Method • You may have used • the gradient method • Newton's method to find optima when the constraints are simple explicit ranges (e.g., 0 ≤ x ≤ 10, 0 ≤ y ≤ 20, −∞ < z < ∞, …). • Today, we discuss more general constrained optimization
Convex Optimization • How to solve a constrained optimization problem? • Enumeration? Maybe for small, finite feasible set • Use constraints to reduce number of variables? Works occasionally • Lagrange multiplier method – a general method (harder to understand)
Convex Optimization • Example: Minimize x² + y² • subject to x + y = 1 • Lagrange multiplier method: • Change the problem to an unconstrained problem: • L(x, y, p) = x² + y² + p(1−x−y) • Think of p as a "price" and (1−x−y) as the amount of "violation" of the constraint • Minimize L(x, y, p) over all x and y, keeping p fixed • Obtain x*(p), y*(p) • Then choose p to make sure the constraint is met • Magically, x*(p*) and y*(p*) are the solution to the original problem!
Convex Optimization Example: Minimize x² + y² subject to x + y = 1 Lagrangian: L(x, y, p) = x² + y² + p(1−x−y) • Setting dL/dx and dL/dy to 0, we get x = y = p/2 • Since x + y = 1, we get p = 1 • We get the same solution by substituting y = 1 − x directly
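The worked example above can be reproduced symbolically; the following is a minimal sketch, assuming SymPy is available, that solves the stationarity conditions of the Lagrangian together with the constraint.

```python
# Reproducing the example: min x^2 + y^2 s.t. x + y = 1,
# via the Lagrangian L = x^2 + y^2 + p(1 - x - y).
import sympy as sp

x, y, p = sp.symbols("x y p", real=True)
L = x**2 + y**2 + p * (1 - x - y)

# Stationarity in x and y plus the constraint x + y = 1.
sol = sp.solve([sp.diff(L, x), sp.diff(L, y), x + y - 1], [x, y, p], dict=True)
print(sol)   # [{x: 1/2, y: 1/2, p: 1}], so f* = 1/2
```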
Convex Optimization • General case: • minimize f0(x) • subject to fi(x) ≤ 0, i = 1, …, m • (Ignore equality constraints for now) • Optimal value denoted by f* • Lagrangian: • L(x, p) = f0(x) + p1f1(x) + … + pmfm(x) • Define • g(p) = infx (f0(x) + p1f1(x) + … + pmfm(x)) • If pi≥0 for all i, and x feasible, then g(p) ≤ f*
Convex Optimization • Revisit the earlier example • L(x, p) = x² + y² + p(1−x−y) • x* = y* = p/2 • g(p) = p(1 − p/2) • This is a concave function, with g(p*) = 1/2 • We know f* is 1/2, and g(p) is a lower bound on f* for every p – the tightest lower bound occurs at p = p*.
Convex Optimization • Duality • For each p ≥ 0, g(p) gives a lower bound on f* • We want to find as tight a lower bound as possible: • maximize g(p) • subject to p ≥ 0 • a) This is called the (Lagrange) "dual" problem; the original problem is the "primal" problem • b) Let the optimal value of the dual problem be denoted d*. • We always have d* ≤ f* ("weak duality") • c) If d* = f*, we have "strong duality". The difference f* − d* is called the "duality gap"
Convex Optimization • Price Interpretation • Solving the constrained problem is equivalent to obeying hard resource limits • Imagine instead that the resource limits can be violated: you pay a marginal price per unit amount of violation (or receive a payment if the limit is not fully used) • When the duality gap is nonzero, you can benefit from this second scenario even if the prices are set on unfavorable terms
Convex Optimization • Duality in algorithms • An iterative algorithm produces at iteration j • a primal feasible x(j) • a dual feasible p(j) • with f0(x(j)) − g(p(j)) → 0 as j → ∞ • The optimal value f* lies in the interval [g(p(j)), f0(x(j))]
Convex Optimization • Complementary Slackness • To make duality gap zero, we need to have • pi fi(x) = 0 for all i • This means • pi > 0 => fi(x*) = 0 • fi(x*) < 0 => pi = 0 • If “price” is positive, then the corresponding constraint is limiting • If a constraint is not active, then the “price” must be zero
Convex Optimization • KKT Optimality Conditions • Satisfy the primal and dual constraints • fi(x*) ≤ 0, pi* ≥ 0 • Satisfy complementary slackness • pi* fi(x*) = 0 • Stationarity at the optimum • f0'(x*) + Σi pi* fi'(x*) = 0 • If the primal problem is convex and the KKT conditions are met, then x* is primal optimal and p* is dual optimal, with zero duality gap • KKT = Karush-Kuhn-Tucker
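As a small illustration of these conditions (not from the slides), the sketch below checks them by hand for the constrained variant of Toy 1, minimize x² subject to x ≥ 2 (i.e., f1(x) = 2 − x ≤ 0); the candidate solution x* = 2 with multiplier p* = 4 is an assumption verified by the checks.

```python
# Checking the KKT conditions for: minimize x^2 subject to f1(x) = 2 - x <= 0.
x_star, p_star = 2.0, 4.0

f1 = lambda x: 2 - x          # inequality constraint
df0 = lambda x: 2 * x         # gradient of the objective x^2
df1 = lambda x: -1.0          # gradient of the constraint

assert f1(x_star) <= 0                            # primal feasibility
assert p_star >= 0                                # dual feasibility
assert p_star * f1(x_star) == 0                   # complementary slackness
assert df0(x_star) + p_star * df1(x_star) == 0    # stationarity
print("KKT conditions hold at x* = 2 with p* = 4")
```

Since the objective is convex and the constraint is affine, satisfying KKT certifies that x* = 2 is globally optimal.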
Convex Optimization • Method • Take-home messages • Convexity is important (non-convex problems are very difficult) • The primal problem is often not solved directly because of complicated constraints • Use the Lagrangian dual approach to obtain the dual function • The KKT conditions guarantee strong duality for convex problems
Convex Optimization • Application: TCP Flow Control • What’s in your mind about TCP? • Socket programming? • Sliding window? • AIMD congestion control? • Retransmission? • Anything else?
Convex Optimization • Application: TCP Flow Control • Our questions: • TCP seems to keep working as the Internet scales. Why? • How does TCP allocate bandwidth to flows traversing the same bottlenecks? • Is TCP optimal in terms of resource allocation?
Convex Optimization • Application: TCP Flow Control • Network • Links l with capacities cl • Sources i • L(i) – links used by source i (routing) • Ui(xi) – utility when the source rate is xi • i) The larger the rate xi, the greater the happiness; • ii) The increment of happiness shrinks as xi increases.
Convex Optimization • Application: TCP Flow Control • A simple network (figure: two links with capacities c1 and c2 shared by flows x1, x2, x3)
Convex Optimization • Application: TCP Flow Control • Primal problem: maximize Σi Ui(xi) subject to Σ{i : l ∈ L(i)} xi ≤ cl for every link l, xi ≥ 0 (the network utility maximization problem)
Convex Optimization • Application: TCP Flow Control • Lagrangian dual problem: it can be solved using the KKT conditions, but that requires whole-network information!
Convex Optimization • Application: TCP Flow Control • Question: Is distributed flow control possible without knowledge of the network?
Convex Optimization • Application: TCP Flow Control • Primal-dual approach: • The source updates its sending rate xi • The link generates a congestion signal • Source's action: • If congestion is severe, reduce xi • If there is no congestion, increase xi • Congestion measures: • Packet loss (TCP Reno) • RTT (TCP Vegas)
Convex Optimization • Application: TCP Flow Control • Gradient-based primal-dual algorithm: the source updates its rate given the congestion signal, and the link updates its congestion signal (price). The total price of a flow is the sum of the link prices along its route; the price of congestion at a link is, e.g., its packet drop probability. A minimal sketch of these two updates follows.
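The sketch below (not part of the slides) runs the two update rules on the simple two-link network from the earlier slide; the topology, capacities, step size, and log utilities are assumptions chosen for illustration.

```python
# Gradient-based primal-dual rate allocation on a two-link network.
# Assumed topology: flow 0 uses both links, flow 1 uses link 0, flow 2 uses link 1.
# Assumed utilities: U_i(x) = log(x), so the source update is x_i = 1 / q_i.
import numpy as np

R = np.array([[1, 1],     # routing matrix: R[i, l] = 1 if flow i uses link l
              [1, 0],
              [0, 1]], dtype=float)
c = np.array([1.0, 2.0])  # link capacities (assumed values)
p = np.ones(2)            # link prices, i.e. the congestion signals
gamma = 0.05              # step size for the dual (price) update

for _ in range(5000):
    q = R @ p                                   # total price along each flow's route
    x = 1.0 / np.maximum(q, 1e-6)               # source update: argmax of log(x) - q*x
    y = R.T @ x                                 # aggregate rate on each link
    p = np.maximum(p + gamma * (y - c), 0.0)    # link price update, projected to p >= 0

print(x)   # converges toward the proportionally fair allocation
```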
Convex Optimization • Application: TCP Flow Control • Relation to real TCP
Convex Optimization • Application: Fairness concerns • Many concepts regarding fairness • Max-min fairness: no flow's rate can be increased without decreasing the rate of a flow that is already no larger • Proportional fairness: maximize the sum of the logarithms of the rates
Convex Optimization • Application: Fairness concerns • How does fairness relate to optimization? It is reflected by the utility function (the alpha-fair family) • In particular, max-min fairness is approached as alpha → ∞; a sketch of the alpha-fair utilities follows
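The following sketch (not from the slides) writes out the standard alpha-fair utility family that this slide alludes to; the function name and sample rates are illustrative assumptions. Alpha = 0 maximizes total throughput, alpha = 1 gives proportional fairness, and alpha → ∞ approaches max-min fairness.

```python
# The alpha-fair utility family.
import numpy as np

def alpha_fair_utility(x, alpha):
    """U_alpha(x) = log(x) if alpha == 1, else x^(1 - alpha) / (1 - alpha)."""
    x = np.asarray(x, dtype=float)
    if alpha == 1:
        return np.log(x)
    return x ** (1.0 - alpha) / (1.0 - alpha)

rates = np.array([0.5, 1.0, 2.0])
for alpha in (0, 1, 2, 10):
    print(alpha, alpha_fair_utility(rates, alpha).sum())
```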
Convex Optimization • Application: TCP Flow Control Take-home messages: • TCP is solving a distributed optimization problem! • The primal-dual algorithm can converge! • Packet drops and RTT carry the "price of congestion"! • Fairness is reflected by the utility function
Outline • Toy Examples • Overview • Convex Optimization • Linear Programming • Linear Integer Programming • Summary
Linear Programming • A subset of convex programming • Goal: maximize or minimize a linear function on a domain defined by linear inequalities and equations • Special properties
Linear Programming • Formulation • General form: max or min a linear objective function, subject to a set of linear constraints (linear inequalities or linear equations)
Linear Programming • Formulation • Canonical form: min a linear objective subject to "greater than" inequality constraints only, with non-negative variables (min c·x subject to Ax ≥ b, x ≥ 0)
Linear Programming • Example (figure: a two-variable LP in the x–y plane, maximize a linear objective subject to linear constraints; the feasible region and optimum are shown graphically on a 0–8 grid)
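Since the example's constraints live in the figure and are not recoverable from the text, the following sketch solves a different but similar small two-variable LP with SciPy's linprog; all numbers are made up for illustration.

```python
# Solve: maximize 3x + 5y  s.t.  x <= 4, 2y <= 12, 3x + 2y <= 18, x >= 0, y >= 0.
# linprog minimizes, so the objective is negated.
from scipy.optimize import linprog

res = linprog(c=[-3, -5],                       # minimize -(3x + 5y)
              A_ub=[[1, 0], [0, 2], [3, 2]],
              b_ub=[4, 12, 18],
              bounds=[(0, None), (0, None)])
print(res.x, -res.fun)   # optimal point (2, 6) and maximized objective 36
```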