Preference Elicitation: Communication Burden by Nisan, Segal, Lahaie and Parkes October 27th, 2004 Jella Pfeiffer
Outline • Motivation • Communication • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work
Motivation • Exponential number of bundles in the number of goods • Communication of values • Determination of valuations • Reluctance to reveal valuations entirely ⇒ minimize communication and information revelation* * Incentives are not considered
Agenda • Motivation • Communication • Burden • Protocols • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work
Communication burden Communication burden: • the minimum number of messages • transmitted in a (nondeterministic) protocol • realizing the communication Here: "worst-case" burden = the maximum such number over all states
Communication protocols Sequential message sending • Deterministic protocol: each message sent is determined by the sender's type and the preceding messages • Nondeterministic protocol: omniscient oracle • knows the state of the world ≽ and • a desirable alternative x ∈ F(≽)
Definition Nondeterministic protocol A nondeterministic protocol is a triple Γ = (M, μ, h), where M is the message set, μ: ℜ ⇉ M is the message correspondence, and h: M → X is the outcome function, and the message correspondence μ has the following two properties: • Existence: μ(≽) ≠ ∅ for all ≽ ∈ ℜ, • Privacy preservation: μ(≽) = ∩i μi(≽i) for all ≽ ∈ ℜ, where μi: ℜi ⇉ M for all i ∈ N.
Agenda • Motivation • Communication • Lindahl prices • Equilibria • Importance of Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work
Lindahl Equilibria Lindahl prices: nonlinear and non-anonymous Definition: Personalized prices p = (p1, …, pn) together with an allocation x = (x1, …, xn) form a Lindahl equilibrium in state ≽ ∈ ℜ if • (L1) xi is a best bundle for agent i at prices pi (xi ≽i every other bundle the agent could buy at pi), for all i ∈ N, • (L2) the allocation maximizes total revenue Σi pi(xi) over all feasible allocations. Lindahl equilibrium correspondence: E: ℜ ↠ Lindahl equilibria
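To make the two conditions concrete, here is a minimal sketch (not from the slides; the instance, prices, and helper names are invented for illustration) that checks (L1) and (L2) for a tiny two-agent, two-item example with additive valuations and personalized bundle prices:

```python
from itertools import combinations

ITEMS = frozenset({"a", "b"})

def bundles(items):
    """All subsets of the item set (the 2^L possible bundles)."""
    items = list(items)
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            yield frozenset(combo)

# Toy additive valuations and personalized bundle prices.
valuation = {
    1: lambda S: 5 * ("a" in S) + 1 * ("b" in S),
    2: lambda S: 2 * ("a" in S) + 4 * ("b" in S),
}
price = {
    1: lambda S: 3 * ("a" in S) + 2 * ("b" in S),
    2: lambda S: 3 * ("a" in S) + 2 * ("b" in S),
}

allocation = {1: frozenset({"a"}), 2: frozenset({"b"})}

# (L1) Each agent's bundle maximizes quasi-linear utility v_i(S) - p_i(S).
def l1_holds():
    for i, S_i in allocation.items():
        best = max(valuation[i](S) - price[i](S) for S in bundles(ITEMS))
        if valuation[i](S_i) - price[i](S_i) < best:
            return False
    return True

# (L2) The allocation maximizes total revenue sum_i p_i(S_i) over feasible allocations.
def l2_holds():
    revenue = sum(price[i](allocation[i]) for i in allocation)
    for S1 in bundles(ITEMS):
        for S2 in bundles(ITEMS - S1):          # disjoint bundles only
            if price[1](S1) + price[2](S2) > revenue:
                return False
    return True

print("Lindahl equilibrium:", l1_holds() and l2_holds())   # expect True
```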
Importance of Lindahl prices Protocol <M, μ, h> realizes the weakly Pareto efficient correspondence F* if and only if there exists an assignment of budget sets to messages such that the protocol <M, μ, (B, h)> realizes the Lindahl equilibrium correspondence E. Communication burden of efficiency = burden of finding Lindahl prices!
Agenda • Motivation • Communication • Lindahl prices • Communication complexity • Alice and Bob • Proof for Lower Bound • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work
Communication Complexity (1) Finding a lower bound via the two-party "Alice and Bob" model; the lower bound still applies when: • including an auctioneer • a larger number of bidders • communicating via queries to the bidders • communicating real numbers • restricting to deterministic protocols
The proof Lemma: Let v ≠ u be arbitrary 0/1 valuations. Then the sequence of bits transmitted on inputs (v, v*) is not identical to the sequence of bits transmitted on inputs (u, u*), where v*(S) = 1 − v(Sc). Theorem: Every protocol that finds the optimal allocation for every pair of 0/1 valuations v1, v2 must use at least (L choose L/2), i.e. exponentially many, bits of total communication in the worst case.
Comments on the proof • In the main paper: guaranteeing any allocation better than auctioning off all objects as a single bundle in a two-bidder auction already needs exponential communication • Holds for valuations with: • no externalities • normalization • With L = 50 items, the number of bits corresponds to about 500 Gigabytes of data
Communication Complexity (2) Theorem*: Exact efficiency requires communicating at least one price for each of the 2^L possible bundles (so the dimension of the message space is exponential in L). *Holds for general valuations.
Agenda • Motivation • Communication • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work
Preference Classes • Submodular valuations: the dimension of the message space in any efficient protocol is still exponential in the number of items • Homogeneous valuations: agents care only about the number of items received; dimension L • Additive valuations: dimension L
Agenda • Motivation • Communication • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Learning algorithms • Preference elicitation • Parallels (polynomial query learnable/elicitation) • Converting learning algorithms • Applications • Conclusion • Future Work
Applying Learning Algorithms Learning theory: Membership Query, Equivalence Query ↔ Preference elicitation: Value Query, Demand Query
What is a Learning Algorithm? • Learning an unknown function f: X → Y via questions to an oracle • Known function class C • Typically X = {0,1}^m, Y either {0,1} or ⊆ ℝ • Manifest hypothesis: the learner's current guess of the target function • size(f) is measured with respect to the chosen representation • Example: f(x) = 2 if x consists of m 1's, and f(x) = 0 otherwise; possible representations: 1) a list of all 2^m values 2) a single polynomial term 2·x1x2…xm
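As a small illustration (not from the slides; the names are invented), the example function above can be stored either as an explicit table of all 2^m values or as one sparse polynomial term, which is why the choice of representation matters for size(f):

```python
from itertools import product

m = 4  # number of boolean inputs (kept small so the table stays printable)

def f(x):
    """Target function: 2 on the all-ones input, 0 everywhere else."""
    return 2 if all(x) else 0

# Representation 1: an explicit list of values, size 2^m.
table = {x: f(x) for x in product([0, 1], repeat=m)}

# Representation 2: one sparse polynomial term, 2 * x1 * x2 * ... * xm, size O(m).
def poly(x):
    value = 2
    for xi in x:
        value *= xi
    return value

assert all(table[x] == poly(x) for x in table)
print(f"table entries: {len(table)}, polynomial terms: 1")
```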
Learning Algorithm - Queries • Membership query: the learner names an instance x and the oracle returns the value f(x) • Equivalence query: the learner proposes a hypothesis f̃; the oracle answers YES if f̃ = f, otherwise it returns a counterexample x with f̃(x) ≠ f(x)
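A minimal Python sketch of such an oracle (purely illustrative; the class and function names are invented and the hidden target is a toy function):

```python
from itertools import product
from typing import Callable, Optional, Tuple

class ExactLearningOracle:
    """Answers membership and equivalence queries for a hidden boolean-input function."""

    def __init__(self, target: Callable[[Tuple[int, ...]], int], m: int):
        self.target = target
        self.m = m

    def membership(self, x: Tuple[int, ...]) -> int:
        """Membership query: return the hidden function's value at x."""
        return self.target(x)

    def equivalence(self, hypothesis: Callable[[Tuple[int, ...]], int]) -> Optional[Tuple[int, ...]]:
        """Equivalence query: return None if the hypothesis agrees with the target
        everywhere, otherwise some counterexample x with hypothesis(x) != target(x)."""
        for x in product([0, 1], repeat=self.m):
            if hypothesis(x) != self.target(x):
                return x
        return None

# Usage: the learner only interacts with the target through these two queries.
oracle = ExactLearningOracle(target=lambda x: int(all(x)), m=3)
print(oracle.membership((1, 1, 1)))        # 1
print(oracle.equivalence(lambda x: 0))     # a counterexample, here (1, 1, 1)
```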
Preference elicitation Assumptions: • normalized valuations • no externalities • quasi-linear utility functions • the value of a bundle can be computed from an agent's representation in polynomial time Goal: a set of manifest valuations sufficient to compute an optimal allocation.
Preference elicitation - Queries • Value query: ask agent i for its value vi(S) of a given bundle S • Demand query: present prices to agent i and ask for a bundle the agent would demand at those prices (one maximizing its quasi-linear utility)
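The following sketch (illustrative only; the agent data and function names are invented) shows what the two query types compute for an agent with quasi-linear utility over bundle prices:

```python
from itertools import combinations

ITEMS = ("a", "b", "c")

def all_bundles(items=ITEMS):
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            yield frozenset(combo)

# Toy agent valuation: the value of a bundle is the sum of its item values.
item_values = {"a": 4, "b": 2, "c": 1}

def value_query(bundle):
    """Value query: report the agent's value for one bundle."""
    return sum(item_values[g] for g in bundle)

def demand_query(prices):
    """Demand query: given a price for every bundle, report a bundle that
    maximizes the agent's quasi-linear utility v(S) - p(S)."""
    return max(all_bundles(), key=lambda S: value_query(S) - prices[S])

prices = {S: 2 * len(S) for S in all_bundles()}    # toy bundle prices
print(value_query(frozenset({"a", "b"})))           # 6
print(sorted(demand_query(prices)))                 # ['a'], utility 4 - 2 = 2
```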
Parallels: learning & preference elicitation • Membership query ↔ Value query • Equivalence query ↔ Demand query (?) • Lindahl prices are only a constant away from the manifest valuations • From a preferred bundle S' reported by an agent, a counterexample for the learning algorithm can be computed
Polynomial-query learnable Definition: The representation class C is polynomial-query exactly learnable from membership and equivalence queries if there is a fixed polynomial p(·,·) and an algorithm L with access to membership and equivalence queries of an oracle such that, for any target function f ∈ C, L outputs after at most p(size(f), m) queries a function f̃ ∈ C with f̃(x) = f(x) for all instances x.
Polynomial-query elicited Similar to the definition of polynomial-query learnable, but: • value and demand queries • the agents' valuations are the target functions • outputs an optimal allocation after at most p(size(v1,...,vn), m) queries • the valuation functions need not be determined exactly!
Converting learning algorithms Idea proved in the paper: If each representation class V1,…,Vn can be polynomial-query exactly learned from membership and equivalence queries, then V1,…,Vn can be polynomial-query elicited from value and demand queries (via the four-step loop below; a code sketch follows the steps).
Converted Algorithm 1) Run the learning algorithm for each agent's valuation class until each one requires a response to an equivalence query
Converted Algorithm 2) Compute an optimal allocation S* and Lindahl prices L* with respect to the manifest valuations 3) Present a demand query based on S* and L* to each agent
Converted Algorithm 4) Quit if all agents answer YES; otherwise, pass the counterexample from agent i to learning algorithm i and go to step 1
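A compact Python sketch of this loop (illustrative only: the learner objects, the allocation routine, and their interfaces are hypothetical placeholders, not the paper's implementation):

```python
def elicit(learners, agents, compute_optimal_allocation):
    """Convert exact-learning algorithms into a preference elicitation loop.

    learners[i]  - learning algorithm for agent i; assumed to expose
                   next_membership_query(), receive_value(), manifest_valuation(),
                   receive_counterexample()
    agents[i]    - the real agent; assumed to expose value(bundle) and
                   demand(bundle, prices), which returns None to accept (YES)
                   or a strictly preferred bundle as a counterexample
    compute_optimal_allocation(manifest_valuations) -> (allocation, lindahl_prices)
    """
    while True:
        # 1) Drive each learner with value queries (membership queries) until it
        #    would next ask an equivalence query.
        for learner, agent in zip(learners, agents):
            while (bundle := learner.next_membership_query()) is not None:
                learner.receive_value(bundle, agent.value(bundle))

        # 2) Compute an optimal allocation S* and Lindahl prices L* with respect
        #    to the manifest (currently learned) valuations.
        manifest = [learner.manifest_valuation() for learner in learners]
        allocation, prices = compute_optimal_allocation(manifest)

        # 3) Present the demand query (S*, L*) to every agent.
        answers = [agent.demand(allocation[i], prices[i]) for i, agent in enumerate(agents)]

        # 4) Quit if every agent accepts its bundle at the quoted prices; otherwise
        #    feed each counterexample back to that agent's learner and repeat.
        if all(preferred is None for preferred in answers):
            return allocation
        for i, preferred in enumerate(answers):
            if preferred is not None:
                learners[i].receive_counterexample(preferred)
```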
Agenda • Motivation • Communication • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Polynomial representation • XOR/DNF • Linear-Threshold • Conclusion • Future Work
Polynomials • t-sparse multivariate polynomials: • at most t terms • a term is a product of variables (e.g. x1x3x5) • "Every valuation function can be uniquely written as a polynomial" [Schapire and Sellie] • Example: additive valuations • polynomials of size m (m = number of items): x1+…+xm • Learning algorithm: • at most a polynomial number of equivalence queries • at most a polynomial number of membership queries
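For instance (an illustrative sketch, not from the paper), an additive valuation is exactly the m-term polynomial x1 + … + xm with item values as coefficients, so evaluating a bundle just sums the terms that are switched on:

```python
# Sparse-polynomial view of an additive valuation: one term per item,
# term i contributes coefficient_i * x_i, where x_i = 1 if item i is in the bundle.
coefficients = {"x1": 3.0, "x2": 1.5, "x3": 2.0}   # toy item values

def additive_value(bundle):
    """Evaluate the polynomial sum_i c_i * x_i on the indicator vector of the bundle."""
    return sum(c for var, c in coefficients.items() if var in bundle)

print(additive_value({"x1", "x3"}))   # 5.0
```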
XOR/DNF Representations (1) • XOR bids represent valuations which have free disposal • Analog in learning theory: DNF formulae • disjunction of conjunctions of unnegated variables, e.g. x1x2 ∨ x3x5 • Atomic bids in the XOR bid have value 1 (0/1 valuations)
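A minimal sketch (illustrative; the bid data are invented) of how such a 0/1 XOR bid is evaluated, which is exactly the monotone-DNF reading: a bundle has value 1 iff it contains at least one atomic bid:

```python
# An XOR bid as a list of atomic bids; with free disposal, a bundle is worth 1
# if it contains (is a superset of) at least one atomic bid, else 0.
atomic_bids = [frozenset({"x1", "x2"}), frozenset({"x3", "x5"})]

def xor_value(bundle):
    return 1 if any(bid <= bundle for bid in atomic_bids) else 0

print(xor_value({"x1", "x2", "x4"}))   # 1: contains the atomic bid {x1, x2}
print(xor_value({"x1", "x3"}))         # 0: contains no complete atomic bid
```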
XOR/DNF Representations (2) • An XOR bid containing t atomic bids can be exactly learned with t+1 equivalence queries and at most tm membership queries • Each equivalence query yields one new atomic bid • found with at most m membership queries (excluding items of the counterexample that do not belong to the atomic bid)
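A sketch of that counting under the 0/1 assumption (illustrative only; the hidden bids, oracle, and helper names are invented): each equivalence query returns a counterexample, which is shrunk with at most m membership queries to one new atomic bid.

```python
from itertools import combinations

ITEMS = ["x1", "x2", "x3", "x4"]
HIDDEN_BIDS = [frozenset({"x1", "x2"}), frozenset({"x3"})]   # hidden target (toy)

def target(bundle):                       # membership query (value query)
    return 1 if any(b <= bundle for b in HIDDEN_BIDS) else 0

def equivalence(hypothesis_bids):         # equivalence query: counterexample or None
    def hyp(bundle):
        return 1 if any(b <= bundle for b in hypothesis_bids) else 0
    for r in range(len(ITEMS) + 1):
        for combo in combinations(ITEMS, r):
            bundle = frozenset(combo)
            if hyp(bundle) != target(bundle):
                return bundle
    return None

learned = []
while (counterexample := equivalence(learned)) is not None:
    # Shrink the counterexample with membership queries: drop every item whose
    # removal keeps the value at 1; what remains is one new atomic bid.
    bid = set(counterexample)
    for item in list(bid):
        if target(frozenset(bid - {item})) == 1:
            bid.remove(item)
    learned.append(frozenset(bid))

print(learned)   # recovers the hidden atomic bids: {x3} and {x1, x2}
```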
Linear-Threshold Representations • r-of-S valuation: a bundle has value 1 if it contains at least r items from the set S, and 0 otherwise • with |S| = k these are r-of-k threshold functions • If r is known: learnable with a polynomial number of equivalence queries or demand queries
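A tiny illustrative sketch (data invented) of an r-of-S valuation:

```python
# r-of-S valuation: worth 1 as soon as the bundle contains at least r items of S.
S = {"a", "b", "c"}
r = 2

def r_of_s_value(bundle):
    return 1 if len(S & set(bundle)) >= r else 0

print(r_of_s_value({"a", "c"}))        # 1: two items of S
print(r_of_s_value({"a", "d", "e"}))   # 0: only one item of S
```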
Important Results by Nisan, Segal • Important role of prices (an efficient allocation must reveal supporting Lindahl prices) • Efficient communication must name at least one Lindahl price for each of the 2^L bundles • Lower bound: no generally good communication design exists ⇒ focus on specific classes of preferences
Important Results by Lahaie, Parkes • Learning algorithms with membership and equivalence queries serve as the basis for preference elicitation algorithms • If a polynomial-query learning algorithm exists for the valuation classes, preferences can be efficiently elicited with a number of queries polynomial in m and size(v1,…,vn) ⇒ such solutions exist for polynomials, XOR bids, and linear-threshold representations
Future Work • Finding more specific classes of preferences which can be elicited efficiently • Addressing the issue of incentives • Which Lindahl prices may be used for the queries?
Thank you for your attention Any Questions?