Preference elicitation

Communication Burden

by Nisan, Segal, Lahaie and Parkes

October 27th, 2004

Jella Pfeiffer


Outline

  • Motivation

  • Communication

  • Lindahl prices

  • Communication complexity

  • Preference Classes

  • Applying Learning Algorithms to Preference elicitation

  • Applications

  • Conclusion

  • Future Work



Motivation

  • Exponential number of bundles in the number of goods

    • Communication of values

    • Determination of valuations

  • Reluctance to reveal valuation entirely

⇒ minimize communication and information revelation*

    * Incentives are not considered


Agenda

  • Motivation

  • Communication

    • Burden

    • Protocols

  • Lindahl prices

  • Communication complexity

  • Preference Classes

  • Applying Learning Algorithms to Preference elicitation

  • Applications

  • Conclusion

  • Future Work


Communication burden

Communication burden:

  • Minimum number of messages

  • transmitted in a (nondeterministic) protocol

  • realizing the communication

    Here: "worst-case" burden = the maximum such number over all states


Communication protocols

Sequential message sending

  • Deterministic protocol:

The message sent is determined by the agent's type and the preceding messages

  • Nondeterministic protocol: Omniscient oracle

    • Knows state of the world ≽ and

    • Desirable alternative x ∈ F(≽)


Definition Nondeterministic protocol

A nondeterministic protocol is a triple Γ = (M, μ, h), where M is the message set, μ: ℜ ⇉ M is the message correspondence, and h: M → X is the outcome function. The message correspondence μ has the following two properties:

  • Existence: μ(≽) ≠ ∅ for all ≽ ∈ ℜ,

  • Privacy preservation: μ(≽) = ∩i μi(≽i) for all ≽ ∈ ℜ, where μi: ℜi ⇉ M for all i ∈ N.


Agenda

  • Motivation

  • Communication

  • Lindahl prices

    • Equilibria

    • Importance of Lindahl prices

  • Communication complexity

  • Preference Classes

  • Applying Learning Algorithms to Preference elicitation

  • Applications

  • Conclusion

  • Future Work


Lindahl Equilibria

Lindahl prices: nonlinear and non-anonymous

Definition: A pair (p, x) of personalized bundle prices p = (p1,…,pn) and a feasible allocation x = (x1,…,xn) is a Lindahl equilibrium in state ≽ ∈ ℜ if

  • every agent i weakly prefers (≽i) its bundle xi at prices pi to any other bundle at those prices, for all i ∈ N, (L1)

  • the allocation x maximizes the total revenue Σi pi(xi) over all feasible allocations. (L2)

    Lindahl equilibrium correspondence: E: ℜ ↠ X
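In the quasi-linear setting assumed later in the talk, the two conditions take the following standard form (a sketch, with pi denoting agent i's personalized bundle price function and (x1,…,xn) a feasible allocation):

```latex
\text{(L1)}\quad x_i \in \arg\max_{S}\ \bigl( v_i(S) - p_i(S) \bigr) \quad \text{for all } i \in N
\qquad
\text{(L2)}\quad (x_1,\dots,x_n) \in \arg\max_{(y_1,\dots,y_n)\ \text{feasible}}\ \sum_{i \in N} p_i(y_i)
```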


Importance of Lindahl prices

Protocol <M, μ, h> realizes the weakly Pareto efficient correspondence F* if and only if there exists an assignment of budget sets to messages such that the protocol <M, μ, (B, h)> realizes the Lindahl equilibrium correspondence E.

Communication burden of efficiency = burden of finding Lindahl prices!


Agenda

  • Motivation

  • Communication

  • Lindahl prices

  • Communication complexity

    • Alice and Bob

    • Proof for Lower Bound

    • Communication complexity

  • Preference Classes

  • Applying Learning Algorithms to Preference elicitation

  • Applications

  • Conclusion

  • Future Work


Alice and Bob


Communication Complexity (1)

Finding a lower bound from "Alice and Bob":

  • Including auctioneer

  • Larger number of bidders

  • Queries to the bidders

  • Communicating real numbers

  • Deterministic protocols


The proof

Lemma: Let v ≠ u be arbitrary 0/1 valuations. Then the sequence of bits transmitted on inputs (v, v*) is not identical to the sequence of bits transmitted on inputs (u, u*).

(where v*(S) = 1 − v(Sᶜ))

Theorem: Every protocol that finds the optimal allocation for every pair of 0/1 valuations v1, v2 must, in the worst case, use a number of bits of total communication that is exponential in the number of items.


Comments on the proof

  • In the main paper: finding any allocation better than auctioning off all objects as a single bundle in a two-bidder auction needs communication that is at least exponential in the number of items

    Holds for valuations with:

    • No externalities

    • Normalization

  • With L = 50 items, the required number of bits already corresponds to about 500 gigabytes of data


Communication Complexity (2)

Theorem*: Exact efficiency requires communicating at least one price for each of the 2^L possible bundles, i.e. the dimension of the message space is at least exponential in L.

*Holds for general valuations.


Agenda

  • Motivation

  • Communication

  • Lindahl prices

  • Communication complexity

  • Preference Classes

  • Applying Learning Algorithms to Preference elicitation

  • Applications

  • Conclusion

  • Future Work


Preference Classes

  • Submodular valuations:

    The dimension of the message space in any efficient protocol is still exponential in the number of items

  • Homogeneous valuations:

    Agents care only about the number of items received

    Dimension L

  • Additive Valuations

    Dimension L


Agenda

  • Motivation

  • Communication

  • Lindahl prices

  • Communication complexity

  • Preference Classes

  • Applying Learning Algorithms to Preference elicitation

    • Learning algorithms

    • Preference elicitation

    • Parallels (polynomial query learnable/elicitation)

    • Converting learning algorithms

  • Applications

  • Conclusion

  • Future Work


Applying Learning Algorithms

Learning theory: Membership Query, Equivalence Query

Preference elicitation: Value Query, Demand Query


What is a Learning Algorithm?

  • Learning an unknown function f: X → Y via questions to an oracle

  • Known function class C

  • Typically: X = {0,1}^m, Y either {0,1} or a subset of ℝ

  • Manifest hypothesis: the algorithm's current hypothesis of the target function

  • size(f) is measured with respect to the chosen representation

    • Example: f: {0,1}^m → ℝ; f(x) = 2 if x consists of m 1's, and f(x) = 0 otherwise. Possible representations:

      1) a list of all 2^m values

      2) the single term 2·x1x2···xm, of size m


Learning Algorithm - Queries

Membership Query

Equivalence Query
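For concreteness, a minimal sketch of the two learning-theory queries against a hidden target function f (the names f, hypothesis and instances are illustrative, not from the paper):

```python
def membership_query(f, x):
    """Membership query: ask the oracle for the target's value on one instance x."""
    return f(x)

def equivalence_query(f, hypothesis, instances):
    """Equivalence query: propose a hypothesis; the oracle answers YES (None here)
    or returns a counterexample instance on which hypothesis and target disagree."""
    for x in instances:
        if hypothesis(x) != f(x):
            return x
    return None
```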


Preference elicitation

Assumptions:

  • Normalized

  • No externalities

  • Quasi-linear utility function

  • Polynomial time to compute bundle values from the representation

    Goal:

    Sufficient set of manifest valuations to

    compute an optimal allocation.


Preference elicitation - Queries

Value Query

Demand Query
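Analogously, a minimal sketch of the two elicitation queries for an agent with valuation v, using the quasi-linear utility v(S) − p(S) assumed on the previous slide (names are illustrative; the demand query brute-forces all 2^m bundles purely for illustration):

```python
from itertools import chain, combinations

def value_query(v, bundle):
    """Value query: ask the agent for its value of one specific bundle."""
    return v(bundle)

def demand_query(v, items, prices):
    """Demand query: quote (nonlinear, personalized) bundle prices and ask the
    agent for a utility-maximizing bundle, i.e. an argmax of v(S) - prices(S)."""
    all_bundles = chain.from_iterable(
        combinations(items, k) for k in range(len(items) + 1))
    return max((frozenset(s) for s in all_bundles),
               key=lambda s: v(s) - prices(s))
```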


Parallels: learning & preference elicitation

  • Membership query ↔ Value query

  • Equivalence query ↔? Demand query

  • Lindahl prices are only a constant away from manifest valuations

  • From a preferred bundle S' (the answer to a demand query), counterexamples can be computed


Polynomial-query learnable

Definition: The representation class C is polynomial-query exactly learnable from membership and equivalence queries if there is a fixed polynomial p and an algorithm L with access to membership and equivalence queries of an oracle such that, for any target function f ∈ C, L outputs after at most p(size(f), m) queries a function f̃ with f̃(x) = f(x) for all instances x.


Polynomial-query elicited

Similar to the definition of polynomial-query learnable, but:

  • Value and demand queries

  • Agents‘ valuations are target functions

  • Outputs an optimal allocation after at most p(size(v1,...,vn), m) queries

  • Valuation functions need not be determined exactly!


Converting learning algorithms

Idea proved in the paper:

If each representation class V1,…,Vn can be polynomial-query exactly learned from membership and equivalence queries,

then V1,…,Vn can be polynomial-query elicited from value and demand queries.


Converted Algorithm

1) Run the learning algorithms on the valuation classes until each one requires a response to an equivalence query


Converted Algorithm

2) Compute an optimal allocation S* and Lindahl prices L* with respect to the manifest valuations

3) Present the demand query (S*, L*) to each agent


Converted Algorithm

4) Quit if all agents answer YES; otherwise give the counterexample from agent i to learning algorithm i and go to step 1
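Putting the four steps together, here is a minimal sketch of the conversion loop under simplifying assumptions (not the paper's implementation): each element of learners is a per-agent exact-learning algorithm exposing hypothetical run_until_equivalence_query(), hypothesis() and process_counterexample() methods, optimal_allocation_and_prices computes S* and L* for the manifest valuations, and demand_query returns None when the agent answers YES and a counterexample bundle otherwise.

```python
def elicit(learners, agents, optimal_allocation_and_prices, demand_query):
    """Sketch: convert per-agent exact-learning algorithms into preference elicitation."""
    while True:
        # 1) Run each learner (using value queries) until it needs an
        #    equivalence query answered to make further progress.
        for learner in learners:
            learner.run_until_equivalence_query()

        # 2) Manifest valuations = the learners' current hypotheses; compute an
        #    optimal allocation S* and Lindahl prices L* with respect to them.
        manifest = [learner.hypothesis() for learner in learners]
        allocation, prices = optimal_allocation_and_prices(manifest)

        # 3) Present the demand query (S*, L*) to every agent.
        answers = [demand_query(agent, allocation[i], prices[i])
                   for i, agent in enumerate(agents)]

        # 4) Quit if all agents answer YES; otherwise feed each counterexample
        #    back into the corresponding learner and repeat from step 1.
        if all(answer is None for answer in answers):
            return allocation, prices
        for learner, answer in zip(learners, answers):
            if answer is not None:
                learner.process_counterexample(answer)
```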


Agenda

  • Motivation

  • Communication

  • Lindahl prices

  • Communication complexity

  • Preference Classes

  • Applying Learning Algorithms to Preference elicitation

  • Applications

    • Polynomial representation

    • XOR/DNF

    • Linear-Threshold

  • Conclusion

  • Future Work


Polynomials

  • t-sparse, multivariate polynomials:

    • at most t terms

    • each term is a product of variables (e.g. x1x3x5)

  • "Every valuation function can be uniquely written as a polynomial" [Schapire and Sellie]

    • Example: additive valuations

      • Polynomials of size m (m = number of items)

      • x1+…+xm

  • Learning algorithm:

    • at most polynomially many (in m and t) equivalence queries

    • at most polynomially many (in m and t) membership queries
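As an illustration (not from the paper), a t-sparse polynomial can be stored as a map from terms (sets of items whose indicator variables are multiplied) to coefficients; the additive valuation x1+…+xm is then just m single-item terms:

```python
def evaluate(polynomial, bundle):
    """Value of a bundle under a sparse polynomial: sum the coefficients of all
    terms whose items are fully contained in the bundle (each xi is 0 or 1)."""
    return sum(coeff for term, coeff in polynomial.items() if term <= bundle)

m = 5
# Additive valuation x1 + ... + xm: m terms, one per item, coefficient 1.
additive = {frozenset([i]): 1 for i in range(1, m + 1)}
print(evaluate(additive, frozenset({1, 3, 4})))                # -> 3

# A single term of degree m: value 2 only on the full bundle, 0 elsewhere.
full_bundle_only = {frozenset(range(1, m + 1)): 2}
print(evaluate(full_bundle_only, frozenset(range(1, m + 1))))  # -> 2
print(evaluate(full_bundle_only, frozenset({1, 2})))           # -> 0
```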


XOR/DNF Representations (1)

  • XOR bids represent valuations which have free disposal

  • Analog in learning theory: DNF formulae

    • Disjunction of conjunctions with unnegated bits

    • E.g. (x1 ∧ x3) ∨ (x2 ∧ x4)

    • Atomic bids in XOR have value 1


XOR/DNF Representations (2)

  • An XOR bid containing t atomic bids can be exactly learned with t+1 equivalence queries and at most tm membership queries

    • Each Equivalence query leads to one new atomic bid

    • via at most m membership queries per counterexample (excluding the items of the counterexample that do not belong to the atomic bid)
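A minimal sketch of that loop under the slide's assumptions (all atomic bids have value 1, free disposal); value_query plays the role of the membership query, equivalence_query returns None on YES or a counterexample bundle, and both names are illustrative:

```python
def learn_xor_bid(value_query, equivalence_query):
    """Learn an XOR bid whose atomic bids all have value 1 (a monotone DNF)."""
    atomic_bids = []  # hypothesis: v(S) = 1 iff S contains one of the learned bids
    while True:
        counterexample = equivalence_query(atomic_bids)
        if counterexample is None:
            return atomic_bids          # t + 1 equivalence queries in total
        # With free disposal, the counterexample is a bundle of value 1 that the
        # hypothesis misses.  Shrink it with at most m membership (value) queries:
        # drop every item whose removal keeps the value at 1.
        bid = set(counterexample)
        for item in list(counterexample):
            if len(bid) > 1 and value_query(bid - {item}) == 1:
                bid.discard(item)
        atomic_bids.append(frozenset(bid))  # one new atomic bid per counterexample
```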


Linear-Threshold Representations

  • r-of-S valuation: the agent has value 1 for any bundle containing at least r items from its set S of interesting items, and 0 otherwise (see the sketch below)

  • Let |S| = k; these are the r-of-k threshold functions

  • If r is known: such valuations can be learned with equivalence queries or elicited with demand queries
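For illustration, a minimal sketch of an r-of-S valuation in this standard form (names are illustrative):

```python
def r_of_s_valuation(r, interesting_items):
    """Value 1 iff the bundle contains at least r of the agent's interesting items."""
    def value(bundle):
        return 1 if len(set(bundle) & set(interesting_items)) >= r else 0
    return value

v = r_of_s_valuation(2, {1, 3, 5})
print(v({1, 3}), v({1, 2}))  # -> 1 0
```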


Important Results by Nisan, Segal

  • Important role of prices (an efficient allocation must reveal supporting Lindahl prices)

  • Efficient communication must name at least one Lindahl price for each of the 2^L possible bundles

  • Lower bound is exponential:

    no generally good communication design exists

    ⇒ focus on specific classes of preferences


Important Results by Lahaie, Parkes

  • Learning algorithm with membership and equivalence queries as basis for preference elicitation algorithm

  • If a polynomial-query learning algorithm exists for the valuation classes, preferences can be efficiently elicited with a number of queries polynomial in m and size(v1,…,vn)

    ⇒ solutions exist for polynomials, XOR bids, and linear-threshold valuations


Future Work

  • Finding more specific classes of preferences which can be elicited efficiently

  • Address issue of incentives

  • Which Lindahl prices may be used for the queries?


Thank you for your attention

Any Questions?

