
Learning Voting Trees



Presentation Transcript


  1. Learning Voting Trees Ariel D. Procaccia, Aviv Zohar, Yoni Peleg, Jeffrey S. Rosenschein

  2. Lecture Outline
  • Voting and Voting Trees
  • PAC learning
  • Results
  • Limitations

  3. Voting
  • Election: a set of voters N = {1,...,n} and a set of candidates C = {a, b, c, ...}.
  • Voters have linear preferences (rankings) over the candidates.
  • The winner of the election is determined according to a voting rule F.
  • Plurality: each voter gives 1 point to the candidate ranked in first place.
  • Copeland:
    • x1 beats x2 in a pairwise election if a majority of the voters prefer x1 to x2.
    • A candidate's score is the number of other candidates it beats in pairwise elections.
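These two rules are easy to make concrete. Below is a minimal sketch; the profile, the function names, and the tie handling in Copeland are my own choices rather than anything from the slides. The profile is the three-voter cycle that appears on the next slide.

```python
from itertools import combinations

# Each preference is a ranking: a list of candidates from most to least preferred.
profile = [
    ["c", "b", "a"],   # voter 1: c > b > a
    ["b", "a", "c"],   # voter 2: b > a > c
    ["a", "c", "b"],   # voter 3: a > c > b
]
candidates = ["a", "b", "c"]

def plurality_scores(profile, candidates):
    """Plurality: each voter gives one point to their top-ranked candidate."""
    scores = {c: 0 for c in candidates}
    for ranking in profile:
        scores[ranking[0]] += 1
    return scores

def beats(profile, x, y):
    """x beats y in a pairwise election if a strict majority of voters rank x above y."""
    wins = sum(1 for ranking in profile if ranking.index(x) < ranking.index(y))
    return wins > len(profile) / 2

def copeland_scores(profile, candidates):
    """Copeland: a candidate's score is the number of other candidates it beats pairwise
    (pairwise ties simply score nothing in this sketch)."""
    scores = {c: 0 for c in candidates}
    for x, y in combinations(candidates, 2):
        if beats(profile, x, y):
            scores[x] += 1
        elif beats(profile, y, x):
            scores[y] += 1
    return scores

print(plurality_scores(profile, candidates))  # {'a': 1, 'b': 1, 'c': 1}
print(copeland_scores(profile, candidates))   # {'a': 1, 'b': 1, 'c': 1} -- a three-way tie
```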

  4. Tournaments
  • A tournament over C is a complete, irreflexive binary relation over C.
  • It summarizes the results of the pairwise elections.
  • Example: N = {1,2,3}, C = {a,b,c}
    • Voter 1: c > b > a
    • Voter 2: b > a > c
    • Voter 3: a > c > b
    • Overall: a < b, b < c, c < a (the Condorcet paradox: the pairwise majority relation is cyclic).
  • A (pairwise) voting rule is a function from tournaments to candidates.
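A tournament can be computed directly from a preference profile by running all the pairwise elections. A small sketch of that construction (representing edges as ordered (winner, loser) pairs is my own choice), applied to the profile above:

```python
from itertools import permutations

# The three-voter profile from the slide: it induces a Condorcet cycle.
profile = [
    ["c", "b", "a"],   # voter 1
    ["b", "a", "c"],   # voter 2
    ["a", "c", "b"],   # voter 3
]
candidates = ["a", "b", "c"]

def majority_tournament(profile, candidates):
    """Return the set of ordered pairs (x, y) such that x beats y in a pairwise election."""
    edges = set()
    for x, y in permutations(candidates, 2):
        prefer_x = sum(1 for ranking in profile if ranking.index(x) < ranking.index(y))
        if prefer_x > len(profile) / 2:
            edges.add((x, y))
    return edges

print(majority_tournament(profile, candidates))
# {('b', 'a'), ('c', 'b'), ('a', 'c')} (in some order): a cycle, so there is no Condorcet winner
```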

  5. Voting Trees
  • [Figure: an example voting tree evaluated on the tournament a < b, b < c, c < a. The leaves are labeled with candidates; each internal node gets the winner of the pairwise election between the labels of its two children, and the label that reaches the root is the winner.]
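Evaluating a voting tree on a tournament is a simple bottom-up recursion. The following is a minimal sketch; the nested-tuple representation of trees and the particular example tree are my own, not the tree drawn on the slide.

```python
# A voting tree is represented here as a nested structure: a leaf is a candidate name,
# and an internal node is a pair (left_subtree, right_subtree).

def evaluate(tree, tournament):
    """Winner of a voting tree on a tournament.

    `tournament` is a set of ordered pairs (x, y) meaning x beats y in a pairwise
    election. The winner at an internal node is the winner of the pairwise election
    between the winners of its two subtrees; the label reaching the root wins."""
    if isinstance(tree, str):                     # leaf: a candidate
        return tree
    x = evaluate(tree[0], tournament)
    y = evaluate(tree[1], tournament)
    if x == y:
        return x
    return x if (x, y) in tournament else y

# The cyclic tournament from the previous slide: b beats a, c beats b, a beats c.
tournament = {("b", "a"), ("c", "b"), ("a", "c")}

tree = (("a", "b"), ("c", "a"))                   # an arbitrary small tree over a, b, c
print(evaluate(tree, tournament))                 # 'b'
```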

  6. Voting Trees
  • Voting trees are everywhere!
  • They are a concise representation of (pairwise) voting rules.
  • In general there is a doubly exponential number of pairwise voting rules, so representing an arbitrary one requires exponential size.
  • Voting trees capture many rules, such as Copeland.
  • Given some (pairwise) voting rule, we want to find a representation by a voting tree that is as accurate as possible, yet concise.
  • Idea: use learning. The designer labels tournaments with winners, and the learning algorithm outputs a “good” voting tree.

  7. PAC Learning
  • We want to learn a voting rule f (not necessarily a tree).
  • The training set consists of example pairs (Tj, f(Tj)).
  • Tj – tournaments drawn from a fixed distribution D.
  • err(h) = PrD[h(T) ≠ f(T)].
  • f* minimizes err(h) over all voting trees.
  • Goal: given ε > 0, find a voting tree g such that err(g) ≤ err(f*) + ε.
  • Q: How many examples are needed in order to guarantee that the goal is achieved with probability at least 1 − δ?
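Since err(h) is a probability over D, it can be estimated by sampling tournaments and counting disagreements with the target rule. A minimal sketch, in which the uniform distribution over tournaments, the Copeland-style target rule, and the constant hypothesis are all stand-ins of my own (the model leaves D and f abstract):

```python
import random
from itertools import combinations

CANDIDATES = ["a", "b", "c", "d"]

def random_tournament(rng):
    """Draw a tournament by orienting each pair uniformly at random
    (a stand-in for the fixed but unknown distribution D)."""
    edges = set()
    for x, y in combinations(CANDIDATES, 2):
        edges.add((x, y) if rng.random() < 0.5 else (y, x))
    return frozenset(edges)

def copeland_on_tournaments(T):
    """Stand-in target rule f: the candidate beating the most others (ties broken by list order)."""
    return max(CANDIDATES, key=lambda c: sum(1 for d in CANDIDATES if (c, d) in T))

def always_a(T):
    """A (bad) hypothesis h that always elects candidate a."""
    return "a"

def empirical_error(h, f, sample):
    """Estimate err(h) = PrD[h(T) != f(T)] on a finite sample of tournaments."""
    return sum(1 for T in sample if h(T) != f(T)) / len(sample)

rng = random.Random(0)
sample = [random_tournament(rng) for _ in range(2000)]
print(empirical_error(always_a, copeland_on_tournaments, sample))
```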

  8. Formulation of Theorems
  • Theorem: An exponential training set is needed to learn arbitrary voting trees.
  • So restrict attention to the class of voting trees of polynomial size (at most k leaves).
  • Lemma: If the size of this class is (only) exponential, the following algorithm achieves the goal with a polynomial training set: return the tree which minimizes the number of mistakes on the training set.
  • Theorem: |voting trees with ≤ k leaves| ≤ exp(m, k)
  • Proof: the number of such trees is at most
    (# possible structures × # assignments to leaves) × k ≤ (# possible structures × m^k) × k,
    which is exponential in m and k.
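The algorithm in the lemma is empirical risk minimization over small trees. Here is a brute-force illustration of that rule (not the paper's construction, and exponential in k, which foreshadows the NP-hardness result on slide 12); the enumeration and tree representation are my own:

```python
CANDIDATES = ["a", "b", "c"]

def trees_with_leaves(j):
    """Yield every voting tree (nested tuples over CANDIDATES) with exactly j leaves."""
    if j == 1:
        yield from CANDIDATES
        return
    for left_size in range(1, j):
        for left in trees_with_leaves(left_size):
            for right in trees_with_leaves(j - left_size):
                yield (left, right)

def evaluate(tree, tournament):
    """Winner of a voting tree on a tournament given as a set of (winner, loser) pairs."""
    if isinstance(tree, str):
        return tree
    x, y = evaluate(tree[0], tournament), evaluate(tree[1], tournament)
    if x == y:
        return x
    return x if (x, y) in tournament else y

def erm_tree(training_set, k):
    """Return a voting tree with at most k leaves that makes the fewest mistakes
    on the training set of (tournament, winner) pairs."""
    best, best_mistakes = None, float("inf")
    for j in range(1, k + 1):
        for tree in trees_with_leaves(j):
            mistakes = sum(1 for T, winner in training_set
                           if evaluate(tree, T) != winner)
            if mistakes < best_mistakes:
                best, best_mistakes = tree, mistakes
    return best

training_set = [({("b", "a"), ("c", "b"), ("a", "c")}, "b")]
print(erm_tree(training_set, 3))   # a single leaf 'b' already fits this one example
```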

  9. Number of tree structures
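This slide presumably plots how quickly the number of tree structures grows. For reference, the number of distinct full binary tree shapes with k leaves is the Catalan number C(k−1); a small sketch (the function name is mine):

```python
from math import comb

def num_tree_structures(k):
    """Number of full binary tree shapes with k leaves: the Catalan number C_{k-1}."""
    return comb(2 * (k - 1), k - 1) // k

for k in range(1, 8):
    print(k, num_tree_structures(k))   # 1, 1, 2, 5, 14, 42, 132 -- exponential growth
```

Combined with the m^k leaf labelings from the previous slide, this gives the exponential bound on the number of small voting trees.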

  10. Approximation by Voting Trees
  • A voting rule g is a c-approximation of f iff f and g agree on a c-fraction of the tournaments.
  • Theorem: Most voting rules cannot be approximated by small voting trees to a factor better than ½.
  • This result is not as negative as it sounds.
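For a small number of candidates, the agreement fraction between two rules can be computed exactly by enumerating all tournaments. A minimal sketch of that check (the enumeration, the Copeland-style rule, and the constant rule are my own illustrations):

```python
from itertools import combinations, product

CANDIDATES = ["a", "b", "c"]

def all_tournaments(candidates):
    """Enumerate every tournament over the candidates: each pair is oriented one way or the other."""
    pairs = list(combinations(candidates, 2))
    for bits in product([0, 1], repeat=len(pairs)):
        yield frozenset((x, y) if bit == 0 else (y, x)
                        for (x, y), bit in zip(pairs, bits))

def agreement(f, g, candidates):
    """Fraction of tournaments on which rules f and g elect the same candidate."""
    tournaments = list(all_tournaments(candidates))
    return sum(1 for T in tournaments if f(T) == g(T)) / len(tournaments)

def copeland_winner(T):
    """Candidate beating the most others in the tournament (ties broken by list order)."""
    return max(CANDIDATES, key=lambda c: sum(1 for d in CANDIDATES if (c, d) in T))

print(agreement(copeland_winner, lambda T: "a", CANDIDATES))   # 0.5 for this tiny example
```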

  11. Closing Remarks
  • Computational learning theory as a method to concisely represent voting rules.
  • Other concisely representable families: scoring rules.
    • Defined by a vector α1,...,αm: each voter awards αi points to the candidate ranked in position i.
    • Efficiently PAC learnable.
  • Open questions: Which voting rules can be approximated? Under which underlying distributions?
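Scoring rules are easy to state in code, which also shows why their representation is so compact: the whole rule is the vector α. A minimal sketch (the profile and function name are mine); plurality and Borda are the classic instances.

```python
def scoring_rule_scores(profile, candidates, alpha):
    """Scoring rule defined by alpha = (alpha_1, ..., alpha_m):
    each voter awards alpha_i points to the candidate they rank in position i."""
    scores = {c: 0 for c in candidates}
    for ranking in profile:
        for position, candidate in enumerate(ranking):
            scores[candidate] += alpha[position]
    return scores

profile = [["c", "b", "a"], ["b", "a", "c"], ["a", "c", "b"]]
candidates = ["a", "b", "c"]

plurality = (1, 0, 0)   # only first place counts
borda = (2, 1, 0)       # Borda for m = 3
print(scoring_rule_scores(profile, candidates, plurality))  # {'a': 1, 'b': 1, 'c': 1}
print(scoring_rule_scores(profile, candidates, borda))      # {'a': 3, 'b': 3, 'c': 3}
```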

  12. Encore: Computational Complexity
  • So far we were interested in sample complexity.
  • Recall: if the size of the class is (only) exponential, the following algorithm achieves the goal with a polynomial training set: return the tree which minimizes the number of mistakes on the training set.
  • Theorem: Finding such a tree is NP-hard.
  • In practice, the complexity depends on the structure of the tree.

  13. A Graph!!
