
Learning and Testing Submodular Functions

Learning and Testing Submodular Functions. Grigory Yaroslavtsev, Columbia University. With Sofya Raskhodnikova (SODA'13); work in progress with Rocco Servedio. October 26, 2012. Submodularity: a discrete analog of convexity/concavity and the law of diminishing returns.




  1. Learning and Testing Submodular Functions Grigory Yaroslavtsev, Columbia University With Sofya Raskhodnikova (SODA'13) + work in progress with Rocco Servedio October 26, 2012

  2. Submodularity • Discrete analog of convexity/concavity, law of diminishing returns • Applications: optimization, algorithmic game theory • Let f: 2^X → R • Discrete derivative: ∂_x f(S) = f(S ∪ {x}) − f(S) • Submodular function: ∂_x f(S) ≥ ∂_x f(T) for all S ⊆ T ⊆ X and x ∉ T
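The diminishing-returns condition above can be checked by brute force on a small ground set. A minimal sketch (function names and toy examples are my own, not from the slides):

```python
def is_submodular(f, n):
    """Check diminishing returns: f(S+x) - f(S) >= f(T+x) - f(T)
    for all S subset of T and x outside T; sets are bitmasks over [n]."""
    for S in range(1 << n):
        for T in range(1 << n):
            if S & T == S:                        # S subset of T
                for x in range(n):
                    if not T >> x & 1:            # x outside T
                        b = 1 << x
                        if f(S | b) - f(S) < f(T | b) - f(T):
                            return False
    return True

size = lambda S: bin(S).count("1")
# min(|S|, 2) is concave in |S|, hence submodular; |S|^2 has increasing returns.
print(is_submodular(lambda S: min(size(S), 2), 4))  # True
print(is_submodular(lambda S: size(S) ** 2, 4))     # False
```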

  3. Exact learning • Q: Reconstruct a submodular f with poly(|X|) queries (for all arguments)? • A: Only a multiplicative Õ(√|X|)-approximation is possible [Goemans, Harvey, Iwata, Mirrokni, SODA'09] • Q: On a 1 − ε fraction of points (PAC-style learning with membership queries under the uniform distribution)? • A: Almost as hard [Balcan, Harvey, STOC'11]

  4. Approximate learning • PMAC-learning (multiplicative error) with poly(|X|) queries: O(√|X|) factor, over arbitrary distributions [BH'11] • PAAC-learning (additive error ε) • Running time: n^{poly(1/ε)} [Gupta, Hardt, Roth, Ullman, STOC'11] • Running time: n^{O(1/ε²)} [Cheraghchi, Klivans, Kothari, Lee, SODA'12]

  5. Learning • For all algorithms

  6. Learning: Bigger picture • Hierarchy of valuation functions [Badanidiyuru, Dobzinski, Fu, Kleinberg, Nisan, Roughgarden, SODA'12]: Subadditive ⊇ XOS (= fractionally subadditive) ⊇ Submodular ⊇ Gross substitutes ⊇ OXS ⊇ Additive (linear); value and demand queries • Other positive results: • Learning valuation functions [Balcan, Constantin, Iwata, Wang, COLT'12] • PMAC-learning (sketching) valuation functions [BDFKNR'12] • PMAC-learning Lipschitz submodular functions [BH'10] (concentration around the average via Talagrand)

  7. Discrete convexity • Case study: functions f: {0, 1, …, m} → R • Convex: the discrete derivative f(i+1) − f(i) is nondecreasing in i • Monotone convex: convex and nondecreasing
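A one-dimensional sanity check of these definitions, as a sketch (function names are illustrative):

```python
def is_discrete_convex(vals):
    """f on {0,...,m}, given as a list of values, is convex iff the
    discrete derivative f(i+1) - f(i) is nondecreasing in i."""
    d = [b - a for a, b in zip(vals, vals[1:])]
    return all(x <= y for x, y in zip(d, d[1:]))

def is_monotone_convex(vals):
    """Monotone convex: convex and nondecreasing."""
    return is_discrete_convex(vals) and all(a <= b for a, b in zip(vals, vals[1:]))

print(is_discrete_convex([0, 1, 3, 6]))      # True: derivatives 1, 2, 3
print(is_monotone_convex([3, 1, 0, 0, 1]))   # False: convex but decreasing
print(is_discrete_convex([0, 2, 3]))         # False: derivatives 2, 1
```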

  8. Discrete submodularity • Case study: R = 1 (Boolean submodular functions f: {0,1}^X → {0,1}) • Monotone submodular = disjunction of variables • Submodular = 2-term CNF (a clause of positive literals AND a clause of negative literals)
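These Boolean-case characterizations can be verified exhaustively for small n. A sketch with hypothetical example functions:

```python
def is_submodular(f, n):
    """Exhaustive diminishing-returns check; sets encoded as bitmasks."""
    for S in range(1 << n):
        for T in range(1 << n):
            if S & T == S:
                for x in range(n):
                    if not T >> x & 1:
                        b = 1 << x
                        if f(S | b) - f(S) < f(T | b) - f(T):
                            return False
    return True

n = 4
has = lambda S, i: S >> i & 1
clause = lambda S: int(has(S, 0) or has(S, 1))                  # x0 or x1
cnf2 = lambda S: int((has(S, 0) or has(S, 1))
                     and not (has(S, 2) and has(S, 3)))         # 2-term CNF
monomial = lambda S: int(has(S, 0) and has(S, 1))               # x0 and x1
print(is_submodular(clause, n), is_submodular(cnf2, n), is_submodular(monomial, n))
# True True False: the disjunction and the 2-term CNF are submodular;
# the monomial is not (adding x1 helps more once x0 is present).
```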

  9. Discrete monotone submodularity • Case study: monotone submodular f: {0,1}^X → {0, 1, …, R}

  10. Discrete monotone submodularity • Theorem: for monotone submodular f: {0,1}^X → {0, …, R}, f(S) = max over T ⊆ S with |T| ≤ R of f(T) • f(T) ≤ f(S) for every T ⊆ S (by monotonicity); equality is attained by some T with |T| ≤ R

  11. Discrete monotone submodularity • Let S′ be a smallest subset of S such that f(S′) = f(S) • For every x ∈ S′ we have ∂_x f(S′ \ {x}) ≥ 1 (by minimality of S′) => by submodularity, f increases by at least 1 at every step of the chain ∅ ⊆ … ⊆ S′, so the restriction of f to this chain is monotone increasing => |S′| ≤ f(S′) ≤ R
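A small sanity check of this lemma on a hypothetical coverage function (coverage functions are monotone submodular, with R bounded by the universe size):

```python
from itertools import combinations

# Hypothetical example: a small coverage function, monotone submodular
# with range {0,...,R} for R = |universe| = 3.
cover = {0: {0}, 1: {0, 1}, 2: {1, 2}, 3: {2}}
R = 3
f = lambda S: len(set().union(*(cover[i] for i in S)))

ground = range(4)
ok = all(
    any(f(T) == f(S)                              # some small T matches f(S)
        for t in range(min(R, len(S)) + 1)
        for T in combinations(S, t))
    for r in range(5)
    for S in combinations(ground, r))
print(ok)  # True: every S has a subset T, |T| <= R, with f(T) = f(S)
```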

  12. Representation by a formula • Theorem: for monotone submodular f, f(S) = max over T ⊆ S with |T| ≤ R of f(T) • Notation switch: sets S ⊆ X become vectors x ∈ {0,1}^n • Monotone = no negations • (Monotone) submodular f can be represented as a (monotone) pseudo-Boolean 2R-DNF with constants in {0, …, R}: a max of terms a_t · (AND of at most 2R literals)
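For the monotone case this representation is explicit: by the lemma, one term f(T) · ⋀_{i∈T} x_i per set T with |T| ≤ R recovers f. A sketch on the same kind of hypothetical coverage example:

```python
from itertools import combinations

# Monotone pseudo-Boolean DNF from the lemma:
# f(S) = max over |T| <= R of f(T) * [T subset of S]
# (a max of ANDs of at most R variables, constants in {0,...,R}).
cover = {0: {0}, 1: {0, 1}, 2: {1, 2}, 3: {2}}
R = 3
f = lambda S: len(set().union(*(cover[i] for i in S)))

ground = range(4)
terms = [(f(T), set(T)) for t in range(R + 1) for T in combinations(ground, t)]
dnf = lambda S: max(c for c, T in terms if T <= set(S))
same = all(dnf(S) == f(S) for r in range(5) for S in combinations(ground, r))
print(same)  # True
```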

  13. Discrete submodularity • Submodular f: {0,1}^n → {0, …, R} can be represented as a pseudo-Boolean 2R-DNF with constants in {0, …, R} • Hint [Lovász] (submodular monotonization): given submodular f, define f_mon(S) = min over T ⊇ S of f(T). Then f_mon is monotone and submodular.
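The monotonization hint can be checked directly on a small non-monotone submodular function (a hypothetical coverage-minus-cost example):

```python
def is_submodular(f, n):
    """Exhaustive diminishing-returns check; sets encoded as bitmasks."""
    for S in range(1 << n):
        for T in range(1 << n):
            if S & T == S:                        # S subset of T
                for x in range(n):
                    if not T >> x & 1:            # x outside T
                        b = 1 << x
                        if f(S | b) - f(S) < f(T | b) - f(T):
                            return False
    return True

# Hypothetical non-monotone submodular function: twice a coverage value
# minus the modular cost |S|.
n = 3
cover = {0: {0}, 1: {0}, 2: {1}}
f = lambda S: (2 * len(set().union(*(cover[i] for i in range(n) if S >> i & 1)))
               - bin(S).count("1"))

def monotonization(f, n):
    # Lovasz monotonization: f_mon(S) = min over supersets T of S of f(T)
    return lambda S: min(f(T) for T in range(1 << n) if T & S == S)

g = monotonization(f, n)
monotone = all(g(S) <= g(S | 1 << x) for S in range(1 << n) for x in range(n))
print(monotone, is_submodular(g, n), is_submodular(f, n))  # True True True
```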

  14. Learning pB-formulas and k-DNF • C = class of pseudo-Boolean DNF of width k with constants in {0, …, R} • The i-slice of f is the Boolean function f_i(x) = [f(x) ≥ i] • f ∈ C iff its i-slices are k-DNF • PAC-learning f reduces to PAC-learning each of its R slices
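The slicing decomposition in miniature: the slices are nested, so f is the sum of its R slices and learning f reduces to learning each slice (the example function is hypothetical):

```python
from itertools import product

R = 3
f = lambda x: min(sum(x), R)                      # range {0,...,R}
slice_i = lambda i, x: int(f(x) >= i)             # i-slice: [f(x) >= i]

ok = all(f(x) == sum(slice_i(i, x) for i in range(1, R + 1))
         for x in product((0, 1), repeat=4))
print(ok)  # True
```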

  15. Learning pB-formulas and k-DNF • Learn every i-slice on a 1 − ε/R fraction of arguments • Learning k-DNF (k = 2R): let s = Fourier sparsity of the k-DNF • Kushilevitz–Mansour (Goldreich–Levin): poly(n, s, 1/ε) queries/time • "Attribute-efficient learning": query complexity only logarithmic in n • Lower bound: learning a random k-junta (a k-DNF) up to constant precision requires a number of queries exponential in k • Optimizations: • Slightly better than KM/GL by looking at the Fourier spectrum of k-DNF (see the SODA paper: switching lemma => sparsity bound) • Do all R iterations of KM/GL in parallel by reusing queries
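Why sparsity helps here: a k-DNF that depends on few variables has its Fourier spectrum supported on subsets of those variables, so KM/GL has few coefficients to find. A brute-force illustration (exact Fourier transform, no KM/GL machinery; the example formula is hypothetical):

```python
from itertools import product

# A 2-DNF over 4 of n = 6 variables; its nonzero Fourier coefficients
# all live on subsets of {x0, x1, x2, x3}.
n = 6
f = lambda x: int((x[0] and x[1]) or (x[2] and not x[3]))

def fourier(f, n):
    """Exact (brute-force) Fourier transform of f: {0,1}^n -> R."""
    coeffs = {}
    for S in range(1 << n):                       # S encodes a character chi_S
        c = sum(f(x) * (-1) ** sum(x[i] for i in range(n) if S >> i & 1)
                for x in product((0, 1), repeat=n)) / 2 ** n
        if abs(c) > 1e-9:
            coeffs[S] = c
    return coeffs

spec = fourier(f, n)
# Sparsity, and support confined to masks over the 4 relevant variables.
print(len(spec), all(S < 16 for S in spec))
```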

  16. Property testing • Let C be the class of submodular f: {0,1}^n → {0, …, R} • How to (approximately) test whether a given f is in C? • Property tester: (randomized) algorithm for distinguishing f ∈ C from f that is ε-far from C (dist(f, C) ≥ ε) • Key idea: k-DNFs have small approximate representations: • [Gopalan, Meka, Reingold, CCC'12] (using quasi-sunflowers [Rossman'10]): for every k-DNF formula F there exists a k-DNF formula F′ of size (k log(1/ε))^{O(k)} such that dist(F, F′) ≤ ε
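The ε-far notion in miniature: distance to a class is the minimum disagreement fraction over its members. A brute-force sketch with a toy class of disjunctions (the class and test function are my own choices):

```python
from itertools import product, combinations

# dist(f, C): minimum fraction of inputs on which f must change to join C.
n = 4
points = list(product((0, 1), repeat=n))

def dist(f, g):
    return sum(f(x) != g(x) for x in points) / len(points)

C = [lambda x, c=c: c for c in (0, 1)]            # constant functions
for r in range(1, n + 1):
    for A in combinations(range(n), r):           # disjunction over A
        C.append(lambda x, A=A: int(any(x[i] for i in A)))

maj = lambda x: int(sum(x) >= 2)                  # majority is no disjunction
print(min(dist(maj, g) for g in C))               # 0.1875, i.e. 3/16-far
```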

  17. Testing by implicit learning • Good approximation by juntas => efficient property testing [surveys: Ron; Servedio] • ε-approximation by a junta of size exponential in k (via small k-DNF representations) • Good dependence on ε: [Blais, Onak] sunflowers for submodular functions • Query complexity: independent of n • Running time: exponential in the junta size (we think it can be reduced) • We have a lower bound for testing k-DNF (reduction from Gap Set Intersection: distinguishing a random k-junta vs. a (k + O(log k))-junta requires a number of queries exponential in k)

  18. Previous work on testing submodularity • [Parnas, Ron, Rubinfeld '03; Seshadhri, Vondrák, ICS'11]: testing over the uniform distribution • Upper and lower bounds on query complexity leave a large gap • Special case: coverage functions [Chakrabarty, Huang, ICALP'12]

  19. Thanks!
