
Introduction

Presentation Transcript


  1. Introduction Lecture 1 What is structured prediction?

  2. Our goal today To define a Structure and Structured Prediction

  3. What are structures?

  4. What are some examples of structured data? • Database tables and spreadsheets • HTML documents • JSON objects • Wikipedia info-boxes • Computer programs … we will see more examples

  5. What are some examples of unstructured data? • Images and photographs • Videos • Text documents • PDF files • Books • … What makes these unstructured? How are they different from the previous list?

  6. Structured representations are useful because …. • We know how to process them • Algorithms for managing symbolic data • Computational complexity well understood • They abstract away unnecessary complexities • Why deal with text/images/etc when you can process a database with the same information?

  7. Have we made the problem easier? Water is split, providing a source of electrons and protons (hydrogen ions, H+) and giving off O2 as a by-product. Light absorbed by chlorophyll drives a transfer of the electrons and hydrogen ions from water to an acceptor called NADP+. What does the splitting of water lead to? A: Light absorption B: Transfer of ions

  8. Reading comprehension can be hard! Water is split, providing a source of electrons and protons (hydrogen ions, H+) and giving off O2 as a by-product. Light absorbed by chlorophyll drives a transfer of the electrons and hydrogen ions from water to an acceptor called NADP+. The slide annotates the passage with Enable and Cause relations between these events. What does the splitting of water lead to? A: Light absorption B: Transfer of ions To answer the question, we need a structured representation.
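
As an illustration only (not part of the slides), one way to make such a structured representation concrete is a small set of typed relations between the events in the passage; the event identifiers and relation names below are assumptions:

    # Hypothetical sketch: event-relation tuples for the photosynthesis passage.
    # Event identifiers and relation names are illustrative, not from the lecture.
    events = {
        "split_water": "Water is split, giving off O2 and providing electrons and H+",
        "absorb_light": "Light is absorbed by chlorophyll",
        "transfer_ions": "Electrons and hydrogen ions are transferred from water to NADP+",
    }
    relations = [
        ("split_water", "Enable", "transfer_ions"),
        ("absorb_light", "Cause", "transfer_ions"),
    ]
    # The question "What does the splitting of water lead to?" now reduces to
    # following the outgoing edges of "split_water": answer B, transfer of ions.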

  9. Machine learning to the rescue • Techniques from statistical learning can help build these representations • In fact, machine learning is necessary to scale up and generalize this process

  10. A detour about classification

  11. Classification • We know how to train classifiers • Given an email, spam or not spam? • Is a review positive or negative? • Which folder should an email automatically be placed into? • “Predict if a car purchased at an auction is a lemon” • And other such questions from Kaggle

  12. Standard classification setting • Notation • X: A feature representation of input • Y: One of a set of labels (spam, not-spam) • The goal: To learn a function X → Y that maps examples into a category • The standard recipe • Collect labeled examples {(x1, y1), (x2, y2), …} • Train a function f: X → Y that • Is consistent with the observed examples, and • Can hopefully be correct on new unseen examples Based on slides of Dan Roth
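
To make the recipe concrete, here is a minimal sketch (not from the slides) of learning a function f: X → Y for spam detection; it assumes scikit-learn is available and uses a tiny made-up training set:

    # Minimal sketch of the standard recipe, assuming scikit-learn is installed.
    # The toy emails and labels are made up for illustration.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    emails = ["win money now", "meeting at noon", "cheap pills online", "project report attached"]
    labels = ["spam", "not-spam", "spam", "not-spam"]           # Y: one label per example

    vectorizer = CountVectorizer()                              # X: bag-of-words features
    X = vectorizer.fit_transform(emails)

    f = LogisticRegression().fit(X, labels)                     # train f: X -> Y
    print(f.predict(vectorizer.transform(["cheap money now"]))) # predict on an unseen email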

  13. Classification is generally well understood • Theoretically: generalization bounds • We know how many examples one needs to see to guarantee good behavior on unseen examples • Algorithmically: good learning algorithms for linear representations • Efficient and can deal with high dimensionality (millions of features) • Some open questions • What is a good feature representation? • Learning protocols: how to minimize supervision, efficient semi-supervised learning, active learning Is this sufficient for solving problems like the reading comprehension one? No! Based on slides of Dan Roth
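
For reference (a standard result, not a slide from this lecture), one such sample-complexity guarantee for a finite hypothesis class H in the realizable setting is:

    m \;\ge\; \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right)

That is, with at least m labeled examples, any hypothesis consistent with the training data has error at most ε on unseen examples, with probability at least 1 − δ.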

  14. Examples where standard classification is not enough

  15. Semantic Role Labeling X: John saw the dog chasing the ball. Y: a predicate-argument representation of the sentence: See(Viewer: John, Viewed: the dog chasing the ball) Chase(Chaser: the dog, Chased: the ball)
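
As a side note (not from the slides), the predicate-argument output Y could be written down as a small data structure; the class and field names below are hypothetical:

    # Hypothetical encoding of the predicate-argument structure Y.
    from dataclasses import dataclass

    @dataclass
    class Frame:
        predicate: str
        arguments: dict  # maps a role name to the text span filling it

    y = [
        Frame("See",   {"Viewer": "John", "Viewed": "the dog chasing the ball"}),
        Frame("Chase", {"Chaser": "the dog", "Chased": "the ball"}),
    ]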

  16. Semantic Parsing X: “A python function that takes a name and prints the string Hello followed by the name and exits.” Y: X: “Find the largest state in the US.” Y: SELECT name FROM us_states WHERE size = (SELECT MAX(size) FROM us_states) In all these cases, the output Y is a structure
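
The transcript omits the target program for the first example; purely as an illustration, a Python function matching that description might look like the sketch below (the function name greet_and_exit is hypothetical):

    # Hypothetical sketch only: the slide's actual target program is not shown
    # in the transcript. "Takes a name and prints the string Hello followed by
    # the name and exits":
    import sys

    def greet_and_exit(name):
        print("Hello " + name)
        sys.exit()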

  17. What is a structure? One definition By … linguistic structure, we refer to symbolic representations of language posited by some theory of language. From the book Linguistic Structure Prediction, by Noah Smith, 2011.

  18. What is in this picture? Photo by Andrew Dressel - Own work. Licensed under Creative Commons Attribution-Share Alike 3.0

  19. Object detection The image is annotated with labeled parts: right facing bicycle, handle bar, saddle/seat, left wheel, right wheel. Photo by Andrew Dressel - Own work. Licensed under Creative Commons Attribution-Share Alike 3.0

  20. The output: A schematic showing the parts (right facing bicycle, handle bar, saddle/seat, left wheel, right wheel) and their relative layout. Once again, a structure

  21. A working definition of a structure A structure is a concept that can be applied to any complex thing, whether it be a bicycle, a commercial company, or a carbon molecule. By complex, we mean: • It is divisible into parts, • There are different kinds of parts, • The parts are arranged in a specifiable way, and, • Each part has a specifiable function in the structure of the thing as a whole From the book Analysing Sentences: An Introduction to English Syntax by Noel Burton-Roberts, 1986.

  22. What is structured prediction?

  23. Standard classification tools can’t predict structures X: “Find the largest state in the US.” Y: SELECT name FROM us_states WHERE size = (SELECT MAX(size) FROM us_states) Classification is about making one decision • Spam or not spam, or predict one label, etc We need to make multiple decisions • Each part needs a label • Should “US” be mapped to us_states or utah_counties? • Should “Find” be mapped to SELECT or FROM or WHERE? • The decisions interact with each other • If the outer FROM clause talks about the table us_states, then the inner FROM clause should not talk about utah_counties • How to compose the fragments together to create the whole structure? • Should the output consist of a WHERE clause? What should go in it?

  24. Structured prediction: Machine learning of interdependent variables • Unlike standard classification problems, many problems have structured outputs with • Multiple interdependent output variables • Both local and global decisions to be made • Mutual dependencies necessitate a joint assignment to all the output variables • Joint inference or Global inference or simply Inference • These problems are called structured output problems
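
A minimal sketch (not the lecture's formulation) of what joint inference means operationally: score complete assignments to all interdependent output variables and return the highest-scoring one that respects the constraints. The score and is_consistent functions below are placeholders for a learned model and domain constraints:

    # Hypothetical sketch of joint (global) inference by exhaustive search.
    # Real structured predictors replace this brute force with dynamic
    # programming, integer linear programming, or approximate search.
    from itertools import product

    def joint_inference(x, variables, label_sets, score, is_consistent):
        best_y, best_score = None, float("-inf")
        for assignment in product(*(label_sets[v] for v in variables)):
            y = dict(zip(variables, assignment))
            if not is_consistent(y):      # enforce mutual dependencies globally
                continue
            s = score(x, y)               # joint score of the complete structure
            if s > best_score:
                best_y, best_score = y, s
        return best_y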

  25. Computational issues • Model definition: What are the parts of the output? What are the inter-dependencies? • Data annotation difficulty • How to train the model? • Background knowledge about the domain • How to do inference? • Semi-supervised/indirectly supervised?

  26. Another look at the important issues • Availability of supervision • Supervised algorithms are well studied; supervision is hard (or expensive) to obtain • Complexity of model • More complex models encode complex dependencies between parts; complex models make learning and inference harder • Features • Most of the time we will assume that we have a good feature set to model our problem. But do we? • Domain knowledge • Incorporating background knowledge into learning and inference in a mathematically sound way
