
CSC 4510 – Machine Learning

7: Neural Networks – Exercises with AISpace. Dr. Mary-Angela Papalaskari, Department of Computing Sciences, Villanova University. Course website: www.csc.villanova.edu/~map/4510/


Presentation Transcript


  1. CSC 4510 – Machine Learning 7: Neural Networks – Exercises with AISpace Dr. Mary-Angela Papalaskari Department of Computing Sciences Villanova University Course website: www.csc.villanova.edu/~map/4510/ The slides in this presentation are adapted from: • Prof. Paula Matuszek’s Knowledge Based Systems class at Villanova http://www.csc.villanova.edu/~matuszek/spring2011/index.html • Prof. Frank Klassner’s ML class at Villanova • the University of Manchester ML course http://www.cs.manchester.ac.uk/ugt/COMP24111/ • the Stanford online ML course http://www.ml-class.org/

  2. Neural net tools in AISpace • Sample data and graph: Mail • Examples • Set properties • Initialize the parameters • Solve • How do we use it? Calculate output (a minimal sketch of this workflow appears below)
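The applet hides the details, but the workflow it walks through (set up the network, initialize the weights, "solve" by repeated weight updates, then calculate outputs for a new case) is roughly what the following hand-rolled sketch does. This is a minimal illustration with made-up data, not the applet's actual code; the layer sizes, learning rate, and step count are arbitrary assumptions.

```python
import numpy as np

# Minimal 3-input, one-hidden-layer, 1-output network trained by gradient
# descent on squared error -- a stand-in for AISpace's "initialize the
# parameters" / "solve" / "calculate output" steps.

rng = np.random.default_rng(0)

# Made-up binary training cases: 3 inputs, 1 output (assumption).
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 0], [1, 1, 0],
              [0, 0, 0], [1, 1, 1]], dtype=float)
y = np.array([[0], [1], [0], [1], [0], [1]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# "Initialize the parameters": small random weights.
W1 = rng.normal(scale=0.5, size=(3, 4))   # input -> hidden
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output

# "Solve": repeat forward pass + backpropagation weight updates.
lr = 0.5
for step in range(5000):
    hidden = sigmoid(X @ W1)
    output = sigmoid(hidden @ W2)

    # Gradients of mean squared error with respect to each weight matrix.
    d_out = (output - y) * output * (1 - output)
    d_hid = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * hidden.T @ d_out / len(X)
    W1 -= lr * X.T @ d_hid / len(X)

# "Calculate output" for a new input case.
new_case = np.array([[1, 0, 1]], dtype=float)
print(sigmoid(sigmoid(new_case @ W1) @ W2))
```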

  3. A small example: “Which class to take” • Inputs? • Outputs? • Sample data

  4. Some Examples • Example 1: 3 inputs, 1 output, all binary • Example 2: same inputs, output inverted

  5. Getting the right inputs • Example 3 • Same inputs as 1 and 2 • Same output as 1 • Outcomes reversed for half the cases

  6. Getting the right inputs • Example 3 • Same inputs as 1 and 2 • Same output as 1 • Outcomes reversed for half the cases • The network is not converging • The output here cannot be predicted from these inputs • Whatever is determining whether to take the class, we haven’t captured it (see the sketch below)
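One way to see why no amount of training helps: if the same input vector appears with different outcomes, no deterministic network can fit both cases. The sketch below uses made-up cases (the specific tuples are assumptions) and computes the best error any model could achieve by taking the majority outcome for each distinct input.

```python
from collections import defaultdict

# Made-up training cases: identical inputs sometimes carry opposite
# outcomes, so the mapping from inputs to output is not a function.
cases = [
    ((1, 0, 1), 1), ((1, 0, 1), 0),   # same inputs, conflicting outputs
    ((0, 1, 0), 1), ((0, 1, 0), 0),
    ((1, 1, 1), 1), ((0, 0, 0), 0),
]

# Best any predictor can do: answer the majority outcome for each distinct
# input; every minority case is an unavoidable error.
groups = defaultdict(list)
for inputs, outcome in cases:
    groups[inputs].append(outcome)

unavoidable = sum(min(g.count(0), g.count(1)) for g in groups.values())
print(f"Irreducible errors: {unavoidable} of {len(cases)} cases")
# -> 2 of 6 cases here: training error cannot fall below that floor,
#    so the network looks like it is "not converging".
```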

  7. Representing non-numeric values • Example 4 • Required is represented as “yes” and “no”

  8. Representing non-numeric values • Example 4 • Required is represented as “yes” and “no” • The actual model still uses 1 and 0; the transformation is done by the applet (a sketch of the mapping follows below)
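The applet's conversion amounts to a simple lookup; something like the following hand-rolled illustration (not the applet's code, and the attribute name is an assumption):

```python
# Map a binary text attribute to the 0/1 value the network actually sees.
REQUIRED_ENCODING = {"no": 0.0, "yes": 1.0}

raw_cases = [{"required": "yes"}, {"required": "no"}]
encoded = [REQUIRED_ENCODING[case["required"]] for case in raw_cases]
print(encoded)   # [1.0, 0.0]
```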

  9. More non-numeric values • Example 5 • Workload is low, med, high: text values, but they can be ordered

  10. More non-numeric values • Example 5 • Workload is low, med, high: text values, but they can be ordered • The applet asks us to assign values; 1, 0.5, 0 is typical (see the sketch below)
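For an ordered attribute the assigned numbers only need to preserve the ordering and spacing; whether "low" gets 0 or 1 does not matter. A minimal sketch of the mapping (the direction chosen here is an assumption):

```python
# Ordered text values map to evenly spaced numbers that keep the order.
WORKLOAD_ENCODING = {"low": 0.0, "med": 0.5, "high": 1.0}

print([WORKLOAD_ENCODING[w] for w in ["high", "low", "med"]])
# [1.0, 0.0, 0.5] -- "med" sits between "low" and "high", as it should.
```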

  11. Unordered values • Example 6 • Input variables here include professor • Non-numeric, can’t be ordered

  12. Unordered values • Example 6 • Input variables here include professor • Non-numeric, can’t be ordered • We still need numeric values • The solution is to treat n possible values as n separate binary values • Again, the applet does this for us (a sketch follows below)
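Treating n possible values as n separate binary inputs is the usual one-hot (1-of-n) encoding. A minimal sketch of what the applet is doing behind the scenes (the professor names are invented examples):

```python
# One-hot encode an unordered attribute: n possible values become
# n separate 0/1 inputs, exactly one of which is 1 per case.
PROFESSORS = ["Papalaskari", "Matuszek", "Klassner"]   # hypothetical value set

def one_hot(value, categories):
    return [1.0 if value == category else 0.0 for category in categories]

print(one_hot("Matuszek", PROFESSORS))   # [0.0, 1.0, 0.0]
```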

  13. Variables with more values • Example 7 • GPA and number of classes taken are integer values • Takes considerably longer to solve • Looks for a while like it’s not converging

  14. And Reals • Example 8 • GPA is a real • Takes about 20,000 steps to converge (see the note below)
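The slide does not say so, but a standard trick when a real-valued input such as GPA spans a wider range than the 0/1 inputs is to rescale it into [0, 1], which usually helps gradient descent converge faster. A minimal sketch under the assumption of a 0.0–4.0 grading scale:

```python
# Min-max scale a real-valued input so it lives on the same [0, 1]
# scale as the binary and one-hot inputs.
GPA_MIN, GPA_MAX = 0.0, 4.0   # assumed grading scale

def scale_gpa(gpa):
    return (gpa - GPA_MIN) / (GPA_MAX - GPA_MIN)

print([round(scale_gpa(g), 3) for g in [2.0, 3.5, 4.0]])
# [0.5, 0.875, 1.0]
```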

  15. And multiple outputs • Small Car database from AISpace • For any given input case, you get a value for each possible outcome • Typical for tasks such as character recognition (see the sketch below)
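With one output unit per possible outcome, the predicted class is simply the unit with the largest value. A small sketch; the outcome labels and output values below are invented, loosely in the spirit of a car-evaluation task:

```python
# One output unit per possible outcome; pick the largest as the answer.
outcomes = ["unacceptable", "acceptable", "good", "very_good"]   # hypothetical labels
outputs = [0.71, 0.12, 0.09, 0.08]                               # made-up network outputs

best = max(range(len(outcomes)), key=lambda i: outputs[i])
print(outcomes[best])   # "unacceptable"
```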

  16. Training and Test Cases • The basic training approach fits the training data as closely as possible • But we really want something that will generalize to other cases • This is why we have test cases • The training cases are used to compute the weights • The test cases tell us how well the weights generalize • Both training and test cases should represent the overall population as well as possible (a sketch follows below)
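A common way to check whether the learned weights generalize is to hold some cases out of training and compare error on the two sets. A minimal sketch under invented data and an assumed 75/25 split, using a single sigmoid unit as a stand-in for the trained network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented dataset: 3 binary inputs; the outcome happens to equal input 0.
X = rng.integers(0, 2, size=(40, 3)).astype(float)
y = (X[:, 0] > 0).astype(float)

# Hold out 25% of the cases as a test set; train only on the rest.
idx = rng.permutation(len(X))
split = int(0.75 * len(X))
train, test = idx[:split], idx[split:]

# Stand-in "trained model": one sigmoid unit fit by gradient descent
# on the training cases only.
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X[train] @ w))
    w -= 0.5 * X[train].T @ (p - y[train]) / len(train)

def error(rows):
    pred = (1 / (1 + np.exp(-X[rows] @ w)) > 0.5).astype(float)
    return np.mean(pred != y[rows])

print("training error:", error(train))   # how well the weights fit
print("test error:    ", error(test))    # how well they generalize
```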

  17. Representative Training Cases • Example 9 • Training cases and test cases are similar

  18. Representative Training Cases • Example 9 • Training cases and test cases are similar • (actually identical...) • Training error and test error are comparable

  19. Non-Representative Training Cases • Example 10 • Training cases and test cases represent different circumstances • We’ve missed including any cases involving Lee in the training set • Training error goes down, but test error goes up • In reality these are probably bad training AND test cases; neither seems representative

  20. So: • Getting a good ANN still involves understanding your domain and capturing knowledge about it • choosing the right inputs and outputs • choosing representative training and test sets • Beware “convenient” training sets • You can represent any kind of variable: numeric or not, ordered or not • Not every set of variables and training cases will produce something that can be trained

  21. Once it’s trained... • When your ANN is trained, you can feed it a specific set of inputs and get one or more outputs • These outputs are typically interpreted as some decision: take the class, or this is probably a “5” • The network itself is a black box • If the situation changes, the ANN should be retrained: new variables, new values for some variables, new patterns of cases (a sketch follows below)
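Turning the numeric output into a decision is usually just a threshold for a single output (or an argmax when there are several, as in the earlier multiple-outputs sketch). A small illustration; the 0.5 cutoff and the decision labels are assumptions:

```python
# Interpret a single trained output as a yes/no decision.
def decide(network_output, threshold=0.5):
    # Above the cutoff -> positive decision; below -> negative.
    return "take the class" if network_output >= threshold else "skip it"

print(decide(0.83))   # take the class
print(decide(0.27))   # skip it
```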

  22. One last note • These have all been simple cases, as examples • Most of these examples could in fact be predicted much more easily and cleanly with a decision tree (or even a couple of IF statements!) • A more typical use for a neural net system has many more inputs and many more training cases

  23. And one last example: • Our class data example (apples or oranges?) • Homework: Use what you learned today with this example; write up your comments and include some screenshots showing your work • Due Tuesday 3/20/12
