
Outline of the Topics Covered in the Machine Learning Interface Course (see full outline for more detail)


Presentation Transcript


  1. Outline of the Topics Covered in the Machine Learning Interface Course (see full outline for more detail). Marc Sobel

  2. Stat 9180: Topics for the interface between Statistics, Statistical Learning, Machine Learning, Data Mining, and Computer Vision. • Time: Monday evenings, 7:15-9:25, Fall Semester 2007. • Place: Tuttleman 401B. • Course Number: Old = 701; New = 9180. • Instructor: Marc Sobel, Department of Statistics, Temple University. • Office: 338 Speakman Hall.

  3. Introduction • This course is designed to cover Bayesian and statistical learning topics relevant to the fields of Machine Learning, Data Mining, and Computer Vision. Prerequisites for the course include a knowledge of lower-level algebra and pre-calculus. For credit, students must complete a semester project dealing with one or more of the areas listed below. Projects can be concerned with the statistical techniques themselves or with relevant applications; I will suggest possible projects throughout the course. The course will cover statistical techniques, with applications, including the following:

  4. Topics Discussed • 1. Clustering: the interface between k-means, EM-based clustering, and enhanced k-means clustering. • 2. Bayes' Theorem: Occam's Razor and the reason for avoiding classical statistics; the advantages of Bayes' theorem. • 3. Markov Chain Monte Carlo in Computational Analysis. • 4. Boosting in statistics and machine learning.
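
As a concrete reference point for topic 1 above, here is a minimal NumPy sketch of Lloyd's k-means (k-means can be viewed as the hard-assignment limit of EM for a spherical Gaussian mixture). The synthetic data, the function name kmeans, and all parameter values are illustrative assumptions, not taken from the course materials.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Minimal Lloyd's k-means: alternate nearest-centroid assignment
    and centroid updates until the centroids stop moving."""
    rng = np.random.default_rng(seed)
    # Initialise centroids by sampling k distinct data points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: label each point with its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points
        # (keep the old centroid if a cluster happens to be empty).
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two synthetic 2-D Gaussian clusters.
    X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
    centroids, labels = kmeans(X, k=2)
    print(centroids)
```

Replacing the hard argmin assignment with posterior cluster probabilities, and the plain means with weighted means, turns the same loop into EM for a Gaussian mixture, which is the interface the slide refers to.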

  5. Topics (continued) • 5. The role of ‘distance’ and ‘density’ in formulating statistical models. The special role of Kullback-Leibler Divergence. • 6. Sequential Markov Chain Monte Carlo: using Bayesian filters and particles to solve problems in inference. • 7. Robot Mapping and the alignment of maps.
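
As an illustration of topic 6 (sequential Monte Carlo with Bayesian filters and particles), here is a sketch of a bootstrap particle filter (propagate, weight, resample) for an invented one-dimensional Gaussian random-walk model observed with noise. The model, the noise levels, and the function name bootstrap_filter are assumptions made purely for illustration; they are not from the course.

```python
import numpy as np

def bootstrap_filter(y, n_particles=1000, proc_sd=1.0, obs_sd=1.0, seed=0):
    """Bootstrap (sequential importance resampling) filter for the model
    x_t = x_{t-1} + N(0, proc_sd^2),  y_t = x_t + N(0, obs_sd^2)."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)   # draw from an initial prior
    means = []
    for obs in y:
        # Propagate particles through the random-walk transition.
        particles = particles + rng.normal(0.0, proc_sd, n_particles)
        # Weight particles by the Gaussian observation likelihood.
        log_w = -0.5 * ((obs - particles) / obs_sd) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Multinomial resampling to combat weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        means.append(particles.mean())   # posterior mean estimate of x_t
    return np.array(means)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.cumsum(rng.normal(0, 1, 100))   # latent random walk
    y = x + rng.normal(0, 1, 100)          # noisy observations
    est = bootstrap_filter(y)
    print("mean absolute filtering error:", np.mean(np.abs(est - x)))
```

Resampling at every step is the simplest scheme; in practice resampling is often triggered only when the effective sample size drops, which reduces Monte Carlo noise.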

  6. Topics (More) • 8. Statistics and Shape Theory. • 9. The use of robust statistical techniques for clustering and inference. • 10. Random Fields and Hidden Markov Models in applications. • 11. Additional Topics?
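
For topic 10 (Hidden Markov Models), here is a toy sketch of the forward recursion, which computes the log-likelihood of an observation sequence under a two-state, discrete-emission HMM, with rescaling to avoid underflow. The transition, emission, and initial probabilities below are made-up illustrative numbers, not course material.

```python
import numpy as np

def hmm_log_likelihood(obs, init, trans, emit):
    """Forward algorithm: log p(obs) for a discrete-emission HMM.
    init[i] = p(z_0 = i), trans[i, j] = p(z_t = j | z_{t-1} = i),
    emit[i, k] = p(x_t = k | z_t = i)."""
    alpha = init * emit[:, obs[0]]          # joint p(z_0, x_0)
    log_like = 0.0
    for x in obs[1:]:
        # Rescale to avoid underflow, accumulating the log of the scale.
        scale = alpha.sum()
        log_like += np.log(scale)
        alpha /= scale
        # Forward recursion: propagate the state, then weight by the emission.
        alpha = (alpha @ trans) * emit[:, x]
    return log_like + np.log(alpha.sum())

if __name__ == "__main__":
    init  = np.array([0.6, 0.4])
    trans = np.array([[0.9, 0.1],
                      [0.2, 0.8]])
    emit  = np.array([[0.7, 0.3],           # state 0 mostly emits symbol 0
                      [0.2, 0.8]])          # state 1 mostly emits symbol 1
    obs = [0, 0, 1, 1, 1, 0]
    print("log-likelihood:", hmm_log_likelihood(obs, init, trans, emit))
```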

  7. Bibliography: (The titles in red are of particular interest/value for the course) • [1] Anderson, Ted, An Introduction to Multivariate Statistical Analysis, Wiley-Interscience, 2003. • [2] Baldi, P., and Brunak, S., Bioinformatics: The Machine Learning Approach, MIT Press. • [3] Carlin, B.P., and Louis, T.A., Bayes and Empirical Bayes Methods for Data Analysis, Chapman and Hall, 1996. • [4] Cox, Trevor F., Multidimensional Scaling, Chapman and Hall, 2001. • [5] Doucet, A., Freitas, N., and Gordon, N., Sequential Monte Carlo Methods in Practice, Springer, 2001. • [6] Eaton, Morris, Multivariate Statistics: A Vector Space Approach, Wiley, 1983. • [7] Frey, B., Graphical Models for Machine Learning and Digital Communication, MIT Press, 1998. • [8] Hardle, Wolfgang, Smoothing Techniques, Springer, 1990. • [9] Hardle, Wolfgang, Nonparametric and Semiparametric Models, Springer, 2004.

  8. Bib (more) • [10] Hsu, Jason, Multiple Comparisons: Theory and Methods, Chapman and Hall, 1996. • [11] Huber, Peter, Robust Statistical Procedures, SIAM, 1996. • [12a] Krim, H., and Yezzi, A., Statistics and Shape Analysis. • [12] Li, Stan Z., Markov Random Field Modeling in Image Analysis, Springer Computer Science Workbench, 2001. • [13] Liu, Jun S., Monte Carlo Strategies in Scientific Computing, Springer, 2001. • [14] MacKay, David, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003. • [15] Neal, Radford, Bayesian Learning for Neural Networks, Springer, 1996. • [16] Rousseeuw, Peter W., Robust Regression and Outlier Detection, Wiley-Interscience, 2003. • [17] Schmidli, Heinz, Reduced Rank Regression: With Applications to Quantitative Structure-Activity Relationships, Physica-Verlag, 1995.

  9. Bib (more) • [18] Tanner, Martin, Tools for Statistical Inference: Methods for the Exploration of Posterior Distributions and Likelihood Functions, Springer, 1996. • [19] Thrun, Sebastian, Burgard, and Fox, Probabilistic Robotics. • [20] Hastie, Tibshirani, and Friedman, The Elements of Statistical Learning, Springer, 2001. • [21] Timm, Neil H., Applied Multivariate Analysis, Springer, 2006. • [22] Tapia, R., and Thompson, J.R., Nonparametric Density Estimation, Johns Hopkins, 1978. • [23] Vapnik, Vladimir, The Nature of Statistical Learning Theory, Second Edition, Springer, 2000. • [24] Weisberg, Sanford, Applied Linear Regression, Wiley, 1995. • [25] Wilcox, Rand R., Introduction to Robust Estimation and Hypothesis Testing, Academic Press, 1997. • [26] Winkler, Gerhard, Image Analysis, Random Fields, and Dynamic Monte Carlo Methods: A Mathematical Introduction, Springer, 2003.
