
CS 175, Fall 2007

Padhraic Smyth

Department of Computer Science

University of California, Irvine

Project Feedback


Timeline

  • Progress report and demo script

    • Completed and graded

    • Individual discussions/consultations today

  • Thursday Dec 6th: Student Presentations:

    • About 4 minutes per student, + questions

    • Format will be discussed this Thursday

  • Wednesday Dec 12th: Final project reports due


Today

  • Return of graded progress reports

  • Brief discussion/feedback (with slides) on progress reports

  • Discussion with each individual student on their progress so far


Progress Reports

  • Maximum of 20 points

  • Mean, median ~ 14 points

    • Several scores in 15 to 18 range

    • Some scores of 10 or lower

      • Need to pay serious attention to your project

  • Writing generally better than for proposals

    • Many people still not using figures!




General Comments on Reports

  • Feel free to re-use text/figures from proposal or progress report in your final report

  • Compare your algorithm with simple baselines

    • E.g., is it performing better than random guessing?

  • If the problem seems too hard (accuracy low, runtime too slow, etc.), try “backing off” to a simpler problem, e.g.:

    • Use a perceptron or kNN instead of an ANN

    • Look at a 2-class problem instead of 4 or more classes

    • Remove problematic individuals/images (particularly in training)

    • Etc.

  • Beware of “shirt-matching” in individual recognition

  • Use figures!!
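One of the simpler fallback classifiers suggested above, 1-nearest-neighbour, can be sketched in a few lines. This is an illustrative sketch only; the points and labels below are made up, and `nn_classify` is not a function from the class code.

```python
# Sketch (toy data): a minimal 1-nearest-neighbour baseline classifier.

def nn_classify(train_x, train_y, x):
    """Predict the label of the single closest training point (1-NN)."""
    dists = [sum((a - b) ** 2 for a, b in zip(pt, x)) for pt in train_x]
    return train_y[dists.index(min(dists))]

# Hypothetical 2-class problem in 2-D
train_x = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
train_y = ["c1", "c1", "c2", "c2"]

print(nn_classify(train_x, train_y, (0.05, 0.1)))  # near the c1 cluster
print(nn_classify(train_x, train_y, (0.95, 1.0)))  # near the c2 cluster
```

A baseline this simple takes minutes to implement, which is exactly why it is worth comparing against before trusting a more complicated model.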


Project Feedback: Templates

  • Problems with speed of template matching:

    • Template size = m²

    • Image size = n²

    • Template matching = O(m²n²)

    • E.g., a 100×100 template (m = 100) on a 1000×1000 image (n = 1000) gives 10¹⁰ operations

  • Options?

    • Consider reducing the scale of both the template and the image

      • E.g., reduction in x and y by factor of 2 will give 16x speedup

    • Could consider using “sparse” matching

      • Template = m × m: only match to every kth pixel, e.g., m = 64, match to every 4th or 8th pixel

    • Use the template_match.m function provided on the class Web site to see if it's faster than your own implementation
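The "sparse" matching idea above can be sketched as follows. This is a minimal pure-Python sum-of-squared-differences matcher on a tiny synthetic image; `ssd_match`, its stride parameter, and the toy sizes are illustrative assumptions, not the provided template_match.m.

```python
# Sketch (assumed setup): full vs. strided ("sparse") template matching,
# counting pixel comparisons to show the speedup from skipping pixels.

def ssd_match(image, template, stride=1):
    """Slide template over image; score every `stride`-th template pixel.
    Returns (best_row, best_col, ops) where ops counts pixel comparisons."""
    n, m = len(image), len(template)
    best, best_pos, ops = float("inf"), (0, 0), 0
    for r in range(n - m + 1):
        for c in range(n - m + 1):
            score = 0
            for i in range(0, m, stride):
                for j in range(0, m, stride):
                    score += (image[r + i][c + j] - template[i][j]) ** 2
                    ops += 1
            if score < best:
                best, best_pos = score, (r, c)
    return best_pos[0], best_pos[1], ops

# Toy 12x12 image containing a 4x4 bright patch at row 5, column 3
image = [[0] * 12 for _ in range(12)]
template = [[9] * 4 for _ in range(4)]
for i in range(4):
    for j in range(4):
        image[5 + i][3 + j] = 9

r, c, ops_full = ssd_match(image, template, stride=1)
_, _, ops_sparse = ssd_match(image, template, stride=2)
print(r, c)                    # location of the best full match
print(ops_full // ops_sparse)  # stride 2 -> 4x fewer comparisons
```

Matching every kth pixel cuts the per-position cost by roughly k², at the risk of missing a match that falls between the sampled pixels.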


Project Feedback: More on Templates

  • Using average images as templates:

    • Good idea?

    • Should be compared to using individual images


Project Feedback: Classification Accuracy

  • Always compare/interpret your results relative to a baseline

  • Accuracy of random guessing

    • If there are m classes, the expected accuracy is 1/m

    • E.g., 4 classes, accuracy of random guessing will be 25% on average

  • Accuracy of picking the most likely class

    • If classes are not equally likely, then picking the most likely class in the training data will have accuracy = probability of most likely class

    • E.g., 2 classes, p(c1) = 0.8, p(c2) = 0.2. Always picking c1 will have accuracy of 0.8

    • Same as random guessing if classes are equally likely

  • Compare with simple classifiers

    • If you are using a complicated classifier (like an ANN), you should compare to a simpler classifier (perceptron, kNN, min distance)
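The two baselines above can be computed directly. The training labels below are hypothetical, chosen only to match the p(c1) = 0.8, p(c2) = 0.2 example.

```python
# Sketch (hypothetical labels): baseline accuracies to compare against.
from collections import Counter

# Made-up training labels for a 2-class problem, p(c1)=0.8, p(c2)=0.2
train_labels = ["c1"] * 80 + ["c2"] * 20

m = len(set(train_labels))
random_guess_acc = 1.0 / m  # uniform random guessing over m classes

# Always predict the most frequent class in the training data
majority_class, count = Counter(train_labels).most_common(1)[0]
majority_acc = count / len(train_labels)

print(random_guess_acc)  # 0.5
print(majority_acc)      # 0.8
```

If your classifier does not clearly beat both numbers, it has not yet learned anything useful from the features.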


Project Feedback: Classification Results

  • Basic metric = cross-validated classification accuracy

    • But there are other things you can report as well

  • “Confusion matrix”

    • M classes

    • Table with M rows and M columns, 1 per class

    • Rows = true class labels, columns = predicted class labels

    • Entry(i,j) = number of times true class i was predicted as class j
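A confusion matrix with this convention can be built in a few lines; the labels below are made up for illustration.

```python
# Sketch (toy labels): building an M x M confusion matrix,
# rows = true class, columns = predicted class.

def confusion_matrix(true_labels, pred_labels, classes):
    """Entry [i][j] = number of times true class i was predicted as class j."""
    index = {c: k for k, c in enumerate(classes)}
    cm = [[0] * len(classes) for _ in classes]
    for t, p in zip(true_labels, pred_labels):
        cm[index[t]][index[p]] += 1
    return cm

true_y = ["a", "a", "b", "b", "b", "c"]
pred_y = ["a", "b", "b", "b", "c", "c"]
cm = confusion_matrix(true_y, pred_y, ["a", "b", "c"])
for row in cm:
    print(row)
# Off-diagonal entries show which classes get confused with which
```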


Project Feedback: Classification Results

  • Example of “Confusion matrix”

  • Perfect classification -> no off-diagonal entries

  • Patterns of errors can help in diagnosing systematic errors in the classifier

  • See also “receiver operating characteristic” (good entry in Wikipedia)

    • Good for evaluating systems that have adjustable thresholds

    • Illustrates trade-off between true-detections and false alarms

[Example confusion matrix table: rows = True Class, columns = Predicted Class]


Project Feedback: Using Thresholds

  • Many projects are using thresholds in their algorithms

    • E.g., threshold on distance in template-matching

  • You should report how sensitive your system is to the specific threshold value it uses

    • Vary the threshold (increase/decrease by 10%, 20%) and generate a table of results for different threshold values

      • Does accuracy change much as the threshold changes?

  • How would your system select a threshold for a new set of images?

    • Manually?

    • Could you automate the threshold selection process?

      • E.g., use cross-validation to generate results over a range of possible threshold values and pick the one that performs best
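The threshold sweep described above might look like this. The distances and labels are hypothetical, and for a real report you would compute each accuracy by cross-validation rather than on one fixed set.

```python
# Sketch (hypothetical data): sweep a decision threshold and tabulate accuracy.

# Made-up template-matching distances and true labels (1 = correct match)
scores = [0.2, 0.4, 0.35, 0.8, 0.7, 0.9, 0.1, 0.6]
labels = [1, 1, 1, 0, 0, 0, 1, 0]

def accuracy_at(threshold):
    """Predict 'match' when the distance falls below the threshold."""
    preds = [1 if s < threshold else 0 for s in scores]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

best_t, best_acc = None, -1.0
for t in [0.3, 0.4, 0.5, 0.6, 0.7]:
    acc = accuracy_at(t)
    print(f"threshold {t:.1f}: accuracy {acc:.3f}")
    if acc > best_acc:
        best_t, best_acc = t, acc

print("best threshold:", best_t)
```

The resulting table answers both questions at once: how flat the accuracy curve is around your chosen threshold, and which value you would pick automatically.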


Timeline

  • Progress report and demo script

    • Completed and graded

    • Individual discussions/consultations today

  • Thursday Dec 6th: Student Presentations:

    • About 4 minutes per student, + questions

    • Format will be discussed this Thursday

  • Wednesday Dec 12th: Final project reports due