
  1. Predicting Student Performance: An Application of Data Mining Methods with an Educational Web-based System FIE 2003, Boulder, Nov 2003 Behrouz Minaei-Bidgoli, Deborah A. Kashy, Gerd Kortemeyer, William F. Punch Michigan State University

  2. Topics
  • Problem Overview
  • Classification Methods
  • Combination of Classifiers
  • Weighting the Features
  • Using a GA to Choose the Best Set of Weights
  • Experimental Results
  • Conclusion
  • Contribution
  • Next Steps

  3. Problem Overview
  • This research is part of the latest online educational system developed at Michigan State University (MSU): the Learning Online Network with Computer-Assisted Personalized Approach (LON-CAPA).
  • In LON-CAPA, we work with two kinds of large data sets:
  • Educational resources: web pages, demonstrations, simulations, individualized problems, quizzes, and examinations.
  • Information about users who create, modify, assess, or use these resources.
  • Find classes of students: groups of students who use these online resources in a similar way.
  • Predict, for any individual student, to which class he/she belongs.

  4. Data Set: PHY183 SS02
  • 227 students
  • 12 homework sets
  • 184 problems

  5. Class Labels (3 ways)
  • 2-Classes
  • 3-Classes
  • 9-Classes
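The three labeling schemes can be illustrated by collapsing a fine-grained 9-level grade label into 3-class and 2-class labels. This is a hypothetical sketch: the cutoffs below are illustrative assumptions, not taken from the slides.

```python
# Illustrative label mappings; the cutoffs (3/6 and 4) are assumptions.
def to_3_classes(grade9):
    """Map a 9-level label (1 = lowest .. 9 = highest) to low/middle/high."""
    if grade9 <= 3:
        return "low"
    elif grade9 <= 6:
        return "middle"
    return "high"

def to_2_classes(grade9):
    """Map a 9-level label to fail/pass."""
    return "fail" if grade9 <= 4 else "pass"
```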

  6. Extracted Features
  • Total number of correct answers (success rate)
  • Success at the first try
  • Number of attempts to get the answer
  • Time spent until correct
  • Total time spent on the problem
  • Participation in the communication mechanisms
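As an illustrative sketch only (not the LON-CAPA extraction code), several of the features above can be computed from a hypothetical per-student attempt log, where each entry is `(problem_id, n_attempts, solved, seconds_to_correct)`:

```python
# Hypothetical feature extraction from an attempt log; the log format
# and feature names are assumptions for illustration.
def extract_features(log):
    n = len(log)
    solved = [e for e in log if e[2]]
    return {
        "success_rate": len(solved) / n,
        "first_try_rate": sum(1 for e in solved if e[1] == 1) / n,
        "avg_attempts": sum(e[1] for e in log) / n,
        "avg_time_to_correct": sum(e[3] for e in solved) / len(solved) if solved else 0.0,
    }

log = [("p1", 1, True, 120), ("p2", 3, True, 600), ("p3", 2, False, 0)]
feats = extract_features(log)
```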

  7. Classifiers
  • Non-Tree Classifiers (using MATLAB)
  • Bayesian Classifier
  • 1NN
  • kNN
  • Multi-Layer Perceptron
  • Parzen Window
  • Combination of Multiple Classifiers (CMC)
  • Genetic Algorithm (GA)
  • Decision Tree-Based Software
  • C5.0 (RuleQuest; successor of C4.5, which succeeded ID3)
  • CART (Salford Systems)
  • QUEST (Univ. of Wisconsin)
  • CRUISE [uses an unbiased variable selection technique]
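A minimal sketch of one of the non-tree classifiers above (kNN), assuming numeric feature vectors and a majority vote over the k nearest neighbors (this is a generic kNN, not the slides' MATLAB implementation):

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, x, k=3):
    """Predict the label of x by majority vote among its k nearest neighbors."""
    dists = sorted((math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: two "low" students near the origin, two "high" students far away.
X = [(0, 0), (0, 1), (5, 5), (6, 5)]
y = ["low", "low", "high", "high"]
```

With k = 1 this reduces to the 1NN classifier also listed on the slide.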

  8. GA Optimizer vs. Classifier
  • Apply the GA directly as a classifier, or
  • Use the GA as an optimization tool for resetting the parameters in other classifiers.
  • Most applications of GAs in pattern recognition use the GA as an optimizer for some parameters in the classification process.
  • Here, the GA is applied to find an optimal set of feature weights that improve classification accuracy.
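The feature-weighting idea can be sketched as follows: the GA chromosome is a weight vector w, and the classifier's distance metric scales each squared feature difference by its weight, so the GA can emphasize informative features and suppress noisy ones. A minimal sketch, assuming a Euclidean-style metric:

```python
import math

def weighted_dist(x, y, w):
    """Euclidean distance with each squared feature difference scaled by w."""
    return math.sqrt(sum(wi * (a - b) ** 2 for wi, a, b in zip(w, x, y)))
```

Setting a weight to 0 removes that feature entirely, so feature selection is a special case of this weighting.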

  9. Fitness/Evaluation Function
  • 5 classifiers:
  • Multi-Layer Perceptron (~2 minutes)
  • Bayesian Classifier
  • 1NN
  • kNN
  • Parzen Window
  • CMC (~3 seconds)
  • Divide data into training and test sets (10-fold cross-validation)
  • Fitness function: performance achieved by the classifier
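The fitness of a candidate weight vector is the classifier's cross-validated accuracy. A hedged Python sketch: `classify` is a stand-in for any of the five classifiers, and the fold split here is a simple stride, not necessarily the split the authors used.

```python
def cv_fitness(data, labels, w, classify, k=10):
    """Fitness of weight vector w = k-fold cross-validated accuracy of classify."""
    folds = [list(range(i, len(data), k)) for i in range(k)]
    correct = 0
    for fold in folds:
        # Train on everything outside the fold, test on the fold.
        train = [(data[j], labels[j]) for j in range(len(data)) if j not in fold]
        for j in fold:
            if classify(train, data[j], w) == labels[j]:
                correct += 1
    return correct / len(data)
```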

  10. Individual Representation
  • The GA Toolbox supports binary, integer, and floating-point chromosome representations.
  • Chrom = crtrp(NIND, FieldDR) creates a random real-valued matrix of size NIND x d, where NIND specifies the number of individuals in the population and d is the number of features.
  • FieldDR = [ 0 0 0 0 0 0;  % lower bound
               1 1 1 1 1 1 ]; % upper bound
  • Chrom = 0.23 0.17 0.95 0.38 0.06 0.26
            0.35 0.09 0.43 0.64 0.20 0.54
            0.50 0.10 0.09 0.65 0.68 0.46
            0.21 0.29 0.89 0.48 0.63 0.89
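A Python analogue of the toolbox's crtrp, assuming the semantics described on the slide: create NIND random real-valued individuals, each gene drawn uniformly between its field's lower and upper bound.

```python
import random

def crtrp(nind, field_dr):
    """Random real-valued population: nind individuals within per-gene bounds."""
    lower, upper = field_dr
    return [[random.uniform(lo, hi) for lo, hi in zip(lower, upper)]
            for _ in range(nind)]

FieldDR = ([0, 0, 0, 0, 0, 0],  # lower bounds
           [1, 1, 1, 1, 1, 1])  # upper bounds
Chrom = crtrp(4, FieldDR)
```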

  11. Simple GA
  % Create initial population
  Chrom = crtrp(NIND, FieldDR);
  % Evaluate initial population
  ObjV = ObjFn2(data, Chrom);
  gen = 0;
  while gen < MAXGEN,
      % Assign fitness values to entire population
      FitnV = ranking(ObjV);
      % Select individuals for breeding
      SelCh = select(SEL_F, Chrom, FitnV, GGAP);
      % Recombine selected individuals (crossover)
      SelCh = recombin(XOV_F, SelCh, XOVR);
      % Perform mutation on offspring
      SelCh = mutate(MUT_F, SelCh, FieldDR, MUTR);
      % Evaluate offspring, call objective function
      ObjVSel = ObjFn2(data, SelCh);
      % Reinsert offspring into current population
      [Chrom, ObjV] = reins(Chrom, SelCh, 1, 1, ObjV, ObjVSel);
      gen = gen + 1;
      MUTR = MUTR + 0.001;
  end
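The MATLAB loop above translates almost line for line into a runnable Python sketch. Here rank-based truncation selection, arithmetic crossover, and bounded Gaussian mutation are stand-ins for the toolbox's select/recombin/mutate (the toolbox's exact operators are an assumption), and a toy objective replaces the classifier-accuracy fitness.

```python
import random

def ga(objective, n_genes, pop_size=20, max_gen=50, mut_rate=0.1, seed=0):
    """Maximize objective over [0, 1]^n_genes with a simple elitist GA."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(max_gen):
        scored = sorted(pop, key=objective, reverse=True)
        parents = scored[: pop_size // 2]           # rank-based truncation selection
        children = []
        while len(children) < pop_size - 1:         # leave one slot for the elite
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # arithmetic crossover
            child = [min(1.0, max(0.0, g + rng.gauss(0, 0.1)))
                     if rng.random() < mut_rate else g
                     for g in child]                # bounded Gaussian mutation
            children.append(child)
        pop = [scored[0]] + children                # elitist reinsertion
    return max(pop, key=objective)

# Toy objective: maximize the sum of the weights (optimum near all ones).
best = ga(lambda w: sum(w), n_genes=6)
```

In the slides' setting, the objective would be the cross-validated accuracy of a classifier using the chromosome as feature weights.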

  12. Results without GA

  13. Results of using GA

  14. (figure)

  15. GA Optimization Results

  16. GA Optimization Results

  17. GA Optimization Results: Feature Extraction

  18. Summary
  • Four classifiers were used to segregate the students.
  • A combination of multiple classifiers led to a significant accuracy improvement in all three cases (2-, 3-, and 9-Classes).
  • Weighting the features and using a genetic algorithm to minimize the error rate improves the prediction accuracy by at least 10% in all cases.

  19. Conclusion
  • When the number of features is low, feature weighting works better than feature selection.
  • The successful use of CMC to classify the students, and the successful optimization through a genetic algorithm, demonstrate the merit of using LON-CAPA data for pattern recognition to predict students' final grades from the features extracted from the homework data.

  20. Contribution
  • A new approach to evaluating student usage of web-based instruction
  • Easily adaptable to different types of courses, different population sizes, and different attributes to be analyzed
  • This method may be of considerable use in:
  • identifying students at risk early, especially in very large classes;
  • allowing the instructor to provide appropriate advising in a timely manner.

  21. Next Steps
  • Apply data mining methods to find the Frequency Dependency and Association Rules among the groups of problems (mathematical, optional response, numerical, Java applet, etc.)
  • Data mining can help instructors, problem authors, and course coordinators better design online materials:
  • find the sequences of online problems that students follow to solve homework problems
  • help instructors develop their homework more effectively and efficiently
  • help detect anomalies in homework problems designed by instructors
