
Team Difficulty, UI, and NQ




  1. Team Difficulty, UI, and NQ Mi Seon Park Caitlyn Seim David Fleischhauer Charles Ethan Hayes Weiqing Li

  2. Introduction and Motivation • Wanted to make improvements to the system that would make it more useful for students as a learning tool • Mined data from previous semester was not being fully utilized • Improvements could be made to give the students and instructors more information, more options and an overall smarter system

  3. Objectives • The previous system lacked a concise, customizable presentation of question difficulty, skips, etc. in the instructor view • The previous system only presented random questions • Instructors and students could only judge difficulty from a subjective rating • Using the mined difficulty and the student’s success at answering questions, the system could be automated to give the student a specific question, not just a random one

  4. Difficulty • Compiled data from Fall 2011 semester onward • This data was stored in the MinedData and MinedDataMC tables

  5. Difficulty • From this data, the key factors for difficulty are extracted: • Average Duration (default 40%) • Average Score (default 25%) • Number of Skips (default 25%) • Average Rating (default 10%) • The computed difficulty is then stored in the questions_difficulty table (a sketch of the computation follows).
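A minimal sketch of the weighted computation, assuming each factor has already been normalized to a common 0–10 "harder is higher" scale and that questions_difficulty has question_id and difficulty columns; the production normalization and column names may differ.

<?php
// Sketch only: combine the four mined factors with the default weights
// (duration 40%, score 25%, skips 25%, rating 10%). Assumes each factor
// is already normalized so that a larger value means a harder question.
function computeDifficulty(array $factors, array $weights = array(
    'avg_duration' => 0.40,
    'avg_score'    => 0.25,
    'num_skips'    => 0.25,
    'avg_rating'   => 0.10,
)) {
    $difficulty = 0.0;
    foreach ($weights as $key => $w) {
        $difficulty += $w * $factors[$key];
    }
    return $difficulty;
}

// Store the result in questions_difficulty (hypothetical column names).
function storeDifficulty(PDO $pdo, $questionId, $difficulty) {
    $stmt = $pdo->prepare(
        'REPLACE INTO questions_difficulty (question_id, difficulty) VALUES (?, ?)'
    );
    $stmt->execute(array($questionId, $difficulty));
}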

  6. Questions_Difficulty

  7. Difficulty Improvements • Three ways are used to adjust the difficulty ratings to get a better spread (a sketch of the capping strategies follows): • Using the tags to select the max and min values • Capping the average skips and duration at one standard deviation above and below the mean • Capping the average skips and duration at the Nth percentile
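A rough sketch of the two capping strategies on a plain array of values; the production code lives in the stat PHP files and may differ in detail.

<?php
// Sketch: cap a list of values at +/- one standard deviation from the mean.
function capAtStdDev(array $values) {
    $n    = count($values);
    $mean = array_sum($values) / $n;
    $var  = 0.0;
    foreach ($values as $v) {
        $var += ($v - $mean) * ($v - $mean);
    }
    $std = sqrt($var / $n);
    return array_map(function ($v) use ($mean, $std) {
        return max($mean - $std, min($mean + $std, $v));
    }, $values);
}

// Sketch: cap a list of values at the Nth percentile from each end (e.g. N = 5).
function capAtPercentile(array $values, $n) {
    sort($values);
    $count = count($values);
    $lo = $values[(int) floor($count * $n / 100)];
    $hi = $values[(int) floor($count * (100 - $n) / 100) - 1];
    return array_map(function ($v) use ($lo, $hi) {
        return max($lo, min($hi, $v));
    }, $values);
}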

  8. Regular Difficulty

  9. Difficulty 2

  10. Difficulty STD

  11. Difficulty Drop N (N=5%)

  12. The overall plot of the difficulty measures, ordered by the original (regular) difficulty

  13. Tools to analyze the difficulty • The stat PHP files stored in the VIP/Fall2012 directory; there is one stat file for each type of difficulty, and it fetches an overview of that difficulty category • Manually querying the SQL database, copying the data into Excel, and plotting it • The MATLAB file stored in the same directory, which plots all of the difficulty values using either the data from each chapter or all of the data points

  14. Improvements to UI • Add elements to the UI that use the mined data from previous semesters and the difficulty calculated this semester • Upgrade the student view to display difficulty information to the student • Upgrade the instructor view to display different information about each question

  15. Improvements to UI • Student View: • classes/ITS_screen2.php • classes/ITS_rating.php • css/ITS.css • Instructor View: • Course.php • classes/ITS_statistics.php • js/ITS_course_jquery.php • ajax/ITS_admin.php

  16. Improvements to UI

  17. Improvements to UI

  18. Improvements to UI

  19. Old ITS System (flow diagram): Random Question → Skip or Submit → Score

  20. Created a new ITS_NQ class to handle the Next Question Selection

  21. Sample of a function in ITS_NQ • Functions are commented and clean to make future additions easier (an illustrative sketch of such a function follows)
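The actual function appears as a screenshot on the slide; the following is only a hypothetical example in the same spirit, assuming a questions_difficulty table and a per-user events table called user_events (that table name is an assumption).

<?php
// Hypothetical example of an ITS_NQ-style helper (names are illustrative):
// return a random unanswered question whose difficulty lies in a range,
// or null when no such question exists.
function getQuestionInRange(PDO $pdo, $userId, $diffMin, $diffMax)
{
    $sql = 'SELECT qd.question_id
              FROM questions_difficulty qd
             WHERE qd.difficulty BETWEEN ? AND ?
               AND qd.question_id NOT IN (
                   SELECT question_id FROM user_events WHERE user_id = ?)
             ORDER BY RAND()
             LIMIT 1';
    $stmt = $pdo->prepare($sql);
    $stmt->execute(array($diffMin, $diffMax, $userId));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row ? $row['question_id'] : null;
}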

  22. Screen shot of the new proposed ITS system

  23. Database Entries • Each button has a unique database entry • Useful for future debugging/data mining • ITS_NQ operates based on the button’s database entry for the most recent question

  24. Each database event correlates to a case in ITS_NQ
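A sketch of that mapping; the event-type strings and the exact score threshold are illustrative, not the values stored in the real database.

<?php
// Sketch of how each logged button event could map to a case in ITS_NQ.
function handleLastEvent($eventType, $lastScore, $lastDifficulty)
{
    switch ($eventType) {
        case 'submit':            // regular submit: direction depends on the score
            return $lastScore > 50
                ? array('direction' => 'harder', 'center' => $lastDifficulty)
                : array('direction' => 'easier', 'center' => $lastDifficulty);
        case 'submit_harder':
        case 'skip_harder':       // user explicitly asked for a harder question
            return array('direction' => 'harder', 'center' => $lastDifficulty);
        case 'submit_easier':
        case 'skip_easier':       // user explicitly asked for an easier question
            return array('direction' => 'easier', 'center' => $lastDifficulty);
        case 'skip_forward':
        case 'skip_back':         // navigation through already-seen questions
            return array('direction' => $eventType);
        default:                  // no event yet: start from the middle of the scale
            return array('direction' => 'smart', 'center' => 5);
    }
}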

  25. Submit Button functionality • Regular Submit • User receives a harder or easier question based on whether their answer was correct (>50%) or incorrect • User-selected submits/skips • User receives a harder or easier question based on which type of question they requested

  26. Skip Button functionality • Forward/Back • The system keeps track of which questions you have seen and lets you go back and forth between previous questions • A question can only be inserted into the list once, so the list must be restarted if you want to “smart skip” through the questions again • Easier/Harder • User receives a harder or easier question based on which type of question they requested (a sketch of the forward/back bookkeeping follows)
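A sketch of the forward/back bookkeeping, assuming the list of seen questions is kept in the PHP session; the real code may store it elsewhere.

<?php
session_start();
if (!isset($_SESSION['seen_questions'])) {
    $_SESSION['seen_questions'] = array(); // ordered list of question ids
    $_SESSION['position']       = -1;      // index of the question on screen
}

// Record a newly served question; each question may appear in the list once.
function recordQuestion($questionId)
{
    if (!in_array($questionId, $_SESSION['seen_questions'])) {
        $_SESSION['seen_questions'][] = $questionId;
    }
    $_SESSION['position'] = array_search($questionId, $_SESSION['seen_questions']);
}

// Move through the list; returns the question id, or null at either end.
function skipNavigate($direction)
{
    $pos = $_SESSION['position'] + ($direction === 'FORWARD' ? 1 : -1);
    if ($pos < 0 || $pos >= count($_SESSION['seen_questions'])) {
        return null; // fall through to the regular next-question logic
    }
    $_SESSION['position'] = $pos;
    return $_SESSION['seen_questions'][$pos];
}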

  27. New ITS System, general view (flow diagram): from a question the user can Skip Easier or Skip Harder to get another question, or use a submit button (Submit Easier, Submit Harder, or Smart Submit) to get a score; Submit Easier leads to an easier question, Submit Harder to a harder question, and Smart Submit leads to an easier question when Score<=50 and a harder question when Score>50

  28. Skip button functionalities (flowchart): find the last question and its direction; FORWARD goes forward one question in the seen list and BACK goes back one question; if the list is empty, fall back on difficulty: EASIER gets the array of all questions that are easier, HARDER gets the array of all questions that are harder, or the array comes from “Smart Skip”; a random question is then picked from the array

  29. Bucket/Increment System (flowchart): on a submit or skip, set Center = difflast and Span = 0.25; query with DiffMin = Center - SpanL and DiffMax = Center + SpanH; if the result is not empty, the user receives a question; if the result is empty and Min > 0 or Max < 10, double the span (Span = Span + Span) and query again; if the result is empty and Min = 0 and Max = 10, the assignment is finished (a sketch of this loop follows)
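A sketch of that loop; the unanswered-question query is abstracted behind a callable so the control flow stands on its own, and separate SpanL/SpanH parameters allow the asymmetric buckets used by the easier/harder buttons.

<?php
// Sketch of the bucket/increment search on a 0-10 difficulty scale.
// $getQuestion($diffMin, $diffMax) should return a question id or null;
// it stands in for the unanswered-question query.
function findQuestionByBucket(callable $getQuestion, $center, $spanLow, $spanHigh)
{
    while (true) {
        $diffMin = max(0, $center - $spanLow);
        $diffMax = min(10, $center + $spanHigh);

        $questionId = $getQuestion($diffMin, $diffMax);
        if ($questionId !== null) {
            return $questionId;          // result not empty: user receives a question
        }
        if ($diffMin <= 0 && $diffMax >= 10) {
            return null;                 // whole scale searched: assignment finished
        }
        // Result empty but Min > 0 or Max < 10: double the span and retry
        // (guard against a zero span, which would never grow).
        $spanLow  = $spanLow  > 0 ? $spanLow * 2  : 0.25;
        $spanHigh = $spanHigh > 0 ? $spanHigh * 2 : 0.25;
    }
}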

  30. “Smart” submit and skip functionality (flowchart): get the last event’s info. If the question was answered with a submit, Center = round(difflast) + 1 when the score is above 50 and round(difflast) - 1 when it is at or below 50 (Span = 0.5 in both cases); on a Smart Skip, Center = round(difflast) and Span = 0.5; if there is no event yet, Center = 5 and Span = 0.5. Build the array of questions with Diff = Center +/- Span; if the array is empty, grow the span (Span = Span + 0.5) until Center +/- Span reaches (0, 10), restarting the forward/back list so that all of the unanswered questions become available again; otherwise pick a random question from the array and output it as the next question (a sketch of the bucket choice follows)
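A sketch of how the starting bucket could be chosen, following the centers and spans above; the event-type string is illustrative.

<?php
// Sketch: pick the centre and span of the first bucket for the next
// question, based on the user's last event.
function smartBucket($lastEventType, $lastScore, $lastDifficulty)
{
    if ($lastEventType === null) {           // no event yet: start in the middle
        return array('center' => 5, 'span' => 0.5);
    }
    if ($lastEventType === 'smart_skip') {   // skipped: stay at the same level
        return array('center' => round($lastDifficulty), 'span' => 0.5);
    }
    // Answered with a submit: move one level up or down depending on the score.
    $center = ($lastScore > 50)
        ? round($lastDifficulty) + 1
        : round($lastDifficulty) - 1;
    return array('center' => $center, 'span' => 0.5);
}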

  31. Examples • User submits (harder) question with diff=6.2 • Next question comes from difficulty range 6.1-6.45, but this bucket is empty • Bucket expands, question should come from range 6.0-6.70. Bucket has at least one question, user receives a question • User smart submits a matching question (2 of 5 correct) with diff=4.8 • Next question comes from range 4.55-4.90

  32. Automating Question Selection • Using the mined difficulty and the student’s success at answering questions, the system is automated to give the student a custom-selected question, not just a random one • Created 2 schemes: • Scheme of One • History Scheme

  33. Groundwork: New Variables in Screen2.php

  34. Scheme of One: Getting the next question • For any NULL scores (skips), default to 0 • Changed from the default query to 2 query statements chosen by conditionals (ifs): • if they scored >=50% on the last answered question (this accounts for Matching types etc.) • if they scored <50% on the last question (a sketch of the two queries follows)
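A sketch of the two-query selection, assuming a user_events table records which questions the user has already seen; the table and column names are assumptions.

<?php
// Sketch of the Scheme of One selection: one of two queries is used,
// depending on whether the last answered question scored at least 50%.
function schemeOfOneNextQuestion(PDO $pdo, $userId, $lastScore, $lastDifficulty)
{
    $lastScore = ($lastScore === null) ? 0 : $lastScore;   // treat skips (NULL) as 0

    $base = 'SELECT qd.question_id FROM questions_difficulty qd
              WHERE qd.question_id NOT IN (
                    SELECT question_id FROM user_events WHERE user_id = ?)';

    if ($lastScore >= 50) {
        // Correct: only questions of greater or equal difficulty are eligible.
        $sql = $base . ' AND qd.difficulty >= ? ORDER BY RAND() LIMIT 1';
    } else {
        // Incorrect: only easier questions are eligible.
        $sql = $base . ' AND qd.difficulty < ? ORDER BY RAND() LIMIT 1';
    }

    $stmt = $pdo->prepare($sql);
    $stmt->execute(array($userId, $lastDifficulty));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    // When the range is exhausted, the fallback from the next slide
    // (a query with no difficulty restriction) would run here.
    return $row ? $row['question_id'] : null;
}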

  35. Scheme of One: Running out of Acceptable Questions • When the user has answered all questions in the desired difficulty range • The query is made again without difficulty restrictions • Could have future applications with a query in a broader difficulty range (but the History Scheme is more complex)

  36. Scheme of One Expanding • Expanded the queried history for the user • To include history of past three answers • This requires that at least 3 entries be in the database • Future work may work around this DB requirement • And grow to use a scheme of many past answers

  37. Scheme of One and History Scheme: Logical Flow

  38. History Scheme: Cases • NULL, NULL, NULL • When there are three skips (of any kind) in a row • This prevents skipping until a suitably easy question appears • This is also fine if the user is skipping harder; in that case it will give harder questions • Average of 3 is >=50 • Doing well • Harder questions will challenge the student • Average of 3 is <50 • Consider the average of the two most recent questions

  39. History Scheme: Cases • Average of 2 is >=50 • Improving • Questions are harder, but not too hard (-1/+2 range) • Could be implemented as: if (Average of most recent 2 > Average of 3 ~or~ Average of earliest 2) • Average of 2 is <50 • Not improving • Not only skipping, also trying • Easier questions will help the student (a sketch of this case logic follows)
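A sketch of the case logic over the last three scores (most recent first, NULL meaning a skip); apart from the -1/+2 range named above, the returned ranges are illustrative.

<?php
// Sketch of the History Scheme decision over the user's last three answers.
// $scores holds the last three scores, most recent first; null means a skip
// and counts as 0, as on the Scheme of One slide.
function historyScheme(array $scores, $lastDifficulty)
{
    list($s1, $s2, $s3) = $scores;

    // Three skips in a row: keep serving questions near the current level
    // until a suitable question appears (works for skipping harder too).
    if ($s1 === null && $s2 === null && $s3 === null) {
        return array('min' => $lastDifficulty - 1, 'max' => $lastDifficulty + 1);
    }

    $avg3 = (intval($s1) + intval($s2) + intval($s3)) / 3;
    if ($avg3 >= 50) {
        // Doing well over the whole window: harder questions.
        return array('min' => $lastDifficulty, 'max' => 10);
    }

    // Average of 3 below 50: look at the two most recent answers only.
    $avg2 = (intval($s1) + intval($s2)) / 2;
    if ($avg2 >= 50) {
        // Improving: harder, but not too hard (-1/+2 around the last difficulty).
        return array('min' => $lastDifficulty - 1, 'max' => $lastDifficulty + 2);
    }

    // Not improving (but trying, not only skipping): easier questions.
    return array('min' => 0, 'max' => $lastDifficulty);
}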

  40. Final Results • General UI improvements for Instructor view • A variety of difficulty scores for existing questions and the ability to dynamically update difficulties • Next Question selection was automated using scripts based upon difficulty values and student answers • Next Question selection can be automated or user-selected • The NQ algorithms and NQ/skip buttons enable a more intelligent tutoring system

  41. Future Work • More complex selection algorithms can be developed and tested based on a longer history of student usage • Student rating/grading can be implemented based on patterns of use • A practical plan can be put in place to limit repetitive skipping/misuse of the NQ buttons and selection • More advancements can be made to the UI to display information better for the students and professor • Weighting the score of the questions based on their difficulty • More complex algorithm/testing more percentage choices to get the best difficulty spread.

  42. Bonus Material

  43. Groundwork: New Variables in Screen2.php • Tested variable assignment queries • All returned results of one value • Values were correct (the desired table location was read) • NOTE: calls currently use the difficulty field in questions_difficulty (a sketch of one such query follows)
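A sketch of one such single-value assignment query, reading the difficulty field of questions_difficulty; the question_id column name is an assumption.

<?php
// Sketch of a single-value variable assignment in Screen2.php: read the
// difficulty of the current question from questions_difficulty.
function getQuestionDifficulty(PDO $pdo, $questionId)
{
    $stmt = $pdo->prepare(
        'SELECT difficulty FROM questions_difficulty WHERE question_id = ? LIMIT 1'
    );
    $stmt->execute(array($questionId));
    return $stmt->fetchColumn();   // exactly one value, as verified in testing
}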

  44. Groundwork: Variable Assignments in _screen2 • Revised assignment statements to be the following:

  45. Groundwork: $last_event fix for multiple entries • Score variable was revised to return the most recent incidence of that question in the user’s table (rather than NULL for any times skipped before answering)
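A sketch of that revision, assuming the per-user history lives in a user_events table with an event timestamp; the table and column names are assumptions.

<?php
// Sketch of the $last_event fix: fetch the most recent score the user recorded
// for a question, so earlier skipped attempts (NULL scores) are not returned.
function getLastScore(PDO $pdo, $userId, $questionId)
{
    $stmt = $pdo->prepare(
        'SELECT score FROM user_events
          WHERE user_id = ? AND question_id = ?
          ORDER BY event_time DESC
          LIMIT 1'
    );
    $stmt->execute(array($userId, $questionId));
    $score = $stmt->fetchColumn();
    return ($score === false) ? null : $score;
}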

  46. Scheme of One: Question Selection • When correct (>=50%) • Eligible next questions are only of greater or equal difficulty • When the last question answered was incorrect • Eligible next questions are only of lesser difficulty

  47. Scheme of One Other Steps • Compared and cleaned queries • Set to display query for demos and testing • Discovered occurrence of running out of <harder/easier> questions

  48. Scheme of One Expanding • Expanded the queried history for the user • To include history of past three answers • This requires that at least 3 entries be in the database • Future work may work around this DB requirement • And grow to use a scheme of many past answers
