
Education Modeling Collaboration: LANL Initial Model Implementation

LANL's initial model incorporates all core elements for the final model, including model entities, structure, process, functions, and methods. Ongoing interaction between modelers, analysts, and experts drives development, with statistical analysis providing information for populating and calibrating the model. The multi-agent modeling approach allows for detailed exploration of individual behavior and systemic factors, enabling the modeling of large populations.


Presentation Transcript


  1. Education Modeling Collaboration: LANL Initial Model Implementation Patrick Kelly, Benjamin Sims, Stephan Eidenbenz, Joanne Wendelberger, Steven Stringer, Los Alamos National Laboratory

  2. Initial model is complete • LANL’s initial model incorporates all the core elements needed for the final model. These include: • Model entities and their attributes (students, teachers, classrooms, etc.) • A structure to guide entity interactions (school system, schools, classes) • A process for assigning students and teachers to school systems, schools, classrooms, and classes • Functions for assigning scores/grades to students (based on student attributes, interactions with teachers and other students, and previous scores/grades in transcript) • Methods for capturing and displaying model outputs

  3. Model builds on input from the entire team • Ongoing interaction between modelers, statistical analysts, and education experts drives development • Previous discussions elicited a list of key parameters that influence student success, and related data sources • Model core uses a few high-level parameters from these discussions • The values of these high-level parameters will be tied to other parameters and data sources as model development continues • More complex interaction structures and class assignment processes will be implemented based on actual school districts, curriculums, and policies • Statistical analysis of SJUSD data will provide information for populating and calibrating the model

  4. Benefits of multi-agent modeling approach • Multi-agent models represent and track individuals through a social system over time • In this case, students and teachers in a school system • Useful characteristics of multi-agent models • Bottom-up approach enables more detailed exploration of connections between individual behavior and systemic factors • Greater ability to manipulate and understand factors that shape individual student and teacher performance over time • Possible to model large populations so results can be statistically compared to real student and teacher data

  5. Model contains this core set of objects • Student • Teacher • Classroom • Physical location where students and teachers interact in individual classes • School • Mechanism for grouping teachers, students, classrooms • School System • Master “control mechanism” manages simulation and generates reports • Student Transcript • Record of achievement through the years (e.g. report card). • Individual Class • An instance of a specific course, in a specific location, with a specific teacher, and specific list of students
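The core objects above can be sketched as simple data classes. This is an illustrative sketch, not LANL's actual implementation; the field names and the transcript representation (a dict keyed by year and course) are assumptions for clarity.

```python
from dataclasses import dataclass, field

@dataclass
class Student:
    id: int
    ability: float                      # baseline ability to learn material
    grade_level: int                    # 0 = Kindergarten, 1..12 thereafter
    transcript: dict = field(default_factory=dict)  # (year, course_id) -> score

@dataclass
class Teacher:
    id: int
    ability: float                      # baseline ability to teach material

@dataclass
class IndividualClass:
    course_id: int                      # integer code, e.g. 0 = "Math"
    classroom: int                      # physical room identifier
    teacher: Teacher
    students: list = field(default_factory=list)
```

A School would group classrooms, teachers, and students, and the School System would own the schools and drive the simulation loop.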

  6. Initial model uses a basic set of object attributes (Possible future attributes in gray) • Student • Ability, grade level (motivation, socioeconomic status, age, gender, ethnicity, ESL status, …) • Teacher • Ability (motivation, gender, ethnicity, areas of expertise, …) • Classroom • (Quality) • School • Curriculum

  7. Current model runs assume a simplified school system • Students and teachers remain in the school to which they were originally assigned • All schools teach K-12 • A class lasts a full school year • All grade levels have the same set of classes (so they are more like general subject areas) • Student “score” (could represent test score or grade in the subject) is calculated once a year for each class

  8. Model cycles through this loop each year: Assign Classes → Assign Scores → Advance to Next Year (quarter, semester, etc.)
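The yearly cycle above can be sketched as a simple driver loop. The function and callback names are illustrative assumptions, not the model's actual API; the class-assignment and scoring steps are passed in so the sketch stays independent of their details.

```python
def run_years(students, n_years, assign_classes, assign_scores):
    """One pass per school year: assign classes, assign scores, advance.

    assign_classes(students, year) -> list of class rosters (lists of students)
    assign_scores(roster, year)    -> records a score for each student in roster
    """
    for year in range(n_years):
        rosters = assign_classes(students, year)
        for roster in rosters:
            assign_scores(roster, year)
        for s in students:
            s["grade"] += 1             # advance everyone to the next grade level
```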

  9. Setting up a model run • From input files: School System, Schools (Curriculum), Classrooms, Teachers (Ability), Students (Ability, Grade Level) • Initializations: Classrooms, Teachers, and Students are all assigned to the various schools by an internal algorithm.

  10. Example: school system setup [diagram: three schools "A", "B", "C", each assigned classrooms (C1–C8), teachers (T1–T6, …), and students (S1–S300)]

  11. Assigning classes • School has classrooms, teachers, students, and a curriculum • Curriculum is a list of each course to be taught: • Grade Level (e.g. K-12) • Course Identifier (integer code for “Math”, “English”, “Social Studies” … ) • Course Name (actual string for interpretation “Math”, “English” … ) • Number Offered (e.g. “Three different classrooms offering Second-Grade Math”) • Students are assigned to classrooms with teachers and students that vary from year to year • Realistic to have some overlap and some differences in classmates and teachers from year to year • Placing students with the same teachers and classmates each year leads to a completely static model

  12. Initial class assignment method • Need a simple method that generates a realistic mix of stability and change for the initial model • Set up one Individual Class for each entry in the defined curriculum • Number of Individual Classes for each grade/course in the schedule determined by the Number Offered value specified at initialization • Use a simple round-robin “card dealing” method to assign one teacher and one classroom to each Individual Class • For now, every student is enrolled into each course at their grade level. The specific Individual Class is determined by a heuristic shuffling algorithm. This changes the overall classmate mix as students progress through the system. • For courses at grade level G, we perform a “card dealing” strategy assigning G students at a time before moving to the next Individual Class
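The shuffle-and-deal step can be sketched as follows. This is a minimal illustration of the described method, not the code used in the model; per the next slide, the chunk size is grade + 1 because Kindergarten is treated as grade zero.

```python
import random

def deal_students(students, n_sections, grade, seed=None):
    """Shuffle students, then deal grade+1 at a time to each section in turn.

    The shuffle varies the classmate mix from year to year, giving some
    overlap and some change rather than a completely static assignment.
    """
    rng = random.Random(seed)
    pool = list(students)
    rng.shuffle(pool)
    sections = [[] for _ in range(n_sections)]
    chunk = grade + 1                   # Kindergarten is "grade zero"
    for start in range(0, len(pool), chunk):
        sections[start // chunk % n_sections].extend(pool[start:start + chunk])
    return sections
```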

  13. Example: Assigning students to classes Assume we have three Individual Classes offered for Fourth-Grade Math. (Truth be told, we are actually using G+1 in the code instead of G, since Kindergarten is treated as “grade zero”.) [table: Individual Classes “A”, “B”, “C” in Classrooms 205, 206, 207 with Teachers Mrs. Adams, Mr. Bradley, and Mrs. Chavez, showing students S1–S36 dealt among the three sections]

  14. Assigning scores • Score assignment is the key driver of student outcomes • Initial model uses the following factors to assign scores • The student’s baseline “ability” to learn the material • The teacher’s baseline “ability” to teach the material • Average “ability” of students in the classroom (captures interactive effects) • How the student performed last year in this subject area • These factors can be tied to a wide range of variables in the data set • In the initial model, values are assigned by the modeler • We’ve experimented with two potential score-assignment functions • Main approach: Averaging of factors • Alternative approach: Incremental influence

  15. Main approach to score assignment – averaging • All factors are weighted equally in initial model • Different weighting schemes can be tested or implemented based on data analysis and input from education experts
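The averaging approach can be sketched as a weighted mean of the four factors listed on slide 14. The slide's formula itself was not captured in the transcript, so this sketch assumes a plain weighted average with the equal weights the slide describes; the function name and signature are illustrative.

```python
def score_average(student_ability, teacher_ability, class_mean_ability,
                  prev_score, weights=(0.25, 0.25, 0.25, 0.25)):
    """Weighted average of the four factors; weights are equal in the
    initial model but can be retuned from data analysis or expert input."""
    factors = (student_ability, teacher_ability, class_mean_ability, prev_score)
    return sum(w * f for w, f in zip(weights, factors))
```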

  16. Alternative score assignment approach – incremental influence • Produces similar output to the averaging approach – useful comparison • Define a scaled-sigmoid modifier function • Properties: • Function output is always between zero and two. • When x is zero, function output is one. • When x is less than zero, function output is less than one. • When x is greater than zero, function output is greater than one. • This is a standard way of scaling model functions; it does not itself represent behavior

  17. Incremental influence (cont.) The base score represents the score a student would receive in the absence of any knowledge of scores in previous years. We then modify the base score to account for the influence of the previous year’s score.

  18. Testing model response: Varying average student ability • Two Student Data Sets: • Set “A” has a narrower range of Ability values (higher average Ability) • Randomly selected between 0.65 and 1.00 (uniform). • Set “B” has a wider range of Ability values (lower average Ability) • Randomly selected between 0.55 and 1.00 (uniform). • Teacher Ability values distributed evenly between 0.78 and 1.00 • Three schools in the system. • Each has five (5) classrooms assigned during initialization. • Each has three (3) teachers assigned during initialization. • Each has sixty (60) students assigned into their K-level classes. • Follow students as they progress in one subject area (e.g. “math”) through the end of 12th grade
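The two input populations above can be generated with a few lines of uniform sampling. This is a sketch of the experimental setup, assuming the 60 students per school are drawn independently; the seeds and function name are illustrative.

```python
import random

def sample_abilities(n, lo, hi, seed=None):
    """Draw n student Ability values uniformly from [lo, hi]."""
    rng = random.Random(seed)
    return [rng.uniform(lo, hi) for _ in range(n)]

# Set "A": narrower range, higher average; Set "B": wider range, lower average.
set_a = sample_abilities(180, 0.65, 1.00, seed=1)   # 60 students x 3 schools
set_b = sample_abilities(180, 0.55, 1.00, seed=2)
```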

  19. Progression of students in group A (higher average)

  20. Progressions of students in group B (lower average, wider distribution)

  21. Isolated student score tracks from group B (for clarity)

  22. These student progressions show that: • As expected for the initial model, there is some movement over time, but no drastic changes in ranking of individuals • The range of student scores at grade 12 is similar to the range at grade K • Keep in mind that these are scores relative to grade level (if a student scores the same every year, they are progressing as expected) • So the model shows that students are, on average, progressing as expected from year to year

  23.–35. Group B score histograms over time [charts: one histogram of student scores per year, from Kindergarten through grade 12, with score bins from 0.65 to 0.95 in 0.05-wide intervals]

  36. These histograms show that: [findings presented graphically; not captured in this transcript]

  37. Linking the model to observed data • Identify measurable quantities in our data sets that we can use to quantify model parameters • Tune internal mechanisms to generate data that is statistically similar to the observed data • This will involve: • Selection of a refined score assignment function. • Tuning internal parameters for that function. • Incorporating a more realistic mechanism for class assignments that more accurately reflects what happens in the real world.

  38. Action plan • Continue model development to build in additional features. • Incorporate information from San Jose School District database into the modeling process. • Generate data from model that can be used for visualization of behavior over time. • Work with visualization experts to develop materials to communicate modeling results. • Work with education experts to obtain feedback and insights for future development of the model.
