Reinventing CS Curriculum and Other Projects at The University of Nebraska

Leen-Kiat Soh

Computer Science and Engineering

NCWIT Academic Alliance

November Meeting 2007

Introduction
  • Reinventing CS Curriculum Project
    • Placement Exam
    • Learning Objects
    • Closed Labs CS1, CS2
    • Educational Research
  • Computer-Aided Education
    • I-MINDS (Computer-Supported Collaborative Learning)
    • ILMDA (Intelligent Tutoring System)
    • Affinity Learning Authoring System
Placement Exam
  • The primary purpose of the placement test
    • Place students into one of CS0, CS1, or CS2
  • Our approach emphasizes both pedagogical contexts and validation of the test
    • Placement exams we researched
      • Not used as a pre- and post-test
      • Do not explicitly consider pedagogical contexts such as Bloom’s taxonomy
      • Results not used to improve course instruction
      • No formative or summative analyses available

Placement Exam
  • 10 major content areas
    • based on ACM/IEEE Computing Curricula 2001
      • Functions, sets, basic logic, data structures, problem solving, representation of data, etc.
    • addressed in the CS0 and CS1 courses
    • students’ knowledge is tested at multiple levels of competency based on Bloom’s Taxonomy
    • First five (25 questions) address prerequisite skills; second five (25 questions) represent the topics students are expected to know after completion of CS1

Placement Exam

Bloom’s Taxonomy

1. Knowledge/Memory

2. Comprehension

3. Application

4. Analysis

5. Synthesis

6. Evaluation

Placement Exam | Statistics
  • Degree of difficulty (mean)
    • The percentage of test takers who answer the question correctly
      • Too easy or too difficult – not a meaningful discriminator
    • Targeted mean for each question is between 0.40 and 0.85
  • Item-total correlation
    • Shows the strength of the relationship between the students’ response to a question and their total score
      • A good question should have a strong positive correlation between the two
    • 0.3 is generally regarded as a good target, 0.2 is acceptable
  • Frequency of response for the choices
    • Unpicked choices are not providing any discrimination and should either be modified or dropped
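The item statistics above are standard classical-test-theory measures. As an illustrative sketch (the response matrix, names, and values below are invented, not data from the Nebraska exam), difficulty and item-total correlation can be computed as:

```python
from statistics import mean, pstdev

responses = [  # 5 students x 4 items, 1 = correct, 0 = incorrect (invented data)
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]

def difficulty(item):
    """Proportion of test takers answering the item correctly (the 'mean')."""
    return mean(row[item] for row in responses)

def item_total_correlation(item):
    """Pearson correlation between an item's scores and the total test scores."""
    col = [row[item] for row in responses]
    totals = [sum(row) for row in responses]
    mx, my = mean(col), mean(totals)
    cov = mean((x - mx) * (y - my) for x, y in zip(col, totals))
    sx, sy = pstdev(col), pstdev(totals)
    return cov / (sx * sy) if sx and sy else 0.0  # zero-variance item: no discrimination

for i in range(4):
    p = difficulty(i)
    flag = "" if 0.40 <= p <= 0.85 else "  <- outside 0.40-0.85 target"
    print(f"item {i}: difficulty={p:.2f}, r_it={item_total_correlation(i):+.2f}{flag}")
```

Item 3, answered correctly by everyone, illustrates both flags at once: its difficulty falls outside the 0.40–0.85 target and its item-total correlation is zero, so it discriminates nothing.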

Placement Exam | Reliability & Validity
  • Internal Consistency Reliability
    • A measure of item-to-item consistency of a student’s response within a single test
    • Cronbach’s alpha statistic [0 – 1]
      • Results show 0.70 to 0.74, which is acceptable for research purposes
      • Goal is to obtain 0.80 or higher
  • Content Validity
    • Determined by expert opinion of CSE faculty
  • Predictive Validity
    • Determined by correlating a student’s total score on the placement test with his/her exam scores in the course
      • E.g., 0.58 for Spring 2004
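Cronbach’s alpha follows directly from the item variances and the variance of the total scores. A minimal sketch with invented demo data (not the actual exam responses):

```python
from statistics import pvariance

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(rows[0])
    item_vars = [pvariance([r[i] for r in rows]) for i in range(k)]
    total_var = pvariance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Invented 5-student x 4-item response matrix for illustration only
demo = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]
print(f"alpha = {cronbach_alpha(demo):.2f}")
```

On real data, values in the 0.70–0.74 range reported above indicate acceptable research-grade internal consistency, with 0.80 the usual goal.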

Placement Exam | Implementation
  • Duration: 1 hour
  • 50 questions
  • 10 content areas
    • 5 questions in each area
    • Each question is classified into one of Bloom’s competency levels
  • Students are not informed of the competence levels
  • The presentation order is by the competence level within each content area
    • “knowledge” first, then “comprehension”, and so on.
  • Placement recommendation cutoffs
    • Greater than or equal to 10/25 → CS1
    • Greater than or equal to 35/50 → CS2
    • Otherwise → CS0
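One possible reading of the cutoffs above in code. Two details are assumptions, since the slide does not state them explicitly: that the 10/25 threshold applies to the 25 prerequisite questions, and that the CS2 rule takes precedence when both thresholds are met.

```python
def recommend_placement(prereq_correct, total_correct):
    """Map placement-test scores to a course recommendation.

    prereq_correct: correct answers on the 25 prerequisite questions
    total_correct:  correct answers on all 50 questions
    """
    if total_correct >= 35:   # assumed to take precedence over the CS1 rule
        return "CS2"
    if prereq_correct >= 10:
        return "CS1"
    return "CS0"

print(recommend_placement(20, 40))  # strong overall score -> CS2
```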

Placement Exam | Some Results
  • Pre-Post comparisons
    • t(63) = 11.036, p < .001; highly significant
      • Instructional effectiveness of the CS1 validated
    • Significant predictor of total test scores in CS1
      • Test’s predictive validity

* Spring 2004 session

Placement Exam | Some Results
  • Students who scored 48% or better vs. students who scored less
    • A one-way ANOVA found a significant difference between these two groups on total course points
      • F(1,64) = 4.76, p < .05
    • Students who scored higher on the placement test received a higher grade in the course
  • Pre-Post Test
    • Overall Test: t(68) = 11.81, p < 0.001
    • Individual Bloom’s category: All show highly significant results (p < 0.001)
      • Greatest improvement on “knowledge” questions: t(68) = 8.27, p < 0.001
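The pre-post comparisons above are paired t-tests. A self-contained sketch of the statistic, using invented pre/post scores rather than the study’s data:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic: t = mean(d) / (sd(d) / sqrt(n)), d = post - pre.

    Returns (t, degrees of freedom = n - 1)."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    return mean(d) / (stdev(d) / sqrt(n)), n - 1

# Invented pre- and post-test scores for four students
pre_scores = [1, 2, 3, 4]
post_scores = [2, 4, 4, 6]
t, df = paired_t(pre_scores, post_scores)
print(f"t({df}) = {t:.2f}")
```

A large positive t with small p, as in the t(68) = 11.81 result above, indicates post-test scores are systematically higher than pre-test scores.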

Learning Objects
  • Development of web-based learning objects on “Simple Class” and “Recursion”
    • Small, stand-alone “chunks” of instruction
    • SCORM-compliant (Shareable Content Object Reference Model)
    • Operating within Blackboard Course Management System
    • With extensive tracking for data collection

Learning Objects
  • Tutorial component

Learning Objects
  • Tutorial component

Learning Objects
  • Real-world examples component

Learning Objects
  • Practice exercises component

Learning Objects
  • Assessment component

Learning Objects
  • Self-paced, with learner control of additional practice
  • Extensive, elaborative feedback for remediation and instruction
  • Tracking System
    • Student outcomes and time-spent data captured in real time
    • Provides data on students’ problems and progress

Learning Objects | Some Results
  • No significant difference between lab and learning object instruction
  • Evaluation results showed positive student response to the learning objects
  • Modular, web-based learning objects can be used successfully for independent learning and are a viable option for distance learning

Closed Labs
  • Closed labs have multiple advantages
    • Active learning through goal-oriented problem solving
    • Promote students’ cognitive activities in comprehension and application
    • Some evidence that students’ test performance improves
    • Facilitate cooperative learning

Closed Labs | Design
  • Lectures
  • 2-hour laboratory (16 weeks)
  • 20 – 30 students per lab
  • Provide students with structured, hands-on activities
  • Intended to reinforce and supplement the material covered in the course lectures
  • Majority of the time allocated to student activities

Closed Labs | Design
  • A set of core topics is based on
    • Lecture topics
    • Modern software engineering practices
    • Computing Curricula 2001
  • We developed 5 components for each laboratory
    • Pre-Tests
    • Laboratory Handouts
    • Activity Worksheets
    • Instructor Script
    • Post-Tests

Closed Labs | Design
  • Pre-Tests
    • Students are required to pass an on-line test prior to coming to lab
    • May take it multiple times
    • Passing score: 80%
    • Intended to encourage students to prepare for the lab and test their understanding of the lab objectives
    • Questions are categorized according to Bloom’s Taxonomy

Closed Labs | Design
  • Laboratory Handouts
    • Lab objectives
    • Activities students will perform in the lab (including source code where appropriate)
    • References to supplemental materials that should be studied prior to the lab
    • Additional materials that can be reviewed after the student has completed the lab

Closed Labs | Design
  • Activity Worksheets
    • Students are expected to answer a series of questions related to the specific lab activities
    • Record their answers on a worksheet (paper)
    • Questions provide the students with an opportunity to regulate their learning
    • Used to assess the student’s comprehension of the topics practiced in the lab

Closed Labs | Design
  • Instructor Script
    • The lab instructor is provided with an instructional script
    • Includes supplemental material that may not be covered during lecture, special instructions for the lab activities, hints, and resource links
    • Space for comments and suggestions

Closed Labs | Design
  • Post-Tests
    • During the last ten minutes of each lab, students take an on-line test
    • One-time-only
    • Another measure of their comprehension of lab topics
    • Questions are categorized according to Bloom’s Taxonomy

Closed Labs | Some Results
  • Study 1: To determine the most effective pedagogy for CS1 laboratory achievement
    • Participants: 68 students in CS1, Fall 2003
    • Procedures
      • Structured cooperative groups had prescribed roles (driver and reviewers)
      • Unstructured cooperative groups did not have prescribed roles
      • Direct instruction students worked individually
      • Randomly assigned the pedagogy of each lab section
      • Used stratified random assignment to assign students to their cooperative groups within each section
        • Based on ranking of the placement test scores for this course (high, middle, low)
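The stratified random assignment step can be sketched as follows. The student names, the equal three-way split, and the deal-one-from-each-stratum grouping are illustrative assumptions, not the study’s exact procedure:

```python
import random

def stratified_groups(scores, seed=0):
    """Rank students by placement score, split into high/middle/low strata,
    shuffle within each stratum, then deal one student from each stratum
    into successive cooperative groups.

    scores: dict mapping student name -> placement-test score
    """
    rng = random.Random(seed)
    ranked = sorted(scores, key=scores.get, reverse=True)
    third = len(ranked) // 3
    strata = [ranked[:third], ranked[third:2 * third], ranked[2 * third:]]
    for stratum in strata:
        rng.shuffle(stratum)
    groups = []
    while any(strata):
        groups.append([s.pop() for s in strata if s])  # one per non-empty stratum
    return groups

# Invented roster: nine students with scores 0..8
roster = {f"s{i}": i for i in range(9)}
for g in stratified_groups(roster):
    print(g)
```

Each printed group mixes one high-, one middle-, and one low-scoring student, which is the balance stratified assignment is meant to guarantee.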

Closed Labs | Some Results
  • Study 1, Cont’d …
    • Dependent Measures
      • Total laboratory grades
        • Combined worksheet scores and post-test grades for each lab
        • Although some students work in groups, all students were required to take the post-test individually
      • Pre-Post-Test measuring self-efficacy and motivation
        • Taken during the first and last week of the semester
        • Adapted 8 questions taken from Motivated Strategies for Learning Questionnaire by Pintrich and De Groot (1990)
        • Returned a reliability measure (Cronbach’s alpha) of .90 with a mean of 3.45 and standard deviation of .09; good reliability

Closed Labs | Some Results
  • Results of Study 1
    • Both cooperative groups performed significantly better than the direct instruction group (F(2,66) = 6.325, p < .05)
      • Cooperative learning is more effective than direct instruction
      • No significant difference between the structured cooperative and unstructured cooperative groups
    • 6 out of 8 questions showed statistically significant changes in student perceived self-efficacy and motivation
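The group comparison above is a one-way ANOVA. A minimal pure-Python sketch of the F statistic, with invented group scores rather than the study’s laboratory grades:

```python
from statistics import mean

def one_way_f(groups):
    """One-way ANOVA: F = MS_between / MS_within for k groups.

    Returns (F, between-groups df = k - 1, within-groups df = n - k)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Invented scores for three lab conditions
f, df_b, df_w = one_way_f([[1, 2, 3], [2, 3, 4], [4, 5, 6]])
print(f"F({df_b},{df_w}) = {f:.2f}")
```

An F exceeding the critical value for its degrees of freedom, as in the F(2,66) = 6.325 result above, indicates at least one group mean differs from the others; a follow-up pairwise comparison then locates which.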
Closed Labs | Some Results
  • Study 2
    • Same objective; revised motivation/self-efficacy tool, additional qualitative feedback; revised laboratories
    • Participants: 66 students in CS1, Spring 2004
    • Results
      • Both cooperative groups performed better than the direct instruction group (F(2,64) = 2.408, p < .05)
    • Discussion
      • Similar conclusions
Computer-Aided Education
  • Studies on the use of Computer-Supported Collaborative Learning (CSCL) tools
    • I-MINDS
    • Structured cooperative learning (Jigsaw) vs. non-structured cooperative learning
    • CSCL vs. non-CSCL
  • Studies on the use of Intelligent Tutoring System (ITS)
    • ILMDA
    • ITS vs. Lab
    • ITS+Lab vs. Lab
  • Studies on the use of authoring tools
    • Affinity Learning Authoring System
    • How authoring tools impact learning
    • Graphical vs. non-graphical
Ongoing Work
  • Summer Institute with Center for Math, Science, and Computer Education
    • Teaching multimedia computing to student-teachers
  • NSF Advanced Learning Technologies Project
    • Intelligent Learning Object Guide (iLOG)
    • Developing SCORM-standard metadata to capture use characteristics of learning objects and student models
    • Developing software to automatically capture and generate metadata to tag learning objects
    • Creating SCORM-compliant learning objects for CS0, CS1, CS2
Ongoing Work 2
  • Renaissance Computing
    • Joint curricular programs with other departments
      • School of Biological Sciences
      • School of Music
      • College of Agricultural Sciences and Natural Resources
      • Digital Humanities
    • Multi-flavored introductory CS courses
      • Object first vs. traditional
      • Multimedia, Engineering, Life Sciences, Arts
NCWIT Academic Alliance Focus
  • Renaissance Computing
    • Multi-flavored introductory CS courses in conjunction with joint curricular programs with other departments (that have larger female populations) to promote more female participation in CS
  • Computer-Aided Education
    • Online learning objects for K-12 teachers to help them expose their students to computational thinking and real-world IT applications
    • Collaborative writing (via I-MINDS) for secondary female students on the use of CS paradigms to solve real-world problems
  • Reinventing CS Curriculum
    • Use placement exam as pre- and post-tests for future studies on learning performance of female students
    • Use cooperative learning in labs to recruit and improve retention of female students
People
  • Rich Sincovec, CSE Department Chair
  • Reinventing CS Curriculum Project
    • Leen-Kiat Soh, Ashok Samal, Chuck Riedesel, Gwen Nugent
  • Computer-Aided Education
    • Leen-Kiat Soh, Hong Jiang, Dave Fowler, Art Zygielbaum
Others
  • UNL
    • College of Education and Human Sciences
    • Center for Math, Science, and Computer Education
    • J.D. Edwards Honors Program (CS+Business)
    • Extended Education and Outreach (AP Courses)
    • Department of History, School of Biological Sciences, School of Music, etc.
  • Bellevue University (I-MINDS)
  • University of Wisconsin-Madison ADL Co-Lab (learning objects)
Publications
  • Reinventing CS Curriculum
    • Framework
      • L.-K. Soh, A. Samal, and G. Nugent (2007). An Integrated Framework for Improved Computer Science Education: Strategies, Implementations, and Results, Computer Science Education, 17(1):59-83
    • Learning Objects
      • G. Nugent, L.-K. Soh, and A. Samal (2006). Design, Development and Validation of Learning Objects, Journal of Educational Technology Systems, 34(3):271-281
Publications 2
  • Reinventing CS Curriculum, Cont’d …
    • Placement Exam
      • G. Nugent, L.-K. Soh, A. Samal, and J. Lang (2006). A Placement Test for Computer Science: Design, Implementation, and Analysis, Computer Science Education, 16(1):19-36
    • Structured Labs & Cooperative Learning
      • J. Lang, G. Nugent, A. Samal, and L.-K. Soh (2006). Implementing CS1 with Embedded Instructional Research Design in Laboratories, IEEE Transactions on Education, 49(1):157-165
      • L.-K. Soh, G. Nugent, and A. Samal (2005). A Framework for CS1 Closed Laboratories, Journal of Educational Resources in Computing, 5(4):1-13
Publications 3
  • Computer-Aided Education
    • Computer-Supported Collaborative Learning
      • L.-K. Soh, N. Khandaker, and H. Jiang (2007). I-MINDS: A Multiagent System for Intelligent Computer-Supported Cooperative Learning and Classroom Management, to appear in Int. Journal of Artificial Intelligence in Education
    • Intelligent Tutoring System
      • L.-K. Soh and T. Blank (2007). Integrating Case-Based Reasoning and Multistrategy Learning for a Self-Improving Intelligent Tutoring System, to appear in Int. Journal of Artificial Intelligence in Education
    • Affinity Learning Authoring Tool
      • L.-K. Soh, D. Fowler, and A. I. Zygielbaum (2007). The Impact of the Affinity Learning Authoring Tool on Student Learning, to appear in J. of Educational Technology Systems