
The CSU - BAT

The CSU - BAT: A middle ground between the ETS and “going it alone”? Regina Eisenbach, Associate Dean, CSU San Marcos; Kathy Krentler, Director of Assessment, San Diego State University; Mary Wolfinbarger, Associate Dean, CSU Long Beach. The Business Assessment Test: Development.

Presentation Transcript


  1. The CSU - BAT A middle ground between the ETS and “going it alone”?

  2. Regina Eisenbach • Associate Dean • CSU, San Marcos • Kathy Krentler • Director of Assessment • San Diego State University • Mary Wolfinbarger • Associate Dean • CSU, Long Beach

  3. The Business Assessment Test Development • Initiated in November 2002 • Contributions from 12 campuses (50 people) • Original validity and reliability testing by CSU Long Beach • Pilot testing during Summer & Fall 2003 • Formal testing initiated Spring 2004

  4. Merits • Standardized Test • Easy to implement • Provides an opportunity for a program to leverage student performance in its marketing and promotion • Provides comparative results that are useful for benchmarking program performance against other programs • Home-Grown Test • Provides greater flexibility for analysis of results • More conducive to using results for curriculum development • Provides results that assess knowledge of core concepts corresponding to the program’s curriculum

  5. Demerits • Standardized Test • Cost per student is high, leading to student discontent if they are asked to pay for the test • Difficult to use the resulting data to evaluate the curriculum • Norm-based tests with percentile results are often misunderstood and misinterpreted • Test questions may not reflect the curriculum of a given program, making continuous improvement difficult • Drill-down data is expensive or unavailable • Home-Grown Test • Limited external validity • Requires cooperation from the entire faculty during development to establish internal validity • Requires careful handling of data to maintain the integrity of results and allow for precise analysis • Test preparation requires considerable faculty time

  6. Merits vs. Demerits (side-by-side summary slide)

  7. Details of the BAT • 80 multiple-choice questions • 8 discipline-specific sub-tests • 75-minute exam • Scores reported overall and by sub-test • Scores broken down by major, gender, ethnicity, language spoken at home, part-time vs. full-time status, hours worked, and transfer student status

  8. Reliability Issues • SDSU means, year-to-year • PARSCORE provides reliability measures using a point-biserial correlation between each individual question and overall test performance. Based on this analysis, 22 of the BAT’s 80 items have been identified as having questionable reliability. Scores are now reported both Unadjusted (all 80 items) and Adjusted (the 22 items removed).
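The item-reliability check described above can be sketched in a few lines: for each question, correlate the 0/1 correctness vector with examinees' total scores, and flag items whose point-biserial correlation falls below a cutoff. This is a minimal illustration of the statistic, not the PARSCORE implementation; the function names and the 0.15 cutoff are assumptions for the sketch.

```python
import numpy as np

def point_biserial(item_correct, total_scores):
    """Point-biserial correlation between a dichotomous item
    (0 = wrong, 1 = right) and examinees' total test scores."""
    item = np.asarray(item_correct, dtype=float)
    total = np.asarray(total_scores, dtype=float)
    p = item.mean()                  # proportion answering correctly
    q = 1.0 - p
    m1 = total[item == 1].mean()     # mean total score, correct group
    m0 = total[item == 0].mean()     # mean total score, incorrect group
    s = total.std()                  # population SD of total scores
    return (m1 - m0) / s * np.sqrt(p * q)

def flag_unreliable(item_matrix, cutoff=0.15):
    """Return indices of items whose point-biserial correlation with
    the total score falls below `cutoff` (cutoff is illustrative)."""
    items = np.asarray(item_matrix)  # shape: (n_examinees, n_items)
    totals = items.sum(axis=1)
    return [j for j in range(items.shape[1])
            if point_biserial(items[:, j], totals) < cutoff]
```

An item that strong students miss while weak students answer it correctly produces a negative correlation and gets flagged; removing the flagged items and re-summing would yield an "adjusted" score in the sense used on this slide.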

  9. Validity Issues • School Comparison • The most selective school in the CSU system should be expected to perform best overall on the BAT • Major Comparison • Majors should be expected to perform significantly better on the sub-test of their discipline • Curriculum Comparison • Students taking “more” of a topic should be expected to perform better on the sub-test in that topic • How does the BAT compare to the ETS? • The test should be expected to compare favorably to other valid instruments

  10. The most selective school in the CSU system should be expected to perform best overall on the BAT

  11. Majors should be expected to perform significantly better on the sub-test of their discipline SDSU data, 2008

  12. Students taking “more” of a topic should be expected to perform better on the sub-test in that topic • Statistics • Overall performance: 37.23% • Sub-group taking only 1 stats course: 34.83% • Sub-group taking 2 stats courses: 39.58% • Among 2-course takers: • Requires “C”: 44.43% • Others: 36.83% • Business Law • Requires 2 law courses: 55.8% • Requires 1 law course: 49.8% • Requires 0 law courses: 27.0% SDSU data, 2007; all-school data, 2008

  13. The test should be expected to compare favorably to other valid instruments CSU, San Marcos; 2007

  14. More questions of interest . . . • Do results assess knowledge of core concepts that correspond to the program’s curriculum? • Do test procedures disadvantage students? • Can the BAT be used to assess critical thinking as well as discipline-specific content?

  15. Do results assess knowledge of core concepts that correspond to the program’s curriculum? SDSU data, 2007

  16. Do test procedures disadvantage students? SDSU, 2008

  17. Can the BAT be used to assess critical thinking as well as discipline-specific content? Multi-School assessment, 2007

  18. Closing the Loop • CSU Long Beach • ACC sub-test lags behind peer schools • ACC class sizes have risen from 50 to 200 over the period of decline • FIX: Work to lower class size • SDSU • Lowest sub-test performance is on courses taken in the lower division • Hypothesis: Students do not retain material through graduation and need a refresher • FIX: Development of online refresher modules available to all students • CSU San Marcos • Statistics is consistently the lowest sub-test performance • Faculty also see STATS as students’ weakest area • FIX: Move to a hybrid class format that gives students more time to work problems and review material
