
Assessment


Presentation Transcript


  1. 101 Assessment Presented by: Dr. Sharon Karackattu and Jason F. Simon, M.Ed.

  2. Our Session Today • What is Assessment vs. Research? • Process of Assessment Activity: Problem to Report • Tips on Literature Reviews and Types of Literature to Draw Upon

  3. How do you see it? (All scores need to add up to 100.) Note: This is an arbitrary graphical representation of the cultural differences between assessment and formal research.

  4.–6. [Figure-only slides repeating the same graphical representation of the cultural differences between assessment and formal research; no additional text.]

  7. Which is which? A study on male identity development among fraternity members at large public research institutions: RESEARCH. A study on UNT Greek men and their perceptions of the campus climate for Greeks: ASSESSMENT.

  8. Which is which? Analyzing usage data for a population of 18- to 22-year-old users of the campus recreation center: ASSESSMENT. Analyzing usage data for 18- to 22-year-old users of campus recreation centers throughout the State of Texas to identify connections between BMI and GPA: RESEARCH.

  9. According to Schuh & Upcraft (2001), “Assessment guides good practice, while research guides theory development and tests concepts…Assessment typically has implications for a single institution while research typically has broader implications for student affairs and higher education” (p. 5).

  10. Assessment from Problem to Report Source: Schuh, J. H., Upcraft, M. L. & Associates. (2001)

  11. Step 1: Identify the problem • What specific circumstances or situations are driving the need for assessment? • What external pressures are driving the need for assessment? • What internal circumstances are driving the need for assessment?

  12. Step 2: Determine the purpose of the study • What information do we need to help solve the problems identified in Step 1? • Rely on this information to develop the purpose of the study – then stick with it! • Resist the temptation to earn a Nobel Peace Prize or cure all of the woes at UNT – keep the study focused.

  13. Step 3: Determine where to get the information needed • Is the population students? What types of students? Be specific and focused. • What does the literature say (more on that later)? • Do we have previous UNT reports, findings, or datasets? • Do we have access to benchmarking studies of our peer institutions? • Would faculty, staff, or alumni be helpful?

  14. Step 4: Determine the best assessment methods • What is the best way to get the information I need? • Quantitative, Qualitative, or Both, Oh My! • How to decide? • “What” questions are best answered by quantitative research methods • “Why” questions are best answered by qualitative research methods

  15. Assessment Designs • Satisfaction Surveys (alone, these don’t tell you enough) • Program Reviews • Needs Assessment • Cost-Effectiveness Study • Benchmarking Study • Campus Culture/Environmental Assessment • Demographic Profiles (local, regional, and national)

  16. Step 5: Determine whom to study • Do we survey the entire student population? • Do we want to sample a segment of the student population? • Gender, age, race, and ethnicity • Class standing, GPA, Full-Time, Part-Time • Commuter, resident, distance learner • Service user, student leader, student employee
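Sampling a segment of the population often means stratified sampling, so each subgroup is represented proportionally. Below is a minimal, hypothetical sketch in Python with pandas (not part of the original slides; the roster and column names are invented for illustration):

```python
import pandas as pd

# A hypothetical student roster; column names are invented.
students = pd.DataFrame({
    "student_id": range(1, 11),
    "class_standing": ["FR", "FR", "FR", "SO", "SO",
                       "JR", "JR", "SR", "SR", "SR"],
})

# Draw 50% of students within each class-standing stratum so every
# group is represented proportionally in the sample.
sample = (
    students.groupby("class_standing", group_keys=False)
            .apply(lambda g: g.sample(frac=0.5, random_state=42))
)
print(sample)
```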

  17. Step 6: Determine how data will be collected • The method(s) selected MUST be consistent with the purpose of the study • Paper questionnaires (remember to Scantron them!) • Telephone surveys with trained interviewers • Web-based surveys • Individual interviews • Focus group interviews

  18. Step 7: Determine what instruments will be used • The instrument must be able to yield results that can be statistically analyzed! • Do we want to use a test from a national source where validity and reliability are established?

  19. Reliability and Validity [Target diagrams illustrating the four combinations: not reliable and not valid; reliable but not valid; not reliable but valid; reliable and valid.] Source: Adapted from a presentation for EPSY 6020 by Dr. Axelson.

  20. ASSESSMENT TERM ALERT: RELIABILITY Reliability refers to the extent to which assessments are consistent. Just as we enjoy having reliable cars (cars that start every time we need them), we strive to have reliable, consistent instruments to measure student achievement. Another way to think of reliability is to imagine a kitchen scale. If you weigh five pounds of potatoes in the morning, and the scale is reliable, the same scale should register five pounds for the potatoes an hour later (unless, of course, you peeled and cooked them). Likewise, instruments such as classroom tests and national standardized exams should be reliable – it should not make any difference whether a student takes the assessment in the morning or afternoon; one day or the next. Source: Classroom Assessment Course, Florida Center for Instructional Technology and University of South Florida: http://fcit.usf.edu/assessment/basic/basicc.html
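To make the consistency idea concrete, here is a small, hypothetical Python sketch (not from the slides; the scores are invented) of a test-retest check: the same students take the same instrument twice, and a high Pearson correlation between the two administrations suggests a reliable instrument.

```python
from scipy.stats import pearsonr

# Scores for the same six students on two administrations of the
# same instrument (numbers invented for illustration).
morning = [72, 85, 90, 64, 78, 88]
afternoon = [74, 83, 91, 66, 77, 89]

r, p_value = pearsonr(morning, afternoon)
print(f"Test-retest correlation: r = {r:.2f}")
# r close to 1.0 suggests consistent (reliable) scores;
# r near 0 would signal an unreliable instrument.
```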

  21. ASSESSMENT TERM ALERT: VALIDITY Validity refers to the accuracy of an assessment – whether or not it measures what it is supposed to measure. Even if a test is reliable, it may not provide a valid measure. Let’s imagine a bathroom scale that consistently tells you that you weigh 130 pounds. The reliability (consistency) of this scale is very good, but it is not accurate (valid) because you actually weigh 145 pounds (perhaps you re-set the scale in a weak moment)! Since teachers, parents, and school districts make decisions about students based on assessments (such as grades, promotions, and graduation), the validity of the inferences drawn from assessments is essential – even more crucial than the reliability. Also, if a test is valid, it is almost always reliable. Source: Classroom Assessment Course, Florida Center for Instructional Technology and University of South Florida: http://fcit.usf.edu/assessment/basic/basicc.html
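The bathroom-scale analogy can be put into numbers. In this small, hypothetical sketch (values invented), the readings are tightly clustered (reliable) but systematically off the true weight (not valid):

```python
import statistics

true_weight = 145.0
readings = [130.2, 129.8, 130.1, 130.0, 129.9]  # five weigh-ins (hypothetical)

spread = statistics.stdev(readings)             # small spread = consistent (reliable)
bias = statistics.mean(readings) - true_weight  # large offset = inaccurate (not valid)
print(f"stdev = {spread:.2f} lb, bias = {bias:.1f} lb")
```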

  22. Step 7 (continued): Determine what instruments will be used • The instrument must be able to yield results that can be statistically analyzed! • Do we want to use a test from a national source where validity and reliability are established? • Do we want to develop our own instrument, despite the fact that validity and reliability will be a concern? • Qualitative: design standardized open-ended questions, as well as any follow-up questions or items of clarification

  23. Step 8: Determine who should collect the data • Qualified and trained individuals are a good start for all assessment • Be cautious about bias • Can we really trust a study done exclusively by a person with a personal stake in the outcomes? • Can we really trust a study done exclusively by outside experts with no context for or awareness of the nuances of the institution? • Bias can be reduced when outside experts in assessment review the process and methods used

  24. Step 9: Determine how the data will be analyzed • Analysis of quantitative data depends on the purpose of the study: Are the respondents representative of the population? If yes, descriptive and inferential statistics can be applied (consult experts on ANOVA, MANOVA, regression, ANCOVA, HLM, or SEM) • In a qualitative study, themes, trends, and variations can be explored, and a plan developed to record and analyze the data (consult experts on naturalistic inquiry methodology if needed)
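As one illustration of the quantitative side of this step, the hypothetical sketch below (invented satisfaction scores, Python with SciPy) computes per-group descriptive statistics and then runs a one-way ANOVA, the simplest of the techniques named above:

```python
from scipy import stats

# Hypothetical satisfaction scores (1-5 scale) by residence type.
commuter = [3.1, 3.4, 2.9, 3.8, 3.2]
resident = [4.0, 3.7, 4.2, 3.9, 4.1]
distance = [3.5, 3.3, 3.6, 3.1, 3.4]

# Descriptive statistics first, then a one-way ANOVA to test whether
# the group means differ more than chance would explain.
for name, scores in [("commuter", commuter), ("resident", resident),
                     ("distance learner", distance)]:
    print(f"{name}: mean = {sum(scores) / len(scores):.2f}")

f_stat, p_value = stats.f_oneway(commuter, resident, distance)
print(f"One-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
```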

  25. Step 10: Determine the implications of the study for policy and practice • We suggest you rely on your findings and present how they impact the campus or program area • What are the implications of this study? • What approaches to solving the problem should be considered in light of this study? • What policies and procedures need to be created or overhauled? • What is the call to action for UNT?

  26. Step 11: Report the results effectively • Understand that we must package the findings carefully so that they motivate change • How a study is distributed and formatted may be more important than the results themselves • Segment your audience and tailor specific reports to specific audiences • Make sure the results get into the hands of people who can make a difference

  27. Keys To A Good Literature Review • Where does it come from? • Sources • Peer reviewed journal articles • Journal of College Student Development • Review of Higher Education • Research in Higher Education • Journal of Higher Education • Reputable edited journal articles • “New Directions” series • Change magazine, Liberal Education, Peer Review, Phi Delta Kappan, Assessment Update • Books from reputable authors and publishers

  28. Keys To A Good Literature Review Other sources: • Conference presentations, proceedings • Published reports by universities, federal, state, or local government, or independent research or policy institutions • Dissertations or theses • ERIC ED documents • Reputable magazines or newspaper (for facts) • Reference books

  29. Keys To A Good Literature Review Don’t quote these… • Wikipedia • Individual web pages and blogs • Personal communications • Newspapers or magazines (for research findings) • Secondary sources

  30. Keys To A Good Literature Review Where do I start my search? • Education Resources Information Center (ERIC) http://www.eric.ed.gov/ • PsycINFO (psychology-related research) • LexisNexis (policy, law, business news) • UNT Education Electronic Resources http://tinyurl.com/untedu • Google Scholar http://scholar.google.com • Book or journal article reference lists

  31. Keys To A Good Literature Review Alternative sources of information: • Ask around (professors and co-workers) • Online discussion forums • Amazon.com • Librarians at UNT • Internet search engines (as a starting point) • Wikipedia (as a starting point)

  32. And Now For A Shameless Plug About PASD We can offer assistance in: • Assessment design • Quantitative and Qualitative Methods • Focus Group Management • Literature Review Hints • Obtaining UNT specific data from campus partners • Benchmarking and Peer Institution first steps

  33. Where is our office located? Dr. Karackattu and Jason Simon are located in the Student Activities Office, Room number 320 MH (down the hall from the conference table in the middle of the suite) OR ONLINE: www.unt.edu/pasd

  34. Contact Us:
  Dr. Jan Hillman, Executive Director of Planning and Advancement, Jan.hillman@unt.edu, 565.4909
  Dr. Sharon Karackattu, Research Coordinator, skarackattu@dsa.admin.unt.edu, 369.8047
  Jason F. Simon, Graduate Research Assistant, Jason.simon@unt.edu, 369.8054

  35. References
  Chen, P. S. D. (2008). EDHE 6530: Research on Students in Higher Education course. Slides 24–28 reproduced with permission of the instructor.
  Classroom Assessment. A collaboration between the Florida Center for Instructional Technology and the University of South Florida. Retrieved September 25, 2008, from http://fcit.usf.edu/assessment/basic/basicc.html
  Schuh, J. H., Upcraft, M. L., & Associates. (2001). Assessment practice in student affairs: An applications manual. San Francisco, CA: Jossey-Bass.

  36. Q&A
