
Review of Measurement & Effect Size Reporting in Quantitative Education Dissertations

Presentation Transcript


  1. Review of Measurement & Effect Size Reporting in Quantitative Education Dissertations • Debbie L. Hahs-Vaughn, University of Central Florida • Tary L. Wallace, University of South Florida Sarasota-Manatee • Melinda Stevison, University of Central Florida

  2. Introduction • Efforts to continuously improve the quality of performance outcomes span nearly every level and type of educational experience, including doctoral preparation. • Although there has recently been some focus on the types of research designs included in graduate-level training, few studies have examined the reporting of measurement and effect size in dissertations.

  3. Purpose of the Study • The purpose of this study is to describe characteristics of reporting measurement and effect sizes in quantitative dissertations within the field of education.

  4. Theoretical Framework • "Dissertation quality may be seen by accreditation and coordinating-board reviewers as a noteworthy reflection on the quality of doctoral programs themselves" (Thompson, 1994, abstract)

  5. Theoretical Framework • Over the past several decades there has been some interest in the quality of dissertations, and in educational research in general. • Ward, Hall, and Schramm (1975) • Hall, Ward, and Comer (1988) • Thompson (1988; 1994)

  6. Reliability & Validity • Widely known expectations • Guidelines regarding best practices • Instruction on the topic • Concerns from research and theory • Reporting of research • Preparation of researchers

  7. Reliability & Validity Guidelines related to best practices • Standards for Educational & Psychological Testing (AERA, APA, NCME, 1999) • Code of Fair Testing Practices in Education (Revised) (Joint Committee on Testing Practices, 2004) • Program Evaluation Standards (Joint Committee on Standards for Educational Evaluation, 1994) • AERA Guidelines for Reporting Research (2006)

  8. Reliability & Validity Instructional Resources • Research design textbooks • Gall, Gall, & Borg (2007) • Gay & Airasian (2007) • Fraenkel & Wallen (2006)

  9. Reliability & Validity Concerns from research and theory • Validity • Messick (1989) • Reliability • Thompson (1994, 2003) • Preparation of researchers/graduate students • Goodwin & Goodwin (1985) • Mundfrom, Shaw, Thomas, Young, & Moore (1998) • Educational Researcher, Vol. 30, No. 5 (2001)

  10. Effect Size • Criticism of interpreting null hypothesis significance testing • Use of effect size to address criticisms of null hypothesis significance testing • Cohen (1969) introduced one of the first standardized effect size measures • Effect size indices today

  11. Reporting Effect Size • The APA Task Force on Statistical Inference recommends reporting the direction, size, and confidence interval of the effect • In June 2006, the American Educational Research Association adopted standards for reporting empirical research, one of which calls for reporting an effect size for each statistical result presented in descriptive or inferential quantitative analyses • Over 20 journals require effect size to be reported (Grissom & Kim, 2005)

  12. Reporting Effect Size • When to report effect size • Which effect size measure to report • What information to include when reporting effect size
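
A minimal sketch (not part of the original study) of the kind of information called for above: the direction and magnitude of an effect, here Cohen's d for two independent groups, together with an approximate confidence interval. The group data are hypothetical, and the large-sample standard-error formula for d is one common choice among several.

```python
import math
from statistics import mean, stdev

def cohens_d_with_ci(group1, group2, z=1.96):
    """Cohen's d for two independent groups with an approximate 95% CI.

    Uses the pooled standard deviation and a common large-sample
    approximation to the standard error of d.
    """
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (mean(group1) - mean(group2)) / pooled_sd
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

# Hypothetical scores for a treatment group and a comparison group
treatment = [82, 90, 77, 85, 88, 91, 79, 84]
comparison = [75, 80, 72, 78, 83, 74, 79, 76]
d, (lo, hi) = cohens_d_with_ci(treatment, comparison)
print(f"Cohen's d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```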

  13. Methods: Research Design • Borrows from content analysis, meta-analysis, and longitudinal studies

  14. Methods: Data Source • Dissertations completed by students who earned a doctoral degree (Ed.D. or Ph.D.) in education at one of the four largest education doctoral-degree-awarding institutions in a southeastern state between the beginning of fall 2002 and the end of spring 2005. • The four largest education doctoral-degree-awarding institutions were identified from the state's department of education website, which listed the number of degrees awarded per institution and degree type.

  15. Methods: Preliminary Procedures: Comparability of Graduate Training Programs • Comparability of research training between the education doctoral degree programs at the institutions sampled was determined by reviewing online program information • University A: 23 doctoral degrees (Ed.D. and Ph.D.) in education • University B: 8 doctoral degrees • University C: 21 doctoral degrees • University D: 21 doctoral degrees

  16. Methods: Preliminary Procedures: Comparability of Graduate Training Programs • A spreadsheet was used to catalog: • Program web site address • Type(s) of doctorate offered • Composition of dissertation committees • Required statistics, research methods, measurement, and assessment courses • Course prefix and number of these required courses, along with syllabus (if available online) and number of credits conveyed • Whether program’s research core course requirements were program-specific, or college-wide

  17. Methods: Preliminary Procedures: Comparability of Graduate Training Programs • University A: • No college-wide research requirement • The most widely required courses included two statistics courses (one basic quantitative, one advanced quantitative) and two research methods courses (one general research methods, one qualitative research methods) • The next most frequently required courses were on the topics of measurement theory, multivariate analysis, and qualitative methods

  18. Methods: Preliminary Procedures: Comparability of Graduate Training Programs • University B: • Had a college-wide research requirement • Consisted of two interdisciplinary research courses, one qualitative research course, one quantitative research course, and one required measurement class • Doctoral students were also required to choose one research elective from a selection that included advanced quantitative and qualitative methods • The research core totaled 18 credit hours

  19. Methods: Preliminary Procedures: Comparability of Graduate Training Programs • University C: • Had a college-wide research requirement • Students could choose from four qualitative research methods courses, seven quantitative research methods courses, and six measurement courses • The minimum number of required credit hours in the research core was 12; departments and advisors could require more

  20. Methods: Preliminary Procedures: Comparability of Graduate Training Programs • University D: • Had a college-wide research requirement • Required one beginning and one advanced statistical analysis class • Students chose one additional course from a selection of two quantitative methods courses, two qualitative methods courses, and two measurement courses • The research core totaled 11 to 16 credit hours

  21. Methods: Preliminary Procedures: Comparability of Graduate Training Programs • Comparison summary • College-wide research requirements made Universities B, C, and D easier to compare directly • University A’s research core requirements varied widely among its 23 doctoral programs in education • All four universities appeared to require one beginning and one advanced statistical analysis class, as well as at least one course on qualitative research methods • More advanced quantitative research courses were available at each university, but not necessarily required • Measurement courses were not generally required • The total research core hours required by the four universities ranged from 11 to 18 credit hours

  22. Methods: Inclusion & Exclusion • Inclusion Criteria • The dissertation was available in English. • The dissertation employed an inferential quantitative analytic procedure to address the primary research question. • No dissertations were excluded on the basis of the language criterion; all studies were available in English. • These criteria excluded the following: • Dissertations for which the method used (i.e., qualitative, quantitative, or mixed mode) could not be determined from the abstract and/or 24-page preview in Dissertation Abstracts. • Meta-analyses (this criterion was established because meta-analytic studies tend to refer readers to other sources for information about measurement and the data-gathering process; Whittington, 2003). • Dissertations that employed qualitative or mixed mode techniques.

  23. Methods: Information Retrieval • Electronic search of Dissertation Abstracts • To locate relevant dissertations in the database, the school name was entered in the 'school name/code' search field. The term 'education' was entered in the 'subject name/code' search field with a Boolean 'OR' to also include the term 'education' in the 'citation and abstract' search field. • Because the academic year runs from July 1 to June 30, the date range searched was 7/1/2002 to 6/30/2005.
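
As an illustration only (the slide describes a manual database search; no Dissertation Abstracts API is implied), the sketch below records the search fields named above and the July 1 to June 30 academic-year logic that bounds the 7/1/2002 to 6/30/2005 window. The dictionary keys and the placeholder institution name are purely illustrative.

```python
from datetime import date

# Search fields described on the slide; this dictionary is illustrative only,
# not an actual Dissertation Abstracts interface.
search_criteria = {
    "school name/code": "<institution name>",   # placeholder, one per sampled university
    "subject name/code": "education",
    "citation and abstract": "education",       # combined with the subject field via Boolean OR
}

# Academic years run July 1 to June 30, so the window spans fall 2002 through spring 2005.
RANGE_START, RANGE_END = date(2002, 7, 1), date(2005, 6, 30)

def in_search_window(publication_date: date) -> bool:
    """True when a dissertation's publication date falls inside the sampled window."""
    return RANGE_START <= publication_date <= RANGE_END

print(in_search_window(date(2004, 12, 15)))  # True
print(in_search_window(date(2006, 1, 10)))   # False
```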

  24. Methods: Information Retrieval • Review of abstract and 24-page preview • name of university, • author's first, middle, and last names, • type of degree (Ed.D. or Ph.D.), • college from which the degree was awarded, • name of department or program, • month or semester of publication, • year of publication, • research design (coded as qualitative, quantitative, or mixed mode), • number of pages, • title of dissertation, and • whether the dissertation was retrievable from Dissertation Abstracts in full PDF.
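
A minimal sketch of a record structure mirroring the fields listed above; the class and attribute names are assumptions introduced for illustration, not the authors' actual coding instrument.

```python
from dataclasses import dataclass

@dataclass
class DissertationRecord:
    """One row of information captured from an abstract and 24-page preview."""
    university: str
    author_first: str
    author_middle: str
    author_last: str
    degree_type: str            # "Ed.D." or "Ph.D."
    college: str
    department_or_program: str
    month_or_semester: str
    year: int
    research_design: str        # "qualitative", "quantitative", or "mixed mode"
    page_count: int
    title: str
    full_pdf_available: bool    # retrievable from Dissertation Abstracts in full PDF
```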

  25. Methods: Creation of the Sampling Frame: Step 1 • 991 dissertations identified initially • 381 dissertations identified as using quantitative techniques and awarded to students from Colleges of Education

  26. Methods: Creation of the Sampling Frame: Step 2 • Dissertations that did not employ an inferential analysis for their primary research question were excluded. • The second step of the sampling frame procedure was therefore to examine the 381 quantitative dissertations to determine both the primary research question and the specific analytic procedure.
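
A brief sketch of the screening logic in Step 2, assuming each of the 381 quantitative records carries a coded label for the analytic procedure used on its primary research question; the procedure names and the labels are hypothetical.

```python
# Hypothetical labels for the coded primary analytic procedure
INFERENTIAL_PROCEDURES = {"t-test", "ANOVA", "ANCOVA", "regression", "MANOVA", "chi-square"}

def uses_inferential_analysis(primary_analysis: str) -> bool:
    """True when the coded primary analysis is an inferential procedure."""
    return primary_analysis in INFERENTIAL_PROCEDURES

print(uses_inferential_analysis("ANOVA"))        # True -> retained in the sampling frame
print(uses_inferential_analysis("descriptive"))  # False -> excluded
```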

  27. Methods: Study Coding • Design and Development of Protocol • Table of Specifications • Pilot with discussion by authors • Found high inter-rater agreement • Planned Estimations • Expert content review • Inter-rater agreement • Intra-rater consistency
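
The slide does not specify which agreement index was used, so the sketch below is only one plausible illustration: Cohen's kappa for two raters double-coding the same pilot items. The item codes shown are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning the same items to nominal codes."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical pilot codes (e.g., "reports score reliability": yes/no) from two raters
rater_1 = ["yes", "yes", "no", "yes", "no", "yes", "no", "no", "yes", "yes"]
rater_2 = ["yes", "yes", "no", "no", "no", "yes", "no", "yes", "yes", "yes"]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")
```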

  28. Next Steps • Selecting the sample • Retrieval of sampled dissertations • Completion of study coding

  29. Questions? • Debbie L. Hahs-Vaughn, dhahs@mail.ucf.edu • Tary L. Wallace, tlwallace@sar.usf.edu • Melinda Stevison, melindastevison@mac.com
