    1. Meta-Analysis of Personnel Selection Tests & Overview of Situational Judgment Tests Michael A. McDaniel Virginia Commonwealth University mamcdani@vcu.edu Deborah L. Whetzel Human Resources Research Organization dwhetzel@humrro.org Nhung Nguyen Towson University ntnhung@aol.com Prepared for: International Workshop on “Emerging Frameworks and Issues for S&T Recruitments” Society for Reliability Engineering, Quality and Operations Management (SREQOM) Delhi, India, September 2008

    2. Overview • Introduction—Dr. McDaniel • Introduction—Dr. Whetzel • Introduction—Dr. Nguyen • Meta-analysis • Meta-analysis results • What are SJTs? • Brief history of SJTs • Item characteristics, response instructions, and item heterogeneity • Steps in developing SJTs • Scoring SJTs

    3. Dr. McDaniel’s Department • Department of Management, School of Business, Virginia Commonwealth University, Richmond, Virginia • 100 miles south of Washington, DC

    4. Dr. McDaniel’s Department • PhD program in Management emphasizes organizational behavior and human resources. • The Center for the Advancement of Research Methods and Analysis (CARMA) is a non-profit unit of the School of Business at Virginia Commonwealth University (VCU). • Established in 1997 by Dr. Larry Williams • Hosted over 60 events and 100 presentations on research methods topics for faculty and doctoral students world-wide. • Interdisciplinary focus, with emphasis on topics relevant to the social and organizational sciences. (www.pubinfo.vcu.edu/carma/)

    5. Dr. McDaniel’s Research Theme Applications of meta-analysis to examine the validity of personnel selection methods: • Cognitive ability tests • Interviews • Reviews of training and experience • Customer service tests • Firefighter tests • Short-term memory tests • Job experience • Job knowledge • Situational judgment tests (SJTs)

    6. Dr. Whetzel’s Organization • Human Resources Research Organization (HumRRO) in Alexandria, Virginia • 5 miles from Washington, DC

    7. The Human Resources Research Organization (HumRRO) • Independent non-profit research organization • Established in 1951 as part of the U.S. Army • Became independent in 1969 • Headquarters in Alexandria, VA • Diverse staff: industrial/organizational psychologists, instructional designers, statisticians, management analysts, web programmers • 100 professional staff, 20 support staff • Strong history in selection, assessment, training, and evaluation

    8. Deborah Whetzel • Experience in personnel selection research and development • Areas of expertise include: • conducting job analyses, • developing competency models, • developing performance appraisal systems, and • developing and validating assessment processes, including structured interviews and SJTs.

    9. Dr. Nguyen’s Department • Towson University, Towson, Maryland • 60 miles north of Washington, DC

    10. Nhung Nguyen • Experience in personnel selection research and development • Areas of expertise include: • Situational judgment test research, including subgroup differences • Wrote monograph for the International Personnel Management Association Assessment Council (IPMAAC)

    11. Meta-Analyses of Personnel Selection Tests

    12. What is Meta-Analysis? • Meta-analysis is the quantitative combination of information from multiple empirical studies to produce an estimate of the overall magnitude of a relationship between an employment test and job performance. • Meta-analysis uses statistical procedures to determine the best estimate of the correlation between test and job performance.

    13. Meta-Analysis • Although meta-analysis can be statistically complicated, conceptually it is simple. • In a meta-analysis of employment tests, one averages the correlations between the test and job performance across studies. • The correlations are typically called “validity coefficients.” • Studies are weighted so that those with larger samples have more influence on the average.
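The sample-size-weighted averaging described on this slide can be sketched in a few lines of Python. The five validity coefficients and sample sizes below are hypothetical values chosen purely for illustration:

```python
# Sample-size-weighted mean validity across five hypothetical studies.
# Each tuple is (observed validity coefficient r, sample size n).
studies = [(0.25, 68), (0.31, 120), (0.18, 45), (0.40, 210), (0.27, 90)]

total_n = sum(n for _, n in studies)
mean_r = sum(r * n for r, n in studies) / total_n  # n-weighted average

print(round(mean_r, 2))  # → 0.32; larger studies pull the average toward their r
```

Note that the large-N study (r = 0.40, n = 210) pulls the weighted mean above the simple unweighted average of the five coefficients.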

    14. Meta-Analysis • In addition to calculating the mean (average) validity, one also looks at the variability around the mean. • Some of the variability is due to random causes (sampling error). • Particularly with smaller sample studies, results will vary due to some samples being more representative of the population than other samples.

    15. Meta-Analysis • The next few slides may be difficult for those without statistical training. • Do not worry if you do not understand all of it. • We will get to the typical validities of different types of selection tests soon. • These typical validities are what one needs to make informed decisions.

    16. Meta-Analysis and Sampling Error [Figure: sampling error plotted against sample size]

    17. Meta-Analysis and Sampling Error • The relationship between sampling error and sample size is asymptotic. • Increasing sample size results in decreasing random sampling error. • As sample size increases, one gets diminishing returns in the reduction of random sampling error.

    18. Meta-Analysis and Sampling Error • Meta-analysis combines the results from many different studies. • Sampling error across studies tends to cancel out (sampling errors in one direction will be balanced by the sampling errors in the other direction). • The meta-analytic result gives a close approximation to the population value.
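As a rough numerical sketch of how much of the study-to-study scatter is pure sampling noise, the Hunter–Schmidt formula for the variance expected from sampling error alone, var_e = (1 − r̄²)² / (N̄ − 1), can be computed directly. The mean validity and average sample size below are hypothetical:

```python
# Expected variance in observed validities due to sampling error alone
# (Hunter & Schmidt sampling-error variance formula), hypothetical inputs.
mean_r = 0.32   # sample-size-weighted mean validity across studies
n_bar = 107     # average sample size per study

var_e = (1 - mean_r**2) ** 2 / (n_bar - 1)
print(round(var_e, 4))  # → 0.0076
```

Because var_e shrinks as n_bar grows, pooling many studies (or running larger ones) drives the noise component of the observed variability toward zero.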

    19. Meta-Analysis and Moderators • Some of the variability is likely due to the test working better for some jobs than others. • These job effects are moderators of the correlation between the test and job performance. • Extroversion is a better predictor of performance in sales jobs than in most other jobs. • Cognitive ability tests are good predictors for all jobs but work best for cognitively demanding jobs (e.g., S&T jobs).

    20. Meta-Analysis • Meta-analysis tries to partition variability in the studies to better understand it. [Figure: variance partitioned into sampling error, artifactual variance in addition to sampling error, and moderator variance]

    21. Meta-Analysis • So far, we have briefly talked about random sampling error and moderators. • But the graph also shows some variance labeled “artifactual variance in addition to sampling error.” • Differences across studies in measurement error and range restriction produce this additional artifactual variance.

    22. Meta-Analysis: Measurement Error • All measures (e.g., job performance ratings) have some measurement error. • The more the measurement error, the lower the reliability of the measure. • Measurement error causes an observed effect size to underestimate its population parameter. • Differences across studies in measurement error cause variance.
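One standard repair for this downward bias is the classical correction for attenuation: dividing the observed validity by the square root of the criterion reliability. The sketch below uses hypothetical values for the observed correlation and the reliability of supervisor ratings:

```python
import math

# Correcting an observed validity for unreliability in the criterion
# (classical attenuation formula); all numbers are hypothetical.
r_obs = 0.25   # observed test-criterion correlation
r_yy = 0.52    # reliability of supervisor ratings of job performance

r_c = r_obs / math.sqrt(r_yy)   # validity corrected for criterion unreliability
print(round(r_c, 2))  # → 0.35
```

The lower the criterion reliability, the larger the gap between the observed and the corrected validity.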

    23. Meta-Analysis: Range Restriction • The personnel selection literature suffers from range restriction due to pre-selection of the sample. • We seek to know the validity of a predictor of performance, but have job performance data only for the sample members who scored high on the predictor. • Range-restricted samples tend to underestimate the validity coefficient.
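A standard correction for direct range restriction on the predictor is Thorndike's Case II formula. The sketch below uses hypothetical values for the restricted-sample validity and the ratio of the applicant-pool standard deviation to the incumbent standard deviation:

```python
import math

# Thorndike Case II correction for direct range restriction,
# using hypothetical values.
r = 0.30   # validity observed in the range-restricted (hired) sample
u = 1.5    # ratio of applicant-pool SD to incumbent SD on the predictor

r_c = (u * r) / math.sqrt(1 + r**2 * (u**2 - 1))
print(round(r_c, 2))  # → 0.43
```

With u = 1 (no restriction) the formula returns r unchanged; the more severe the pre-selection, the larger u and the bigger the upward correction.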

    24. Meta-Analysis and Artifactual Variance • Meta-analysis methods try to estimate the effects of measurement error and range restriction when determining the mean validity of an employment test and the extent to which it accounts for variability in validities.
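The variance partition can be illustrated numerically. The observed and sampling-error variances below are hypothetical, and the 75% threshold is Hunter and Schmidt's rule of thumb for deciding whether a search for moderators is warranted:

```python
# Partitioning observed variance in validities (hypothetical values).
var_obs = 0.0120   # variance of observed validities across studies
var_e = 0.0076     # variance expected from sampling error alone

pct_artifact = var_e / var_obs        # share explained by sampling error
residual = var_obs - var_e            # variance left for other artifacts/moderators
# Hunter & Schmidt "75% rule": if artifacts explain >= 75% of the observed
# variance, a substantive moderator is considered unlikely.
moderators_plausible = pct_artifact < 0.75
print(round(pct_artifact, 2), moderators_plausible)
```

Here sampling error accounts for about 63% of the observed variance, so under this rule of thumb a moderator search would still be justified.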

    25. Meta-Analysis: Publication Bias • A growing concern in the meta-analysis literature is whether the studies available to average are representative. • Some employment test publishers have been known to suppress studies that make their testing products look bad. • This causes the studies available to the reviewer to overestimate the validity of the test.

    26. Meta-Analysis: Know What You Are Predicting • Another consideration is what measures of job performance are used. • For example, integrity test validity studies often use a self-report of employee theft as the measure of job performance. Far fewer studies correlate the tests with job performance as measured by more common measures (e.g., supervisor ratings).

    27. Meta-Analysis: Summary of Validity • Schmidt and Hunter (1998) summarized 85 years of research findings. • The next table shows validity for predicting job performance (typically, supervisor ratings). • All validities are corrected for downward bias due to measurement error and to range restriction in incumbent samples. • A second table presents results for personality tests.

    28. Meta-Analysis Results [Table: validities of selection procedures for predicting job performance, from Schmidt & Hunter (1998)]

    29. Meta-Analysis Results for Personality [Table: validities of personality measures, from Hurtz & Donovan, 2000]

    30. Situational Judgment Tests

    31. What Are SJTs? • An applicant is presented with a situation and several response options and is asked to evaluate the responses. • SJT items are typically in a multiple choice format.

    32. Everyone in your work group has received a new computer except you. What is the best action to take? A. Assume it was a mistake and speak to your supervisor. B. Ask your supervisor why you are being treated unfairly. C. Take a new computer from a co-worker’s desk. D. Complain to human resources. E. Quit.
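One simple way an item like this could be machine-scored, assuming a hypothetical knowledge-based key in which option A is keyed as the best response (in practice, SJT keys are typically derived from subject-matter-expert judgments, as discussed later under scoring):

```python
# Hypothetical scoring key for the example item: option A earns 1 point,
# all other options earn 0. The key itself is an assumption for illustration.
KEY = {"A": 1, "B": 0, "C": 0, "D": 0, "E": 0}

def score_item(response: str) -> int:
    """Return the keyed score for one multiple-choice SJT response."""
    return KEY.get(response.strip().upper(), 0)

print(score_item("a"))  # → 1
print(score_item("C"))  # → 0
```

A test score is then simply the sum of item scores; more elaborate keys award partial credit to options of intermediate effectiveness.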

    33. Brief History • Judgment scale in the George Washington University Social Intelligence Test (Moss, 1926) • Used in World War II by psychologists working for the US military • Practical Judgment Test (Cardall, 1942)

    34. Brief History continued • How Supervise? (File & Remmers, 1948) • Test of Supervisory Judgment (Richardson, Bellows & Henry, 1949) • 1960s: SJTs were in use in the U.S. Civil Service System (Greenberg, 1963)

    35. Brief History continued • 1990s: Motowidlo reinvigorated interest in SJTs (Motowidlo et al., 1990; Motowidlo & Tippins, 1993) • “Low fidelity” simulations • 1990s: Sternberg’s “tacit knowledge” tests (Sternberg et al., 1993, 1995; Wagner & Sternberg, 1991)

    36. Brief History continued • Today, SJTs are used in many organizations, are promoted by various consulting firms, and are researched by many.

    37. Brief History continued • Current popularity is based on assertions that SJTs: • Have low adverse impact (subgroup differences) • Have good acceptance by applicants • Assess job-related knowledge or skills not readily tapped by other measures

    38. Item Characteristics • Item stems can be distinguished along five characteristics: • Fidelity • Length • Complexity • Comprehensibility • Nested and un-nested stems

    39. Item Characteristics continued • Fidelity: Extent to which the format of the stem is consistent with how the situation would be encountered in a work setting. • High fidelity: Situation is conveyed through a short video. • Low fidelity: Situation presented in written form.

    40. Item Characteristics continued • Length: • Some stems are very short (How Supervise?, File & Remmers, 1971) • Other stems present very detailed descriptions of situations (Tacit Knowledge Inventory, Wagner & Sternberg, 1991)

    41. Item Characteristics continued • Complexity: Stems vary in the complexity of the situation presented. • Low complexity: One has difficulty with a new assignment and needs instructions. • High complexity: One has multiple supervisors who are not cooperating with each other and who give conflicting instructions about which of one’s assignments has highest priority.

    42. Item Characteristics continued • Comprehensibility: It is more difficult to understand the meaning and import of some situations than others. • Sacco, Schmidt & Rogg (2000) examined the comprehensibility of item stems using reading-level formulas.

    43. Item Characteristics continued • Length, complexity, and comprehensibility of the stems are interrelated and probably drive the cognitive loading of the items.

    44. Item Characteristics continued • Nested stems • Some situational judgment tests (Clevenger & Halland, 2000; Parker, Golden & Redmond, 2000) present an overall situation followed by subordinate situations. • Subordinate stems are the stems linked to the responses.

    45. Item Characteristics: Nature of Responses • Unlike item stems, which vary widely in format, item responses are usually presented in a written format and are relatively short. • Even SJTs that use video to present the situation often present the responses in written form, sometimes accompanied by an audio presentation.

    46. Item Characteristics: Response Instructions • The various item instructions can be described in a two-dimensional taxonomy: (1) behavioral tendency vs. knowledge (how would you typically behave? vs. what is the most effective response?); (2) number of scorable responses.

    47. Item Characteristics: Response Instructions [Table: taxonomy of response instruction types]

    48. Item Heterogeneity

    49. Item Heterogeneity • SJT items tend to be construct heterogeneous at the item level. • This means that they measure many things. • They are typically correlated with one or more of the following: • Cognitive ability • Agreeableness • Conscientiousness • Emotional stability
