
Response Processes in SJT Performance: The Role of General and Specific Knowledge

Response Processes in SJT Performance: The Role of General and Specific Knowledge. 27th Annual Conference of the Society for Industrial & Organizational Psychology, April 27, 2012. Matthew T. Allen, James A. Grand, Kenneth Pearlman.


Presentation Transcript


  1. Response Processes in SJT Performance: The Role of General and Specific Knowledge 27th Annual Conference of the Society for Industrial & Organizational Psychology April 27, 2012 Matthew T. Allen James A. Grand Kenneth Pearlman The views, opinions, and/or findings contained in this presentation are solely those of the authors and should not be construed as an official Department of the Army or Department of Defense position, policy, or decision, unless so designated by other documentation.

  2. Presentation Overview: Turning a critical eye towards SJT construct validity & its assumptions • What do SJTs measure? • An alternative take on an old question • A response process model for SJT performance • Predictions of the response process model • Empirical support? • Implications for interpretation & validity of SJTs

  3. Acknowledgement • The authors would like to thank the U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) for allowing the use of their data for this research • More information about the study can be found in: Knapp, D. J., McCloy, R. A., & Heffner, T. S. (Eds.) (2004). Validation of measures designed to maximize 21st-century Army NCO performance (TR 1145). Arlington, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.

  4. Validity Evidence for SJTs • Established evidence of criterion-related validity between SJTs and job performance (cf. Chan & Schmitt, 2002; Clevenger et al., 2001; McDaniel et al., 2001) • Estimates in the low .20s (corrected validities near the mid-.30s) (Chan & Schmitt, 2005) • Agreement on construct validity is less certain... • First-order constructs (multiple, distinguishable dimensions; specific a priori subscales): Oswald et al. (2004): career orientation, perseverance, multicultural appreciation; Lievens et al. (2005): interpersonal skills; Mumford et al. (2008): team role knowledge • Second-order constructs, "practical intelligence" (singular, high-level dimension; broad focal target; tacit knowledge or "common sense"; everyday reasoning): Sternberg et al. (2002): "practical know-how"; Chan & Schmitt (2005): contextual knowledge

  5. A Test in Name Only • Virtually all perspectives approach and treat SJT measurement in a manner consistent with Classical Test Theory: X = T + E (Observed Score = True Score + Error) • SJTs are NOT tests! (at least in the traditional sense of the word) • Low-fidelity simulations (Motowidlo et al., 1990) • Measurement methods capable of capturing a variety of constructs (Chan & Schmitt, 2005; McDaniel & Nguyen, 2001)
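The Classical Test Theory decomposition the slide contrasts against can be made concrete with a minimal simulation. All numbers below are synthetic and chosen only for illustration; under CTT, reliability is the share of observed-score variance attributable to true-score variance.

```python
import random

random.seed(42)

# Classical Test Theory: each observed score X is a fixed true score T
# plus independent random measurement error E, so Var(X) = Var(T) + Var(E).
n = 10_000
true_scores = [random.gauss(50, 10) for _ in range(n)]   # T, SD = 10
errors = [random.gauss(0, 5) for _ in range(n)]          # E, SD = 5
observed = [t + e for t, e in zip(true_scores, errors)]  # X = T + E

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Reliability under CTT = Var(T) / Var(X); here roughly 100 / 125 = .80
reliability = var(true_scores) / var(observed)
print(round(reliability, 2))
```

The point of the slide is that this additive true-score model is exactly what a response process view of SJTs calls into question.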

  6. So What Do SJTs Measure? "So far, there does not exist any theory about how people answer SJTs or about what makes some SJT items more or less difficult than others." (Lievens & Sackett, 2007, p. 1044) "SJT performance clearly involves cognitive processes. [...] Addressing basic questions about these underlying cognitive processes and eventually understanding them could provide the key to explicating constructs measured by SJTs" (Chan & Schmitt, 2005)

  7. So What Do SJTs Measure? • Rather than conceptualizing SJTs as though they measure a static construct or "true score," SJTs capture the sophistication of a respondent's reasoning process • By their nature, SJTs capture similarity between respondent reasoning and the reasoning implied by keyed responses

  8. A Response Process Model of SJT Performance • Generic dual-process accounts of human reasoning, judgment, and decision-making (Evans, 2008) • System 1 • Implicit, intuitive, and automatic reasoning • Decisions guided by general heuristics, which are informed by domain experiences • High capacity, low effort processing • System 2 • Systematic, rational, and analytic reasoning • Decisions guided by controlled, rule-based evaluations and conscious reflection • Low capacity, high effort processing

  9. A Response Process Model of SJT Performance • Dual-process accounts have been applied in a variety of perceptual, reasoning, and decision-making tasks (see Evans, 2008) • Extensions of dual-process model serve as foundation for much of judgment & decision-making literature (e.g., Gigerenzer et al., 1999; Kahneman & Frederick, 2002, 2005) • Central Tenets of Dual-Process Models • Because of limits on our cognitive capacity and information processing... • System 1 reasoning is primary determinant of judgment/decision-making in most situations • System 2 reasoning is typically engaged to evaluate the quality of decisions or in attempts to consciously contrast alternatives

  10. Dual-Process Accounts as a Response Process Model of SJT Performance • Two predictions based on the dual-process account relative to SJT performance: • Beliefs about the general effectiveness of various behaviors, dispositions, or approaches serve as a baseline heuristic for reasoning across many situations (cf. Motowidlo et al., 2006; Motowidlo & Beier, 2010) • "It is good to be thorough and conscientious in one's work." • Domain experience/knowledge leads to development of more conditional, refined, and nuanced heuristics (Hunt, 1994; Klein, 1998; Phillips et al., 2004) • "It is good to be thorough and conscientious in one's work, but you can generally skimp on Task X and still do just fine." • Thus, generalized heuristics/beliefs/temperaments become less predictive of SJT performance as experience increases • Prediction 1: As job experience increases, the predictive validity of domain-general heuristics on SJT performance will decrease

  11. Dual-Process Accounts as a Response Process Model of SJT Performance • Two predictions based on the dual-process account relative to SJT performance: • Common for respondents to identify/rate most and least effective/preferred SJT response options (McDaniel & Nguyen, 2001) • Identifying the most effective option should engage System 1 reasoning • Select the most reasonable option based on an intuitive heuristic; less effortful processing • Identifying the least/less effective option should engage System 2 reasoning • "Play out"/evaluate consequences of remaining options; more effortful processing • Thus, identifying the least/less effective option is more g-loaded than identifying the most/more effective option • Prediction 2: Cognitive ability will be more strongly related to assessment of less preferred SJT options than more preferred options

  12. Methods & Measurement • Concurrent validation study on predictors of current and future expected job performance of Army NCOs (n = 1,838) (Knapp et al., 2004) • Primarily interested in predicting leadership performance/potential • Sample:

  13. Methods & Measurement: Domain-general heuristic measures • Differential attractiveness: individuals who more strongly endorse a trait/quality perceive behaviors which reflect that trait/quality as more effective (Motowidlo et al., 2006; Motowidlo & Beier, 2010) • Temperament inventories • Assessment of Individual Motivation (AIM) • Multidimensional 38-item forced-choice measure (α ≈ .60, all scales) • Biographical Information Questionnaire (BIQ) • Multidimensional 156-item self-report biodata questionnaire (α ≈ .70, all scales) • General cognitive aptitude (ASVAB) • 40-item SJT on leadership/interpersonal skills (Knapp et al., 2002) • 5 response alternatives, SMEs rated all options • Respondents chose most & least effective options • Responses recoded to SME ratings

  14. Empirical Examination of Predictions from the Dual-Process Model • Prediction 1: As job experience increases, the predictive validity of domain-general heuristics on SJT performance will decrease • Regression summary: • Main effect of temperament • Main effect of experience • Significant interaction • Relationship stronger for less experienced respondents • Results consistent across all scales & SJT scores
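The moderated-regression logic behind this slide can be sketched on simulated data. All coefficients below are synthetic, chosen only to mimic the predicted pattern (a negative temperament × experience interaction), not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

temperament = rng.normal(0, 1, n)  # domain-general heuristic measure (standardized)
experience = rng.normal(0, 1, n)   # job experience (standardized)

# Simulated SJT scores with a negative interaction: temperament's slope
# shrinks as experience grows (coefficients are illustrative only).
sjt = (0.30 * temperament + 0.20 * experience
       - 0.15 * temperament * experience + rng.normal(0, 1, n))

# Ordinary least squares with an interaction term.
X = np.column_stack([np.ones(n), temperament, experience,
                     temperament * experience])
beta, *_ = np.linalg.lstsq(X, sjt, rcond=None)

# Simple slopes of temperament at -1 SD vs. +1 SD of experience:
slope_low_exp = beta[1] + beta[3] * (-1.0)   # less experienced
slope_high_exp = beta[1] + beta[3] * (+1.0)  # more experienced
print(slope_low_exp > slope_high_exp)  # relationship stronger when experience is low
```

Probing the interaction with simple slopes at ±1 SD of the moderator is the standard way to show that the temperament–SJT relationship weakens with experience.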

  15. Empirical Examination of Predictions from the Dual-Process Model • Prediction 1: As job experience increases, the predictive validity of domain-general heuristics on SJT performance will decrease • [Interaction plots by SJT subscore (Interpersonal Skill, Leadership, Physical Conditioning) for AIM and BIQ scales: Openness, Agreeableness, Tolerance for Ambiguity, Work Orientation, Social Maturity, Adjustment, Social Perceptiveness, Dependability]

  16. Empirical Examination of Predictions from the Dual-Process Model • Prediction 2: Cognitive ability will be more strongly related to assessment of less preferred SJT options than more preferred options
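Testing Prediction 2 amounts to comparing the correlation of cognitive ability with the two SJT subscores. A sketch on simulated data follows; the weights are invented solely to illustrate the predicted direction (System 2/g dominating "least effective" picks) and are not the study's results.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
g = rng.normal(0, 1, n)          # cognitive ability (e.g., ASVAB composite)
heuristic = rng.normal(0, 1, n)  # domain-general beliefs/temperament

# Under the dual-process account, "most effective" picks lean on System 1
# (heuristic reasoning) while "least effective" picks lean on System 2 (g).
most_score = 0.2 * g + 0.5 * heuristic + rng.normal(0, 1, n)
least_score = 0.5 * g + 0.2 * heuristic + rng.normal(0, 1, n)

r_most = np.corrcoef(g, most_score)[0, 1]
r_least = np.corrcoef(g, least_score)[0, 1]
print(r_least > r_most)  # Prediction 2: g relates more to "least" judgments
```

In an actual analysis the two correlations share the same sample, so a test for dependent correlations (e.g., Steiger's z) would be the appropriate significance check.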

  17. Conclusions & Implications • Most research on SJT measurement, development, and validity has been largely atheoretical (but see Motowidlo & Beier, 2010) • The dual-process account appears to be a reasonable response process model • Currently working on more explicit empirical examination (see also Foldes et al., 2010) • What does having a response process model buy us? • SJT construct validity: constructs vs. reasoning • Could label it "practical intelligence," but even that depends on... • Interpretation of SJT performance • Who is selected as the "experts" holds significant importance • Extent to which respondents reason/process information in a manner similar to "experts" • Response elicitation affects SJT interpretation • Most likely option/ratings = more heavily influenced by heuristic reasoning • Least likely option/ratings = more heavily influenced by cognitive reasoning

  18. Response Processes in SJT Performance: The Role of General and Specific Knowledge 27th Annual Conference of the Society for Industrial & Organizational Psychology April 27, 2012 Matthew T. Allen James A. Grand Kenneth Pearlman The views, opinions, and/or findings contained in this presentation are solely those of the authors and should not be construed as an official Department of the Army or Department of Defense position, policy, or decision, unless so designated by other documentation.
