
Response Processes in SJT Performance: The Role of General and Specific Knowledge

27th Annual Conference of the Society for Industrial & Organizational Psychology

April 27, 2012

Matthew T. Allen

James A. Grand

Kenneth Pearlman

The views, opinions, and/or findings contained in this presentation are solely those of the authors and should not be construed as an official Department of the Army or Department of Defense position, policy, or decision, unless so designated by other documentation.

Presentation Overview

Turning a critical eye toward SJT construct validity and its assumptions:

  • What do SJTs measure?
    • An alternative take on an old question
    • A response process model for SJT performance
  • Predictions of the response process model
    • Empirical support?
  • Implications for interpretation & validity of SJTs

Acknowledgement
  • The authors would like to thank the U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) for allowing the use of their data for this research
  • More information about the study can be found in:

Knapp, D. J., McCloy, R. A., & Heffner, T. S. (Eds.). (2004). Validation of measures designed to maximize 21st-century Army NCO performance (TR 1145). Arlington, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.

Validity Evidence for SJTs
  • Established evidence of criterion-related validity between SJTs and job performance (cf. Chan & Schmitt, 2002; Clevenger et al., 2001; McDaniel et al., 2001)
    • Estimates in the low .20s (corrected validities near the mid-.30s) (Chan & Schmitt, 2005)
  • Agreement on construct validity is less certain...

First-order Constructs
  • Multiple, distinguishable dimensions
  • Specific a priori subscales
  • Examples:
    • Oswald et al. (2004): career orientation, perseverance, multicultural appreciation
    • Lievens et al. (2005): interpersonal skills
    • Mumford et al. (2008): team role knowledge

Second-order Constructs
  • Singular, high-level dimension
  • Broad focal target
  • Example: Chan & Schmitt (2005): contextual knowledge

“Practical Intelligence”
  • Tacit knowledge or “common sense”
  • Everyday reasoning
  • Example: Sternberg et al. (2002): “practical know-how”
A Test in Name Only
  • Virtually all perspectives approach and treat SJT measurement in a manner consistent with Classical Test Theory
  • SJTs are NOT tests! (at least in the traditional sense of the word)
    • Low-fidelity simulations (Motowidlo et al., 1990)
    • Measurement methods capable of capturing a variety of constructs (Chan & Schmitt, 2005; McDaniel & Nguyen, 2001)

X = T + E

Observed Score = True Score + Error
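The CTT decomposition above can be illustrated with a minimal simulation (all numbers hypothetical): given a fixed true score and zero-mean random error, the average of many observed scores converges on the true score.

```python
import random

random.seed(0)  # reproducible illustration

# Hypothetical true score T and zero-mean measurement error E.
T = 25.0
observed = [T + random.gauss(0, 2.0) for _ in range(10_000)]  # X = T + E

# Averaging over many parallel measurements, error cancels out and the
# mean observed score approaches the true score.
mean_X = sum(observed) / len(observed)
print(round(mean_X, 1))
```

This static true-score view is precisely what the response-process account of SJTs calls into question.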

So What Do SJTs Measure?

“So far, there does not exist any theory about how people answer SJTs or about what makes some SJT items more or less difficult than others.”

(Lievens & Sackett, 2007, p. 1044)

“SJT performance clearly involves cognitive processes. [...] Addressing basic questions about these underlying cognitive processes and eventually understanding them could provide the key to explicating constructs measured by SJTs”

(Chan & Schmitt, 2005)

So What Do SJTs Measure?
  • Rather than measuring a static construct or “true score,” SJTs capture the sophistication of a respondent’s reasoning process
  • By their nature, SJTs capture the similarity between a respondent’s reasoning and the reasoning implied by the keyed responses


A Response Process Model of SJT Performance
  • Generic dual-process accounts of human reasoning, judgment, and decision-making (Evans, 2008)
    • System 1
      • Implicit, intuitive, and automatic reasoning
      • Decisions guided by general heuristics, which are informed by domain experiences
      • High-capacity, low-effort processing
    • System 2
      • Systematic, rational, and analytic reasoning
      • Decisions guided by controlled, rule-based evaluations and conscious reflection
      • Low-capacity, high-effort processing
A Response Process Model of SJT Performance
  • Dual-process accounts have been applied in a variety of perceptual, reasoning, and decision-making tasks (see Evans, 2008)
    • Extensions of the dual-process model serve as the foundation for much of the judgment & decision-making literature (e.g., Gigerenzer et al., 1999; Kahneman & Frederick, 2002, 2005)
  • Central tenets of dual-process models: because of limits on our cognitive capacity and information processing...
    • System 1 reasoning is the primary determinant of judgment/decision-making in most situations
    • System 2 reasoning is typically engaged to evaluate the quality of decisions or in attempts to consciously contrast alternatives
Dual-Process Accounts as a Response Process Model of SJT Performance
  • Two predictions follow from the dual-process account of SJT performance:
    • Beliefs about the general effectiveness of various behaviors, dispositions, or approaches serve as a baseline heuristic for reasoning across many situations (cf. Motowidlo et al., 2006; Motowidlo & Beier, 2010)
      • “It is good to be thorough and conscientious in one’s work.”
    • Domain experience/knowledge leads to the development of more conditional, refined, and nuanced heuristics (Hunt, 1994; Klein, 1998; Phillips et al., 2004)
      • “It is good to be thorough and conscientious in one’s work, but you can generally skimp on Task X and still do just fine.”
    • Thus, generalized heuristics/beliefs/temperaments become less predictive of SJT performance as experience increases

Prediction 1: As job experience increases, the predictive validity of domain-general heuristics for SJT performance will decrease.

Dual-Process Accounts as a Response Process Model of SJT Performance
  • Two predictions follow from the dual-process account of SJT performance:
    • Respondents are commonly asked to identify/rate the most and least effective/preferred SJT response options (McDaniel & Nguyen, 2001)
    • Identifying the most effective option should engage System 1 reasoning
      • Select the most reasonable option based on an intuitive heuristic; less effortful processing
    • Identifying the least/less effective option should engage System 2 reasoning
      • “Play out”/evaluate the consequences of the remaining options; more effortful processing
    • Thus, identifying the least/less effective option should be more g-loaded than identifying the most/more effective option

Prediction 2: Cognitive ability will be more strongly related to assessment of less preferred SJT options than more preferred options.

Methods & Measurement
  • Concurrent validation study of predictors of current and expected future job performance of Army NCOs (n = 1,838; Knapp et al., 2004)
    • Primarily interested in predicting leadership performance/potential

Domain-general heuristic measures

  • Differential attractiveness: individuals who more strongly endorse a trait/quality perceive behaviors that reflect that trait/quality as more effective (Motowidlo et al., 2006; Motowidlo & Beier, 2010)
  • Temperament inventories
    • Assessment of Individual Motivation (AIM): multidimensional 38-item forced-choice measure (α ≈ .60 across all scales)
    • Biographical Information Questionnaire (BIQ): multidimensional 156-item self-report biodata questionnaire (α ≈ .70 across all scales)
  • General cognitive aptitude (ASVAB)
  • 40-item SJT on leadership/interpersonal skills (Knapp et al., 2002)
    • 5 response alternatives; SMEs rated all options
    • Respondents chose the most & least effective options
      • Responses recoded to SME ratings
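The recoding step described above might be sketched as follows. The item content, ratings, and the specific credit rule are invented for illustration; the actual scheme used in Knapp et al. (2002) may differ.

```python
# Hypothetical SJT item: five options, each with a mean SME
# effectiveness rating on a 1-7 scale (invented numbers).
sme_ratings = {"A": 6.2, "B": 3.1, "C": 4.8, "D": 1.9, "E": 5.5}

def score_item(most: str, least: str) -> float:
    """Recode a respondent's most/least picks to SME-based credit."""
    most_credit = sme_ratings[most]        # higher SME rating = more credit
    least_credit = 8 - sme_ratings[least]  # reverse-coded on the 1-7 scale
    return most_credit + least_credit

# Picking the best-rated option as "most" and the worst-rated option
# as "least" earns the maximum possible score for the item.
print(score_item("A", "D"))
```

Under this kind of scheme, scores index agreement with SME judgments rather than a simple right/wrong key.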

Empirical Examination of Predictions from Dual Process Model

Prediction 1: As job experience increases, the predictive validity of domain-general heuristics for SJT performance will decrease.

  • Regression summary
    • Main effect of temperament
    • Main effect of experience
    • Significant interaction: relationship stronger for less experienced respondents
  • Results consistent across all scales & SJT scores
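The moderation test behind Prediction 1 (a temperament × experience interaction predicting SJT scores) can be sketched on synthetic data; variable names, sample size, and effect sizes here are invented, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Synthetic standardized predictors: a domain-general temperament
# scale and job experience.
temperament = rng.normal(size=n)
experience = rng.normal(size=n)

# Simulated SJT score with a negative temperament x experience
# interaction: temperament matters less as experience increases.
sjt = (0.30 * temperament + 0.20 * experience
       - 0.15 * temperament * experience + rng.normal(size=n))

# OLS with an interaction term: columns are intercept, temperament,
# experience, and their product.
X = np.column_stack([np.ones(n), temperament, experience,
                     temperament * experience])
beta, *_ = np.linalg.lstsq(X, sjt, rcond=None)
print(np.round(beta, 2))
```

A negative coefficient on the product term is the "significant interaction" pattern the slide reports: the temperament slope shrinks at higher experience.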

Empirical Examination of Predictions from Dual Process Model

Prediction 1: As job experience increases, the predictive validity of domain-general heuristics for SJT performance will decrease.

[Interaction plots not reproduced in transcript. Panel titles: Interpersonal Skill, Leadership, Leadership, Physical Conditioning. AIM scales: Openness, Agreeableness, Tolerance for Ambiguity, Work Orientation, Social Maturity, Adjustment. BIQ scales: Social Perceptiveness, Dependability.]

Empirical Examination of Predictions from Dual Process Model

Prediction 2: Cognitive ability will be more strongly related to assessment of less preferred SJT options than more preferred options.
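Prediction 2 amounts to comparing two correlations with cognitive ability: one for "least effective" judgments, one for "most effective" judgments. A minimal check on synthetic data (all loadings and the sample size are invented):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1800

# Synthetic data in which judgments of the LEAST effective option load
# more heavily on cognitive ability (g) than judgments of the MOST
# effective option, per the dual-process prediction.
g = rng.normal(size=n)
most_score = 0.20 * g + rng.normal(size=n)    # System 1-driven judgment
least_score = 0.45 * g + rng.normal(size=n)   # System 2-driven judgment

r_most = np.corrcoef(g, most_score)[0, 1]
r_least = np.corrcoef(g, least_score)[0, 1]
print(round(r_most, 2), round(r_least, 2))
```

In practice the two correlations share a sample, so a formal test would use a method for dependent correlations (e.g., Steiger's z) rather than eyeballing the difference.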

Conclusions & Implications
  • Research on SJT measurement, development, and validity has been largely atheoretical (but see Motowidlo & Beier, 2010)
    • The dual-process account appears to be a reasonable response process model
    • More explicit empirical examination is currently underway (see also Foldes et al., 2010)
  • What does having a response process model buy us?
    • SJT construct validity: constructs vs. reasoning
      • Could label it “practical intelligence,” but even that depends on...
    • Interpretation of SJT performance
      • Who is selected as the “experts” matters greatly
      • Performance reflects the extent to which respondents reason/process information in a manner similar to the “experts”
    • Response elicitation affects SJT interpretation
      • Most-likely option/ratings: more heavily influenced by heuristic (System 1) reasoning
      • Least-likely option/ratings: more heavily influenced by cognitive (System 2) reasoning