This analysis explores the implications of job analysis methods for recruitment, selection, and placement in the armed services before and after July 3, 1973. It examines recruitment's dual purpose of attracting qualified candidates and defining job requirements in terms of KSAOs (knowledge, skills, abilities, and other characteristics). The selection process is evaluated for its effectiveness in choosing the best candidates, with attention to validity types and potential adverse impact. Training considerations focus on tailoring training to the job and on the training cycle, ensuring participants are equipped to meet performance standards.
Job Analysis 651, Chapter 8: Staffing and Training
Staffing: matching people to jobs • What are the implications of JA methods for the armed services before/after July 3, 1973 for: • Recruitment? • Selection? • Placement?
Recruitment • Purposes: inform qualified applicants about the job • Entice the qualified to apply (a marketing function) • Inform potential applicants of the requirements • RJP (realistic job preview) • Perfect recruitment: only one person applies for the job (the one who fits)
Recruitment • Job specs • KSAOs • Duties • Context • Minimum qualifications (education and experience) • Benefits
Selection • To choose the best qualified • Test validation: for getting the cream of the crop, or for typical (average) performers? • What are some jobs where the difference matters? • What if the test has adverse impact? • What are the implications for promotion?
Selection • Validity types: • Content • Criterion-related • Face • Synthetic • What is it? • How is it related to VG (validity generalization)? • Signs vs. samples • When can samples be used? • Do you need KSAOs? • Simulations (can you build a model steam engine?)
Judging KSAOs for Validation • Reliability sets the upper limit on validity (see the note after this list) • Are more judges needed? • Validity of KSAO judgments • Table 8.2 (Trattner, Fine, & Kubis, '55), p. 235 • Better at predicting test scores • for mental and perceptual aptitudes, or for physical aptitudes? • using a job description, or observation? • U.S. Air Force • Could psychologists and instructors predict the relevant tests for training and aptitudes?
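A brief note on the first bullet above (a standard psychometric result, stated here for context rather than taken from the slide): unreliability attenuates validity, so an observed validity coefficient is bounded by $r_{xy} \le \sqrt{r_{xx}\, r_{yy}}$, where $r_{xx}$ is the reliability of the predictor (here, the KSAO judgments) and $r_{yy}$ is the reliability of the criterion. If judges rate KSAOs inconsistently, even a genuinely relevant KSAO will show depressed validity, which is one reason to ask whether more judges are needed.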
McCormick ('79) • Used the PAQ to predict GATB scores • Predicting aptitude test scores • rs of .61 to .83 for PAQ and mean scores (GVNSP & Clerical) • Lower for GATB validity coefficients: rs of .03 to .39 • Conclusion (for judging KSAOs): • Analysts can provide reliable and valid estimates of job requirements • They are better at predicting mean test scores than the validity of tests
Selection: Key Considerations • Link attributes to tasks • C-JAM (tasks can be used for criterion development) • Critical incidents (Flanagan) • Job element method (Primoff) • Generic traits • PAQ • TTAS (Threshold Traits Analysis System) • ABS (Ability Requirement Scales)
Training • Tailor the person to fit the job • For a specific job • Cost of training vs. not training
Training Cycle • Needs assessment (3 entities): • Organization (inter-related jobs) • Tasks & KSAOs (content) • People to be trained • E.g., for competencies? How would you do this? • Gap analysis: where is performance inadequate? • Person analysis: what do they bring / what do they need?
Training Cycle • T & Development design: • Who • What • When • Where • How much is available • Costs
Training Cycle • Evaluation • Objectives • Context or stimulus situation • Behavioral requirements • Minimally acceptable response • Models • Individual difference (compare to a performance standard) • Experimental (control groups; see the sketch below) • Content (ensure training is related to KSAOs)
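One way to read the experimental model (my sketch, assuming the usual pretest-posttest control-group design; the slide only names the model): the training effect is estimated as $(\text{post}_T - \text{pre}_T) - (\text{post}_C - \text{pre}_C)$, i.e., the gain of trained participants over and above the gain controls show from time or practice alone.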
Training Cycle • Evaluation • Training goals (Kirkpatrick): • Reaction (do they like it and think it's worthwhile?) • Learning (proficiency after training) • Behavior (performance on the job, i.e., transfer) • Results (organizational effectiveness) • ROI (added; a formula is sketched below)
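For the added ROI level, a commonly used formula (an assumption on my part; the slide does not define it) is $\text{ROI}(\%) = \frac{\text{training benefits} - \text{training costs}}{\text{training costs}} \times 100$, with benefits converted to the same monetary units as costs so that results can be weighed directly against what the program cost.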
Key Considerations • Content • Level of detail for tasks • Rating scales: • Importance / criticality • Consequence of error • Difficulty to learn • Frequency of occurrence • Location of training (OJT / vestibule)
Selection vs. Training • Do the needed KSAOs exist in the applicant pool? • Quality of the applicant pool? • Entry level or experienced? • Geographical boundaries