
Job Analysis 651



  1. Job Analysis 651 Chap 8 Staffing and Training

  2. Staffing: matching people to jobs • What are the implications for JA methods in the armed services before/after July 3, 1973 for: • Recruitment? • Selection? • Placement?

  3. Recruitment • Purposes: inform qualified applicants of the job • Entice the qualified to apply (marketing function) • Inform potential applicants of requirements • RJP (realistic job preview) • Perfect recruitment: only one person applies for the job

  4. Recruitment • Job specs • KSAOs (knowledge, skills, abilities, other characteristics) • Duties • Context • Minimum qualifications (education and experience) • Benefits

  5. Selection • To choose the best qualified • Test validation • For getting the cream of the crop, or • Typical (average) performers • What are some jobs where it makes a difference? • What if the test has adverse impact? (see the four-fifths rule sketch below) • What are the implications for promotion?
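
On the adverse-impact question: a common first check is the four-fifths (80%) rule from the Uniform Guidelines, which compares each group's selection rate to the highest group's rate. A minimal Python sketch, with made-up applicant counts:

    # Four-fifths (80%) rule check for adverse impact (counts are hypothetical).
    def selection_rate(hired: int, applicants: int) -> float:
        return hired / applicants

    groups = {
        "Group A": selection_rate(hired=40, applicants=100),
        "Group B": selection_rate(hired=20, applicants=100),
    }

    highest = max(groups.values())
    for name, rate in groups.items():
        ratio = rate / highest
        flag = "possible adverse impact" if ratio < 0.80 else "ok"
        print(f"{name}: rate={rate:.2f}, ratio to highest={ratio:.2f} -> {flag}")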

  6. Selection • Validity types: • Content • Criterion-related • Face • Synthetic • What is it? • How is it related to VG (validity generalization)? • Signs vs. samples • When can samples be used? • Do you need KSAOs? • Simulations (can you build a model steam engine?)

  7. Judging KSAOs for Validation • Reliability sets the ceiling on validity (see the attenuation sketch below) • Are more judges needed? • Validity of KSAO judgments • Table 8.2 (Trattner, Fine, & Kubis, '55), p. 235 • Were judges better at predicting test scores • for mental and perceptual or for physical aptitudes? • using a job description or direct observation? • U.S. Air Force • Could psychologists and instructors predict the relevant tests for training and aptitudes?
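
On "reliability sets the ceiling on validity": the standard attenuation formula says an observed validity coefficient cannot exceed the square root of the product of the predictor and criterion reliabilities. A minimal sketch with illustrative reliabilities (not values from the chapter):

    import math

    # Ceiling on observed validity given predictor (r_xx) and criterion (r_yy) reliability.
    def max_observed_validity(r_xx: float, r_yy: float) -> float:
        return math.sqrt(r_xx * r_yy)

    # Correction for attenuation: estimated "true" validity from an observed r_xy.
    def corrected_validity(r_xy: float, r_xx: float, r_yy: float) -> float:
        return r_xy / math.sqrt(r_xx * r_yy)

    print(max_observed_validity(0.80, 0.60))     # ceiling of about .69
    print(corrected_validity(0.30, 0.80, 0.60))  # observed .30 implies roughly .43 with perfectly reliable measures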

  8. McCormick ('79) • PAQ used to predict GATB • Predicting aptitude test scores • r's of .61 to .83 between PAQ estimates and mean GATB scores (G, V, N, S, P, and Clerical) • Lower for GATB validity coefficients • r's of .03 to .39 • Conclusion (for judging KSAOs): • Analysts can provide reliable and valid estimates of job requirements • Better at predicting mean test scores than the validity of tests (see the correlation sketch below)
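
These r's are ordinary Pearson correlations computed across jobs between analyst-based (PAQ) estimates and observed values. A minimal sketch with hypothetical numbers (not McCormick's data):

    # Pearson correlation between PAQ-based estimates and observed mean aptitude
    # scores across jobs; all values below are hypothetical.
    def pearson_r(x: list[float], y: list[float]) -> float:
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    paq_estimates  = [95, 102, 110, 88, 120]   # predicted mean GATB-G score per job
    observed_means = [98, 100, 112, 90, 118]   # observed mean GATB-G score per job
    print(round(pearson_r(paq_estimates, observed_means), 2))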

  9. Selection: Key Considerations • Link attributes to tasks: • C-JAM • (tasks can be used for criterion development) • CI, critical incidents (Flanagan) • Job element method (Primoff) • Generic traits • PAQ • TTAS (Threshold Traits Analysis System) • ABS (Ability Requirement Scales)

  10. Training • Tailor the person to fit the job • For a specific job • Cost of training vs. not training

  11. Training Cycle • Needs assessment (3 entities) • Organization (inter-related jobs) • Tasks & KSAOs (content) • People to be trained • E.g., for competencies? How would you do this? • Gap analysis: where is performance inadequate? (see the sketch below) • Person analysis: what do they bring / what do they need?
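
Gap analysis can be treated as a simple comparison of required versus current proficiency on each task or KSAO. A minimal sketch with hypothetical tasks and ratings on a 1-5 proficiency scale:

    # Needs-assessment gap analysis: required minus current proficiency per task
    # (tasks and ratings are hypothetical).
    required = {"operate lathe": 4, "read blueprints": 5, "log maintenance": 3}
    current  = {"operate lathe": 4, "read blueprints": 3, "log maintenance": 1}

    gaps = {task: required[task] - current[task] for task in required}
    targets = [t for t, gap in sorted(gaps.items(), key=lambda kv: -kv[1]) if gap > 0]
    print(targets)   # tasks with the largest gaps first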

  12. Training Cycle • Training & development design: • Who • What • When • Where • How much is available • Costs

  13. Training Cycle • Evaluation • Objectives • Context or stimulus situation • Behavioral requirements • Minimal acceptable response • Models • Individual differences (compare to a performance standard) • Experimental (control groups; see the sketch below) • Content (ensure training is related to the KSAOs)
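
The experimental model amounts to comparing post-training performance of a trained group against an untrained control group. A minimal sketch with hypothetical scores, reporting the mean difference and a simple standardized effect size:

    from statistics import mean, stdev

    # Experimental evaluation model: trained vs. control post-test scores
    # (all scores are hypothetical).
    trained = [78, 85, 82, 90, 76, 88]
    control = [70, 74, 68, 80, 72, 75]

    diff = mean(trained) - mean(control)
    pooled_sd = ((stdev(trained) ** 2 + stdev(control) ** 2) / 2) ** 0.5  # equal-n pooling
    print(f"mean difference = {diff:.1f}, d = {diff / pooled_sd:.2f}")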

  14. Training Cycle • Evaluation • Training goals (Kirkpatrick) • Reaction (do they like it, think it's worthwhile?) • Learning (proficiency after training) • Behavior (performance on the job: transfer) • Results (organizational effectiveness) • ROI (added level; see the sketch below)
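
The added ROI level is simple arithmetic on monetized benefits versus training costs. A minimal sketch with purely illustrative dollar figures:

    # Training ROI: (monetized benefits - training costs) / training costs, as a percent.
    # Figures are purely illustrative.
    benefits = 150_000.0   # e.g., estimated productivity gains attributable to training
    costs    = 100_000.0   # design, delivery, trainee time, materials

    roi_pct = (benefits - costs) / costs * 100
    print(f"ROI = {roi_pct:.0f}%")   # 50% in this example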

  15. Key Considerations • Content • Level of detail for tasks • Rating scales • Importance / criticality • Consequence of error • Difficulty to learn • Frequency of occurrence (one way to combine these ratings is sketched below) • Location of training (OJT / vestibule)
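
One illustrative way (not necessarily the chapter's formula) to use these rating scales is to combine them into a training-emphasis index per task, so that important, error-critical, hard-to-learn tasks rise to the top of the training agenda:

    # Illustrative training-emphasis index; the weighting, ratings, and tasks are
    # hypothetical, not a standard formula from the chapter.
    tasks = {
        # task: (importance, consequence_of_error, difficulty_to_learn, frequency), each 1-5
        "calibrate sensor":  (5, 5, 4, 2),
        "file daily report": (3, 2, 1, 5),
    }

    def emphasis(ratings: tuple[int, int, int, int]) -> float:
        importance, error, difficulty, frequency = ratings
        return importance + error + difficulty + frequency   # simple additive index

    for task, ratings in sorted(tasks.items(), key=lambda kv: -emphasis(kv[1])):
        print(task, emphasis(ratings))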

  16. Selection vs. Training • Do the needed KSAOs exist in the applicant pool? • Quality of the applicant pool? • Entry level or experienced? • Geographical boundaries
