
Unit 2: PSY 6450


Presentation Transcript


  1. Unit 2: PSY 6450: Traditional Performance Appraisal, Performance Measurement, Performance Assessment, Task Clarification

  2. Unit 2 • Description of performance measurement project • Unit exam over study objectives Wed., 9/30 • Muchinsky: traditional performance appraisal • Daniels chapter: behavioral measurement & performance matrix • Pampino et al. article - great example of use of performance assessment, multiple component intervention, and social validity assessment. Only problem is short-term nature of the study • Anderson et al. article – great example of task clarification and importance of feedback • Crowell et al. article – great example of task clarification, importance of feedback, and importance of praise (I will be away on Wed. 9/30; your exam will be here; on Monday 9/28, I will lecture on U3, motivation)

  3. Additional Readings in course pack, NFE • LaFleur et al. – excellent example of using Gilbert’s BEM as an assessment instrument and an exquisite article overall • Doll et al. – excellent example of using Daniels’ PIC/NIC as an assessment instrument • Squires et al. – excellent comparison of task clarification, verbal prompts, and graphic feedback • Major Take-Home Points – the World According to Dr. Dickinson • NFE, but hopefully ties the material together for you

  4. Performance Measurement Project • Proposal due Monday, Oct. 19 (35 pts) • Final project due Monday, Dec. 14 (35 pts) • See syllabus for policy on lateness • Two options • Develop and implement a measurement system for a minimum of 4 weeks • If measures already exist and you can obtain those archived measures for 4 weeks, implement an intervention (Final project due Monday, 12/7 if you need a grade before ME2)

  5. Project description • You may work in groups or by yourself • If you work in a group, each individual will receive the same grade - individual contributions can't be separated • Job • One or more workers: one is OK • Full-time or part-time • A friend, relative, significant other, another student, someone you supervise: the only restriction is that you may not measure your own performance • Business or educational setting (e.g., PSY 3600 lab instructors or teachers, PSY 1000 assistants, servers) • Advise against secretaries and programmers

  6. Some basic rules related to employment jeopardy • The emphasis will be positive - I will not accept a project in which the objective is to document poor performance • You must inform the worker or workers whose performance will be measured; if possible, give employees an option • If the supervisor or manager will see the data, you must ensure that he/she will respond to it positively, not negatively • If the supervisor responds negatively and you and I cannot resolve that, then I will have you terminate the project

  7. HSIRB issues • Research: a systematic investigation that contributes to generalizable knowledge • Projects in this class do not constitute research; however, they could (more later) • HSIRB class registration of projects • I have registered the class with the HSIRB and gotten initial approval for the projects • After the proposal, but before you start the project, I will send titles to the HSIRB for final approval • You cannot implement your project until I get this final approval!!

  8. Why aren’t the projects research? • The data will only be used in class or internally by the organization: thus they do not contribute to generalizable knowledge • If you use this as a pilot for your thesis or present the data publicly, then the project becomes research: why else would you do that if the data don’t contribute to generalizable knowledge? • Confidentiality issues arise when projects are publicly presented: you cannot violate a participant’s right to confidentiality • You must submit a full HSIRB protocol if you use this as a pilot study for your thesis or present it publicly!!! (and before you can do that, you must complete the on-line training; a sample of a full protocol is in the course pack)

  9. Why aren’t the projects research? (cont.) • In OBM, you are being invited by management to implement a management system • Organizations have the right to implement and evaluate management systems as part of their normal business practice • They can require all employees to participate in that system so you do not have to get consent from the employees (you do need a letter of consent from management) • You are not a researcher who is implementing the intervention, you are implementing or evaluating the system on behalf of management (consultant, agent) • With the caveat that data will only be used internally

  10. Ethical questions to ask yourself • Can the employees be harmed by the data you collect - are you placing them in employment jeopardy? • Group vs. individual data? (Ok to use individual data) • If individual data, can the individuals be identified? • Are you using the least restrictive and least intrusive procedure that is effective? (aversiveness) • Publicly posted individual data with identifiers vs. group data or individual data that are coded • Confidentiality of the employees • Should not discuss performance of individuals with identifying information with anyone outside the organization • Proposal and final report to me: do NOT include the name of the organization or employees

  11. Performance Measures • Objective • System may cover: • 2-3 measures that need to be improved • Measures that target only the critical performances • Measures that cover all of the major responsibilities • Must be repetitive - daily or weekly measures • Verifiable - if you ask the worker to self-record, you must develop a system to verify accuracy • Strongly recommend accomplishments rather than behaviors • Behaviors require you to observe the person while working - very labor- and time-intensive (biggest problem in the past) (as mentioned in the syllabus, a book by Abernathy provides measures)

  12. Measurement system and forms • Measurement system • What will be measured and why the measure is important • Who will measure? • When will measurement take place? • How frequently will measures be taken? • Forms go in the proposal! (students have had trouble with this in the past) • Include all measurement forms • Include a sample graph or data summary - how are you going to present the data? (see the sketch after this slide)
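A minimal sketch of what such a daily record and data summary might look like, assuming a simple daily count measure; the dates, counts, measure name, and matplotlib-based graph below are purely illustrative and not part of the original slides.

```python
# Minimal sketch of a daily measurement record and data summary (hypothetical).
from datetime import date
from statistics import mean

import matplotlib.pyplot as plt

# One entry per workday: (date, number of completed accomplishments)
daily_measures = [
    (date(2009, 10, 26), 14),
    (date(2009, 10, 27), 11),
    (date(2009, 10, 28), 16),
    (date(2009, 10, 29), 12),
    (date(2009, 10, 30), 15),
]

days = [d for d, _ in daily_measures]
counts = [c for _, c in daily_measures]
print(f"Weekly total: {sum(counts)}")
print(f"Daily average: {mean(counts):.1f}")

# A simple line graph of the daily data for the proposal's sample summary.
plt.plot(days, counts, marker="o")
plt.xlabel("Date")
plt.ylabel("Completed accomplishments")
plt.title("Daily performance measure (hypothetical data)")
plt.show()
```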

  13. Proposal format and grading • Both format and grading are discussed in the project description • I will deduct points for a sloppy proposal/measurement system and will require you to rewrite your proposal • I may ask you to answer questions or revise your proposal/measurement system without deducting points • A sample proposal and sample final project reports are included in the course pack in this unit • Strongly recommend you get started now - proposals are due in 4 weeks • Yes, I will take them early if you want to get started early. QUESTIONS??

  14. Traditional Performance Appraisal: Muchinsky • SO1: Characteristics of performance appraisals that affect the way the courts rule • Performance appraisals are subject to federal and state EEO laws, and more and more cases are being filed. • Protected classes under Title VII of the Civil Rights Act of 1964? African Americans, Hispanics, Asians, Native Americans, and females. (on click)

  15. SO1 Five characteristics that affect the way courts rule • Should be objective (NFE, no brainer) • Should be job related, preferably based on a written job analysis (NFE, no brainer) • Should be based on behaviors, not traits • Research shows no difference between trait-based forms and behavior-based forms, but courts think behavior-based forms are better. (the problem is still subjectivity) • Performance should be under the control of the employee • Should relate to specific functions of the job, not global assessments • Again, research suggests this really doesn’t matter, but courts think it matters (#2 could be a problem for a behavioral measurement system if it is used for personnel decisions without a job analysis)

  16. Common types of appraisal methods (NFE) • Graphic rating scales, most common • Employee comparison methods • Evaluate employees by comparing them against each other rather than against a standard • Behavioral checklists and forms

  17. SO3A: Employee comparisons vs. graphic rating scales • Advantage of employee comparison over graphic rating scales? • Forces variance or variability into the ratings • With graphic scales, it is not uncommon for a supervisor to rate employees pretty much the same: everyone is a 5 or a 6 on a 7-point scale, making it difficult to make personnel decisions. (rank order, paired comparison, and forced distribution; both points need to be included)

  18. SO3B: What’s the problem with employee comparisons? • May create a false impression that differences between individuals are larger than they actually are • Six employees, rank ordered • Jan • Shakira • Paul • Mike • Susan • John • The actual performance difference between Jan and Mike, or even Susan, may be very small, yet look big. Can affect salary increases, promotions, etc. (see the sketch after this slide) (limited $ for merit increases and promotions across depts - is #3 in one dept. comparable in performance to #3 in another? Also, competitive systems; only one person can be #1; sabotage, or at least decreased willingness to cooperate/help)
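A minimal numeric sketch of this point: only the six names come from the slide, and the scores below are invented to show how a rank order can make a tiny difference on the underlying measure look large.

```python
# Hypothetical illustration: ranks can exaggerate small performance differences.
scores = {
    "Jan": 97.0,
    "Shakira": 96.5,
    "Paul": 96.2,
    "Mike": 96.0,
    "Susan": 95.8,
    "John": 88.0,
}

# Sort from best to worst and assign ranks 1 through 6.
ranked = sorted(scores.items(), key=lambda item: item[1], reverse=True)
for rank, (name, score) in enumerate(ranked, start=1):
    print(f"#{rank}: {name} ({score})")

# Jan (#1) beats Mike (#4) by only 1.0 point on the underlying measure,
# yet the rank order alone suggests a much larger gap - which matters if
# merit dollars or promotions are allocated by rank.
```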

  19. Dissatisfaction with performance appraisal • Companies tend to switch frequently from one type of form to another • Recall performance appraisal was the third-ranked topic addressed in JAP • IBM abandoned employee ranking in favor of graphic scales, then returned to ranking a few years ago • Made the front page of the Wall Street Journal: ranking was THE way to go • Why do you think IBM may have made the switch? (nothing to do with reliability or validity)

  20. SO4: Behavioral checklists • Behavioral checklists and scales are assumed to be more accurate because they are less vague and target “objective” behaviors that are readily observable, but they are not more accurate • Why? • Still are based on subjective judgments • Completed once a year • That is also why we should not consider them adequate from a behavioral perspective • Objective measures of behavior/performance • Over time, as behavior/performance occurs on the job (Aubrey Daniels, problems, do it half as often; only 5% of the variance between individuals when rated is due to the type of performance appraisal - that means the performance appraisal format really does not matter; one form is not better than another, people just think it is)

  21. SO6: Main problem with performance appraisals (NFE) • Inflation of ratings • Main assumptions about the cause of errors • Design of the form (hence so much time and money is spent redesigning forms to “design out” rating errors) • Lack of knowledge on the part of supervisors/managers - training • Rarely are the consequences for accurate ratings examined or considered (which leads to SO7) (Important lead-in to SO7: the Director of Personnel at WMU ranked this among the worst problems she had - trying to fire employees who had good ratings)

  22. SO7: Consequences for inflated ratings (learn 3, name of principle for 4) • No rewards for accuracy; no or few sanctions for inaccurate ratings (how does the organization know if the ratings are accurate or inaccurate - a problem) • Most common reason - high ratings are necessary to get salary increases, promotions and other rewards for employees • Ratings of subordinates reflect their competence as a supervisor/manager

  23. 4th consequence • Negative evaluations result in defensive and hostile reactions from employees • Principle of behavior? Straightforward avoidance:
  In the past: Bad rating ----> Sp (aversive social interaction); CS ----> CR (stress)
  Currently: Good rating ----> Sr- (no aversive social interaction); no CS to elicit the CR of stress
  (on click; learn this diagram, FE; Sp = conditioned punisher, S = stimulus, p = punisher; no doubt also Sr+, with the good rating as a CS for feeling good)

  24. SOs 8&9: Peer assessments • Peer assessments are very accurate - high reliability and high validity • Reliability - different people rate the employee the same way; interobserver agreement (see the sketch after this slide) • Validity - related to job performance • Main problem? (not a bias problem, one perceived issue) • Acceptance by employees - we don’t like to do this • WMU faculty merit system: Union vs. management • Not enough money to go around; many just split the $$ • How would you like to be given the responsibility for evaluating your peers - the other students advised by your faculty advisor - and have rewards (e.g., assistantships, grades in classes, opportunities for practica and projects) based on that? (we may not know how well we are performing - self-assessments next, but we know how well our peers are performing!)
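One common way such agreement is quantified in behavioral measurement is item-by-item percent agreement; the sketch below uses made-up ratings from two hypothetical raters and is not part of the original slides.

```python
# Hypothetical sketch: item-by-item interobserver agreement (IOA).
# Two raters independently score the same employee on the same ten items;
# percent agreement = agreements / total items * 100.
rater_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]  # 1 = item scored as present/met
rater_b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
ioa = 100.0 * agreements / len(rater_a)
print(f"Interobserver agreement: {ioa:.0f}%")  # 80% for these made-up data
```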

  25. SO10A, Self-assessment • SO10A: Main problem? • Inflation: we think we are better than we are • Engineer example: 92 engineers rated their performance in comparison to other engineers. On average, each engineer rated his performance as better than 75% of the other engineers. Quite a feat to have 100% of your employees in the top 25%!

  26. SO10B Self-assessments, cont. • SO10B: Thornton - little agreement between supervisory and self-assessments. What are the very important implications of this from a behavioral perspective? In behavior analysis we always strive to make rewards contingent upon performance. The better the performance, the more rewards. If, however, employees believe they are performing better than the supervisor believes, then the supervisor will not give them the rewards that they feel they should be getting. That is, employees will not believe that rewards are truly contingent upon their performance. Likely to hurt performance and cause dissatisfaction.

  27. SO11: 360-degree feedback: hot, hot, hot • A manager/supervisor • Rates himself • Is rated by his manager/supervisor • Is rated by his subordinates • SO11A: What percentage of managers saw themselves as others saw them? Only 10%! • SO11B: Overrating was most noteworthy on what scale? People (NFE, but why is this good news for us? Muchinsky has an excellent discussion of the problems with this type of system when it is used for administrative purposes, which is increasing)

  28. SO12: Credibility & Power - Ilgen, Fisher and Taylor • Credibility: What influences credibility the most from a behavioral perspective? (assuming the manager has expertise) • Measurement system and its objectivity • Judi Komaki - work sampling separates highly effective managers from ineffective ones (actually objectively sampling and looking at work) • If the supervisor is not evaluating performance accurately, the feedback becomes meaningless • Daniels: “In God we trust, all else bring data!!” (Classic article - you should read it. This is important: the extent to which people will change their performance when given feedback from a supervisor depends upon credibility and power; my experience, PA, nice-guy manager)

  29. SO12B: Power and feedback • Power: The extent to which a supervisor has control over valued rewards • Why does power influence the extent to which an employee will be influenced by a supervisor’s feedback? (this is the question) Feedback is not a basic principle of behavior. It is just a stimulus. It will be a neutral stimulus unless it is paired with valued consequences. When it affects behavior, it most likely does so as an SD or Sr (or an analog). But in both cases, to become an SD or Sr, it must be paired with valued consequences. Thus, if feedback is not paired with valued consequences (the supervisor has no control over them), it will not affect behavior. (excellent point from a behavioral perspective; students have trouble with this; a nice behavioral analysis)

  30. For your entertainment only: Actual statements • His men would follow him anywhere, but out of morbid curiosity • I would not allow this employee to breed • Works well when under constant supervision and cornered like a rat in a trap • He would be out of his depth in a parking lot puddle • This young lady has delusions of adequacy • This employee is depriving a village somewhere of an idiot

  31. End Part 1 • Any questions so far?
