slide1

PSYCHOMETRIC TESTING WITHIN THE SANDF: PRESENTATION TO THE PORTFOLIO COMMITTEE ON DEFENCE, 19 JUNE 2007

scope
SCOPE
  • Statutory Control of the use of Psychological Assessment Measures within the SANDF.
  • The Psychometric Tests used within the SANDF.
  • The Role and Function of Assessment Centres within the SANDF.
  • The Application of Specialist Psychological Measures in the Recruitment and Selection of Pilots within the SANDF.
scope 1
SCOPE (1)
  • Terminology
  • Characteristics of Assessment Measures
  • The Need for Control of Assessment Measures within the SANDF
  • Control of Psychological Assessment Measures within the SANDF
  • Psychological Assessment within the Democratic South Africa
  • Fair and Ethical Practices in the Use of Psychological Measures within the SANDF
scope 2
SCOPE (2)
  • Factors Affecting Psychological Assessment Results
  • Professional Practices that Assessment Practitioners within the SANDF should follow
  • Basic Statistical Concepts: Reliability, Validity and Norms
  • Challenges faced by Psychological Assessment Practitioners within the SANDF
terminology 1
TERMINOLOGY (1)
  • Important Terms
    • Confusing and overlapping terms are used in the field of psychological assessment.
    • Understand the more important terms and how they are interlinked.
    • Tools are available to make it possible for us to assess (measure) human behaviour.
    • Various names are used to refer to these tools such as tests, measures, assessment measures, instruments, scales, procedures, and techniques.
terminology 2
TERMINOLOGY (2)
    • To ensure that psychological measurement is valid and reliable, a body of theory and research regarding the scientific measurement principles that are applied to the measurement of psychological characteristics has evolved over time.
  • Psychometrics
    • Refers to the systematic and scientific way in which psychological measures are developed and the technical measurement standards (e.g. validity and reliability) required of measures.
terminology 3
TERMINOLOGY (3)
  • Psychological Assessment
    • A process-orientated activity aimed at gathering a wide array of information by using assessment measures (tests) and information from many other sources (e.g. interviews, a person’s history, collateral sources).
    • Evaluate and integrate all information to reach a conclusion or make a decision.
  • Testing
    • The use of tests, measures, etc. which involves the measurement of behaviour, is one of the key elements of the much broader evaluative process known as psychological assessment.
terminology 4
TERMINOLOGY (4)
  • Assessment Measure
    • In the SANDF preference is given to the term assessment measure as it has a broader connotation than the term test, which mainly refers to an objective, standardised measure that is used to gather data for a specific purpose (e.g. to determine what a person’s intellectual capacity is).
characteristics of assessment measures 1
CHARACTERISTICS OF ASSESSMENT MEASURES (1)
  • Different Procedures
    • Assessment measures include many different procedures that can be used in psychological assessment and can be administered to individuals, groups and organisations.
  • Domains of Functioning
    • Specific domains of functioning (e.g. intellectual ability, personality, organisational climate) are sampled by assessment measures.
  • Standardised Conditions
    • Assessment measures are administered under carefully controlled (standardised) conditions.
characteristics of assessment measures 2
CHARACTERISTICS OF ASSESSMENT MEASURES (2)
  • Systematic Methods
    • Systematic methods are applied to score or evaluate assessment protocols.
  • Guidelines
    • Guidelines are available to understand and interpret the results of an assessment measure.
    • Such guidelines may make provision for the comparison of an individual’s performance to that of an appropriate norm group or to a criterion (e.g. competency profile for a job).
characteristics of assessment measures 3
CHARACTERISTICS OF ASSESSMENT MEASURES (3)
  • Evidence Based
    • Assessment measures should be supported by evidence that they are valid and reliable for the intended purpose.
    • The evidence is usually provided in the form of a technical test manual.
  • Context
    • Assessment measures are usually developed in a certain context (society or culture) for a specific purpose and the normative information used to interpret test performance is limited to the characteristics of the normative sample.
characteristics of assessment measures 4
CHARACTERISTICS OF ASSESSMENT MEASURES (4)
  • Test Bias
    • The appropriateness of an assessment measure for an individual, group, or organisation from another context, culture, or society cannot be assumed without an investigation into possible test bias (i.e. whether a measure is differentially valid for different subgroups; a simple check of this kind is sketched after this list) and without strong consideration being given to adapting and re-norming the measure.
  • Multidimensional
    • The assessment process is multidimensional in nature.
    • It entails the gathering and synthesising of information as a means of describing and understanding functioning.
    • This can inform appropriate decision-making and intervention.
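As a minimal sketch of the bias check referred to above, and not an SANDF procedure: the snippet below correlates test scores with a job-performance criterion separately for two hypothetical subgroups; markedly different validity coefficients would suggest the measure is differentially valid and needs adaptation and re-norming. All names and numbers are invented, and only the Python 3.10+ standard library is used.

```python
# Illustrative differential-validity check; data are invented.
from statistics import correlation  # Pearson r (Python 3.10+)

# test scores and job-performance ratings per hypothetical subgroup
groups = {
    "group_a": ([45, 52, 38, 60, 55, 41, 48], [3.1, 3.6, 2.8, 4.0, 3.7, 2.9, 3.3]),
    "group_b": ([44, 50, 39, 58, 53, 40, 47], [2.6, 3.5, 3.4, 3.0, 3.6, 2.7, 3.8]),
}

for name, (test_scores, criterion) in groups.items():
    r = correlation(test_scores, criterion)  # validity coefficient for this group
    print(f"{name}: validity coefficient r = {r:.2f}")

# Markedly different coefficients across groups would be evidence of test bias.
```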
characteristics of assessment measures 5
CHARACTERISTICS OF ASSESSMENT MEASURES (5)
  • Limits of Human Wisdom
    • Recognise the limits of human wisdom when reaching opinions based on assessment information.
the need for control of assessment measures within the sandf 1
THE NEED FOR CONTROL OF ASSESSMENT MEASURES WITHIN THE SANDF (1)
  • Sensitive Item Content
    • In view of the potentially sensitive nature of some of the item content and the feedback, and given that assessment measures can be misused, the use of assessment measures needs to be controlled so that the public can be protected.
  • Trained Professionals
    • The use of psychological measures is controlled by restricting them to appropriately trained professionals.
the need for control of assessment measures within the sandf 2
THE NEED FOR CONTROL OF ASSESSMENT MEASURES WITHIN THE SANDF (2)
  • Practitioner Competency
    • Measures are administered by a qualified, competent assessment practitioner and assessment results are correctly interpreted and used.
  • Conveying the Results
    • The outcome of the assessment is conveyed in a sensitive, empowering manner rather than in a harmful way.
  • Psychometry Procurement
    • The purchasing of psychological assessment measures is restricted to those who may use them and that test materials are kept securely (as it is unethical for assessment practitioners to leave tests lying around) – this will prevent unqualified people from gaining access to and using them.
the need for control of assessment measures within the sandf 3
THE NEED FOR CONTROL OF ASSESSMENT MEASURES WITHIN THE SANDF (3)
  • Release of Assessment Materials
    • Test developers do not prematurely release assessment materials (e.g. before validity and reliability have been adequately established), as it is unethical for assessment practitioners to use measures for which appropriate validity and reliability data have not been established.
  • Public Familiarity
    • The general public does not become familiar with the test content, as this would invalidate the measure.
control of psychological assessment measures within the sandf 1
CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (1)
  • Statutory Control in RSA
    • In South Africa the use of psychological assessment measures is under statutory control.
    • A law (statute) has been promulgated that restricts the use of psychological assessment measures to appropriately registered psychology professionals.
  • Health Professions Act
    • Act 56 of 1974 defines acts “specially pertaining to the profession of a psychologist”.
control of psychological assessment measures within the sandf 2
CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (2)
  • Diagnosis
    • The evaluation of behaviour or mental processes or personality adjustments or adjustments of individuals or groups of persons, through the interpretation of tests for the determination of intellectual abilities, aptitude, interests, personality make-up or personality functioning, and the diagnosis of personality and emotional functions and mental functioning deficiencies according to a recognised scientific system for the classification of mental deficiencies.
control of psychological assessment measures within the sandf 3
CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (3)
  • Method and Practice
    • The use of any method or practice aimed at aiding persons or groups of persons in the adjustment of personality, emotional or behavioural problems or at the promotion of positive personality change, growth and development, and the identification and the evaluation of personality dynamics and personality functioning according to psychological scientific methods.
control of psychological assessment measures within the sandf 4
CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (4)
  • Evaluation
    • The evaluation of emotional, behavioural and cognitive processes or adjustment of personality of individuals or groups of persons by the usage and interpretation of questionnaires, tests, projections or other techniques or any apparatus, whether of South African origin or imported, for the determination of intellectual abilities, aptitude, personality make-up, personality functioning, psychophysiological functioning or psychopathology.
control of psychological assessment measures within the sandf 5
CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (5)
  • Exercising of Control
    • The exercising of control over prescribed questionnaires or tests or prescribed techniques, apparatus or instruments for the determination of intellectual abilities, aptitude, personality make-up, personality functioning, psychophysiological functioning or psychopathology.
  • Development
    • The development of and control over the development of questionnaires, tests, techniques, apparatus or instruments for the determination of intellectual abilities, aptitude, personality make-up, personality functioning, psychophysiological functioning or psychopathology.
control of psychological assessment measures within the sandf 6
CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (6)
  • Domain of Psychology
    • According to Act 56 of 1974, the use of measures to assess mental, cognitive, or behavioural processes and functioning, intellectual or cognitive ability or functioning, aptitude, interest, emotions, personality, psychophysiological functioning or psychopathology (abnormal behaviour), constitutes an act that falls within the domain of the psychology profession.
psychological assessment within the democratic south africa 1
PSYCHOLOGICAL ASSESSMENT WITHIN THE DEMOCRATIC SOUTH AFRICA (1)
  • Post 1994
    • Since 1994 and the election of South Africa’s first democratic government, the application, control, and development of assessment measures have become contested terrain.
  • Constitution and Labour Relations Act
    • With the adoption of the new Constitution and the Labour Relations Act in 1996, worker unions and individuals now have the support of legislation that specifically forbids any discriminatory practices in the workplace and includes protection for applicants, as they have all the rights of current employees in this regard.
psychological assessment within the democratic south africa 2
PSYCHOLOGICAL ASSESSMENT WITHIN THE DEMOCRATIC SOUTH AFRICA (2)
  • Employment Equity Act
    • To ensure that discrimination is addressed within the testing arena, the Employment Equity Act No. 55 of 1998 (section 8) refers to psychological tests and assessment specifically and states that:
    • Psychological testing and other similar assessments of an employee are prohibited unless the test or assessment being used:
      • has been scientifically shown to be valid and reliable;
      • can be applied fairly to all employees;
      • is not biased against any employee or group.
psychological assessment within the democratic south africa 3
PSYCHOLOGICAL ASSESSMENT WITHIN THE DEMOCRATIC SOUTH AFRICA (3)
  • Impact of Employment Equity Act
    • The impact of this Act on the conceptualisation and professional practice of assessment in South Africa in general is far-reaching, as assessment practitioners and test publishers are increasingly being called upon to demonstrate, or prove in court, that a particular assessment measure does not discriminate against certain groups of people.
    • Despite the fact that the Employment Equity Act is not binding on Defence Act Personnel, Directorate Psychology is still obliged to ensure that its practices are fair and equitable.
fair and ethical practices in the use of psychological measures within the sandf
FAIR AND ETHICAL PRACTICES IN THE USE OF PSYCHOLOGICAL MEASURES WITHIN THE SANDF
  • International Guidelines on Test Use (Version 2000)
    • Fair Assessment Practices
      • The appropriate, fair, professional, and ethical use of assessment measures and assessment results.
      • Taking into account the needs and rights of those involved in the assessment process.
      • Ensuring that the assessment conducted closely matches the purpose to which the assessment results will be put.
      • Taking into account the broader social, cultural, and political context in which assessment is used and the ways in which such factors might affect assessment results, their interpretation, and the use to which they are put.
factors affecting psychological assessment results 1
FACTORS AFFECTING PSYCHOLOGICAL ASSESSMENT RESULTS (1)
  • Viewing Assessment Results in Context.
    • A test score is only one piece of information about how a person performs or behaves. Therefore, if we look at an individual in terms of a test score only, we will have a very limited understanding of that person.
    • A test score can never be interpreted without taking note of and understanding the context in which the score was obtained.
    • In addition to the test score, the information in which we are interested can be obtained by examining the context in which a person lives.
    • When you think about it, you will realise that people actually function in several different contexts concurrently.
factors affecting psychological assessment results 2
FACTORS AFFECTING PSYCHOLOGICAL ASSESSMENT RESULTS (2)
  • At the lowest level there is the biological context, referring to physical bodily structures and functions, which are the substrata for human behaviour and experiences.
  • Then there is the intrapsychic context which comprises abilities, emotions, and personal dispositions.
  • Biological and intrapsychic processes are regarded as interdependent components of the individual as a psychobiological entity.
  • In addition, because people do not live in a vacuum, we need to consider a third and very important context which is the social context.
factors affecting psychological assessment results 3
FACTORS AFFECTING PSYCHOLOGICAL ASSESSMENT RESULTS (3)
    • The social context refers to aspects of the environment in which we live such as our homes and communities, people with whom we interact, work experiences, as well as cultural and socio-political considerations.
  • Methodological Considerations
    • In addition to looking at the effects of the different contexts within which people function, we also need to examine methodological considerations such as test administration, which may also influence test performance and therefore have a bearing on the interpretation of a test score.
professional practices that assessment practitioners within the sandf should follow 1
Professional Practices that Assessment Practitioners within the SANDF should follow (1)
  • Rights of Test-takers
    • Informing test-takers about their rights and the use to which the assessment information will be put.
  • Informed Consent
    • Obtaining the consent of test-takers to assess them, to use the results for selection, placement, or training decisions and, if needs be, to report the results to relevant third parties.
  • Treatment
    • Treating test-takers courteously, respectfully, and in an impartial manner, regardless of culture, language, gender, age, disability, and so on.
professional practices that assessment practitioners within the sandf should follow 2
Professional Practices that Assessment Practitioners within the SANDF should follow (2)
  • Preparation
    • Being thoroughly prepared for the assessment session.
  • Confidentiality
    • Maintaining confidentiality to the extent that it is appropriate for fair assessment practices.
  • Language
    • Establishing what language would be appropriate and fair to use during the assessment and making use of bilingual assessment where appropriate.
  • Training
    • Only using measures that they have been trained to use.
professional practices that assessment practitioners within the sandf should follow 3
Professional Practices that Assessment Practitioners within the SANDF should follow (3)
  • Administration
    • Administering measures properly.
  • Scoring
    • Scoring the measures correctly and using appropriate norms or cutpoints or comparative profiles.
  • Background Information
    • Taking background factors into account when interpreting test performance and when forming an overall picture of the test-taker’s performance (profile).
professional practices that assessment practitioners within the sandf should follow 4
Professional Practices that Assessment Practitioners within the SANDF should follow (4)
  • Communication
    • Communicating the assessment results clearly to appropriate parties.
  • Subjectivity
    • Acknowledging the subjective nature of the assessment process by realising that the final decision that they reach, while based at times on quantitative test information, reflects their “best guess estimate”.
  • Utilisation of Assessment Information
    • Using assessment information in a fair, unbiased manner and ensuring that anyone else who has access to this information also does so.
professional practices that assessment practitioners within the sandf should follow 5
Professional Practices that Assessment Practitioners within the SANDF should follow (5)
  • Research
    • Researching the appropriateness of the measures that they use and refining, adapting, or replacing them where necessary.
  • Storage
    • Securely storing and controlling access to assessment materials so that the integrity of the measures cannot be threatened in any way.
basic statistical concepts reliability validity and norms 1
BASIC STATISTICAL CONCEPTS: RELIABILITY, VALIDITY AND NORMS (1)
  • Statistical Concepts
    • Psychological assessment measures often produce data in the form of numbers.
    • We need to be able to make sense of these numbers.
    • Basic statistical concepts can help us here, as well as when it comes to establishing and interpreting norm scores.
    • Statistical concepts and techniques can also help us to understand and establish basic psychometric properties of measures such as validity and reliability.
basic statistical concepts reliability validity and norms 2
BASIC STATISTICAL CONCEPTS: RELIABILITY, VALIDITY AND NORMS (2)
  • Reliability
    • This refers to the degree to which a psychometric test consistently produces the same results for the same candidates (a small illustration follows below).
  • Validity
    • This refers to the degree to which the psychometric test measures what it claims to measure.
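As a rough illustration of the reliability concept, and not an SANDF procedure: test-retest reliability is often estimated as the Pearson correlation between two administrations of the same test to the same candidates. The scores below are invented; only the Python 3.10+ standard library is used.

```python
# Test-retest reliability as a Pearson correlation; scores are invented.
from statistics import correlation  # Python 3.10+

first_administration = [12, 18, 25, 30, 22, 15, 28, 20]
second_administration = [14, 17, 27, 29, 21, 16, 26, 22]

reliability = correlation(first_administration, second_administration)
print(f"Test-retest reliability: {reliability:.2f}")  # closer to 1.0 means more consistent
```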
basic statistical concepts reliability validity and norms 238
BASIC STATISTICAL CONCEPTS: RELIABILITY, VALIDITY AND NORMS (2)
  • Norms
    • Norms refer to the recordsof performance by other candidates who have previously been assessed using the same test.
    • A candidate must be measured against norms taken from the context and population group to which that candidate belongs, i.e. candidates are measured against other South African candidates who have previously undergone assessment on a specific test.
    • As a database of results is built, SANDF specific norms are developed and used.
    • The current SANDF database consists of primarily Black candidates.
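A minimal sketch of norm-referenced interpretation, with an invented norm group: the candidate's raw score is converted to a z-score relative to the norm group and then to an approximate percentile, assuming a roughly normal score distribution.

```python
# Interpreting a raw score against a norm group; all figures are invented.
from statistics import NormalDist, mean, stdev

norm_group_scores = [34, 41, 38, 45, 50, 29, 37, 42, 48, 36]  # hypothetical norm data
candidate_raw_score = 44

mu, sigma = mean(norm_group_scores), stdev(norm_group_scores)
z_score = (candidate_raw_score - mu) / sigma
percentile = NormalDist().cdf(z_score) * 100  # assumes roughly normal distribution

print(f"z = {z_score:.2f}, approximate percentile = {percentile:.0f}")
```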
challenges faced by psychology assessment practitioners within the sandf 1
CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (1)
  • Influence of Multiculturalism
    • In the latter part of the twentieth century and at the start of the twenty-first century, multiculturalism has become the norm in many countries.
    • As a result, attempts were made to develop tests that were “culture-free”.
    • It soon became clear that it was not possible to develop a test that is free of any cultural influences.
    • Consequently, test developers focused more on “culture-reduced” or “culture-common” tests in which the aim was to remove as much cultural bias as possible from the test by including only behaviour that was common across cultures.
challenges faced by psychology assessment practitioners within the sandf 2
CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (2)
  • For example, a number of non-verbal intelligence tests were developed (e.g. Raven Progressive Matrices) where the focus was on novel problem-solving tasks and in which language use, which is often a stumbling block in cross-cultural tests, was minimised.
  • In an attempt to address issues of fairness and bias in test use, the need arose to develop standards for the professional practice of testing and assessment.
challenges faced by psychology assessment practitioners within the sandf 3
CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (3)
  • Representivity of Assessors
    • Legitimate concern is sometimes expressed regarding the representivity of psychologists in the SANDF.
    • This is a challenge that the organisation is currently striving to meet and some degree of progress has already been made.
    • Directorate Psychology conducts targeted recruitment in order to recruit Black psychologists, and regularly engages the Professional Board for Psychology and academic institutions in this regard.
challenges faced by psychology assessment practitioners within the sandf 4
CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (4)
  • However, the lack of availability of Black psychologists in South Africa remains a challenge.
  • The Professional Board for Psychology’s official registration statistics reflect that 11% (known disclosures) of South African psychologists are Black.
challenges faced by psychology assessment practitioners within the sandf 5
CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (5)
  • Language
    • Language is generally regarded as the most important single moderator of performance on assessment measures.
    • This is because performance on assessment measures could be the product of language difficulties and not ability factors if a measure is administered in a language other than the test-taker’s home language.
    • When a test is written in a different language, it may present a range of concepts that are not accessible in the test-taker’s home language.
challenges faced by psychology assessment practitioners within the sandf 6
CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (6)
  • Current Dilemma regarding Psychometric Tests
    • Historically the Human Sciences Research Council (HSRC) was mandated to provide cost-effective psychometric tests that had been proven to be valid within the South African population.
    • After the advent of democracy in the Republic of South Africa, the HSRC underwent transformation.
    • The HSRC redefined its role regarding psychometric tests and surrendered the license to most of these tests to the private sector.
    • This led to the current dilemma where there is a shortage of cost effective psychometric instruments that are approved for use in the Republic of South Africa.
challenges faced by psychology assessment practitioners within the sandf 7
CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (7)
  • The situation has reached critical proportions within the broader industry sector.
  • Consequently, members of the Professional Board for Psychology have indicated that the HSRC will be requested to provide this essential service to the nation.
  • Due to the scarcity of validated psychometry for the South African context, the South African National Defence Force has been obliged to develop or validate some psychometric tests for use within the organisation.
  • This is done in consultation with the Psychometric Committee of the Professional Board for Psychology.
scope49
SCOPE
  • Academic Aptitude Test (AAT).
  • Blox Test.
  • Differential Aptitude Test (DAT).
  • Raven’s Progressive Matrices (RPM).
  • Potential Index Batteries (PIB).
  • Psychological Risk Inventory (PRI).
  • Vienna Test System (VTS).
academic aptitude test 1
ACADEMIC APTITUDE TEST (1)
  • Origin
    • South Africa.
    • Human Sciences Research Council.
    • Representative sample.
    • Different languages.
      • Northern Sotho
      • Southern Sotho
      • Tswana
      • Tsonga
      • Venda
      • Xhosa
      • Zulu
      • Afrikaans
      • English
      • Other
academic aptitude test 151
ACADEMIC APTITUDE TEST (1)
  • Aim
    • To serve as an objective, reliable and valid aid in the guidance of candidates in respect of subject and occupational choice.
    • Provides an indication of a candidate’s:
      • General intellectual ability (intelligence).
      • Verbal ability and the level achieved in the official languages.
      • Mathematical ability.
      • Level of spatial ability.
academic aptitude test 2
ACADEMIC APTITUDE TEST (2)
  • Description
    • Consists of 9 tests with 37 items in the first and 33 items in each of the other tests.
    • All items are of the multiple-choice type.
    • The correct answer, which the candidate chooses from five possibilities, is indicated on a separate answer sheet.
academic aptitude test 3
ACADEMIC APTITUDE TEST (3)

AAT 1: Non-verbal reasoning

  • Measures the ability to reason inductively.
  • Consists of two parts, viz.
    • Figure series.
    • Pattern completion.
  • Figure series:
    • Four figures are given and the fifth figure in the series must be selected from the given possibilities.
  • Pattern completion:
    • A total picture must be formed of the matrix, a rule deduced, and the matrix completed accordingly.
    • The candidate is consequently expected to deduce and apply a general principle.
  • The test should, in conjunction with the verbal score, provide a good indication of general intellectual ability.
academic aptitude test 4
ACADEMIC APTITUDE TEST (4)
  • AAT 2: Verbal reasoning
    • Candidates are required to grasp verbal concepts and their relationships.
    • Inductive as well as deductive reasoning is required.
    • Items include analogies, letter codes and logical deductions.
academic aptitude test 5
ACADEMIC APTITUDE TEST (5)
  • AAT 5: Number comprehension
    • Ability to manipulate and apply fundamental principles and operations.
    • Items include, inter alia, percentages, fractions, exponents and basic sets.
  • Reliability
    • Degree of accuracy and consistency.
    • Reliability coefficients vary from 0.69 to 0.90 for the individual tests.
    • Reliability in the SANDF: 0.74
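For illustration only: coefficients of this kind are often estimated as internal consistency using Cronbach's alpha. The sketch below computes alpha from a small, invented matrix of item responses; it is not SANDF code and the figures are hypothetical.

```python
# Cronbach's alpha from a small invented item-response matrix.
from statistics import variance

# rows = candidates, columns = items (hypothetical right/wrong scores)
responses = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 0, 1],
    [1, 1, 1, 1, 0],
]

k = len(responses[0])                                    # number of items
item_variances = [variance(col) for col in zip(*responses)]
total_variance = variance([sum(row) for row in responses])
alpha = (k / (k - 1)) * (1 - sum(item_variances) / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```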
academic aptitude test norms 7
ACADEMIC APTITUDE TEST:NORMS (7)
  • Implementation
    • Military Skills Development
    • Youth Foundation Programme
    • Nursing Study Scheme
    • Other Study Schemes
    • Pilot Selection
blox test 1
BLOX TEST (1)
  • ORIGIN
    • South Africa.
    • Human Sciences Research Council.
    • Previously known as Perceptual Battery.
    • Own norms determined for the SANDF population.
blox test 2
BLOX TEST (2)
  • AIM
    • Measures visual orientation.
    • Ability to comprehend the nature of arrangements within a visual stimulus pattern, primarily with respect to the candidate’s body or frame of reference.
    • Ability to recognise spatial arrangements from different orientations without the benefit of physical shifts of the body.
    • Recognise the same visual stimulus pattern from different angles.
    • Ability to manipulate (rotate, twist) one or two parts of a visual stimulus pattern in the candidate’s imagination in order to recognise the changed appearance of the object.
blox test 3
BLOX TEST (3)
  • Description
    • Test format
      • Paper and pencil test.
      • Consists of 6 practice items and 45 test items.
      • Non-verbal test.
  • Rationale
    • Spatial ability consists of spatial relations and orientation.
    • The ability to comprehend the nature of arrangements within a visual stimulus pattern, primarily with respect to the candidate’s body or frame of reference.
blox test 4
BLOX TEST (4)
  • Item format
    • Isometric drawings of different combinations of two, three, four, five or six cubes.
    • Each set of cubes must be compared to similar arrangements of cubes viewed from other angles.
    • Each page is divided into two sections by a heavy black line.
    • Above the line are five sets of cubes which are the responses and below the line are nine sets of cubes which form the stimuli.
    • Candidate must analyse each stimulus set and choose the corresponding set seen from a different angle, from the five possible responses.
  • Time required
    • Time limit for the test is 30 minutes.
blox test 5
BLOX TEST (5)
  • Reliability
    • Reliability in the SANDF: 0.72
blox test 7
BLOX TEST (7)
  • Implementation
    • Apprentices.
    • Youth Foundation Training.
    • Explosive Device Disposal Operator.
    • VIP Protector.
    • Pilot Selection.
differential aptitude test dat 1
DIFFERENTIAL APTITUDE TEST (DAT) (1)
  • ORIGIN
    • South Africa.
    • Human Sciences Research Council.
    • Standardised:
      • Blacks
      • Coloureds
      • Whites
      • Indians
differential aptitude test dat 2
DIFFERENTIAL APTITUDE TEST (DAT) (2)
  • AIM
    • Provide information on candidates who want to undergo tertiary training or gain entry to particular high-level occupations, especially with a view to the provision of counselling, and the placement in and selection for tertiary or other post-school training and specific occupations.
differential aptitude test dat 3
DIFFERENTIAL APTITUDE TEST (DAT) (3)
  • Rationale
    • Aptitude is the potential a candidate has which will enable him/her to achieve a certain level of ability with a given amount of training and/or practice.
    • Aptitude, together with interest, attitude, motivation and other personality characteristics, will to a large extent determine the ultimate success of a candidate.
    • Aptitude, together with other information, predicts possible success in a specific field of study, training programme or occupation, should a candidate make a particular choice or should the employer wish to make a particular appointment.
differential aptitude test dat 4
DIFFERENTIAL APTITUDE TEST (DAT) (4)
  • Description of tests
    • Vocabulary.
    • Verbal reasoning.
    • Non-verbal reasoning.
    • Calculations.
    • Reading comprehension.
    • Comparison.
    • Price controlling.
    • Spatial visualisation.
    • Mechanical insight.
    • Memory.
differential aptitude test dat 5
DIFFERENTIAL APTITUDE TEST (DAT) (5)
  • Test 1: Vocabulary
    • Aim: To measure Verbal Comprehension, which can be defined as knowledge of words and their meaning, as well as the application of this knowledge in spoken and written language.
    • Rationale: The ability of a learner to recognise a word and to choose a synonymous word is regarded as a valid indication of his/her knowledge of the meaning of words and as a valid criterion for the verbal comprehension factor.
differential aptitude test dat 6
DIFFERENTIAL APTITUDE TEST (DAT) (6)
  • Test 2: Verbal Reasoning
    • Aim: To measure an aspect of general reasoning on the basis of verbal material.
    • Rationale: The assumption that the ability to determine relationships, to complete word analogies, to solve general problems requiring logical thought, as well as a candidate’s vocabulary, is a valid indication of an aspect of general reasoning.
differential aptitude test dat 7
DIFFERENTIAL APTITUDE TEST (DAT) (7)
  • Test 3: Non-verbal Reasoning: Figures
    • Aim: To measure an aspect of general reasoning on the basis of non-verbal material.
    • Rationale: Assumption that the ability to see relationships between figures and, by analogy, to identify an appropriate missing figure, as well as, following the changes that the figures of a figure series undergo, to deduce the working principle and to apply it again, is a valid indication of an aspect of non-verbal reasoning ability.
differential aptitude test dat 8
DIFFERENTIAL APTITUDE TEST (DAT) (8)
  • Test 4: Calculations
    • Aim: To measure arithmetical ability.
    • Rationale: Assumption that the candidate’s ability to do mechanical calculations and to solve arithmetical problems with the help of four basic arithmetic operations, namely adding, subtracting, dividing and multiplying, provides a valid indication of his/her arithmetical ability.
differential aptitude test dat 9
DIFFERENTIAL APTITUDE TEST (DAT) (9)
  • Test 5: Reading Comprehension
    • Aim: To measure the ability to comprehend what the candidate is reading.
    • Rationale: Assumption that the candidate’s ability to choose the right answers to questions on prose passages is a valid indication of reading comprehension.
differential aptitude test dat 10
DIFFERENTIAL APTITUDE TEST (DAT) (10)
  • Test 6: Comparison
    • Aim: To measure visual perceptual speed as a certain aspect of clerical ability, which consists mainly of the quick and accurate perception of differences and similarities between visual configurations.
    • Rationale: Assumption that the ability to quickly and accurately indicate from five symbol groups the one that corresponds precisely with a given symbol group, is a valid indication of visual perceptual speed.
differential aptitude test dat 11
DIFFERENTIAL APTITUDE TEST (DAT) (11)
  • Test 7: Price Controlling
    • Aim: To measure a general speed of clerical ability, namely the ability to look up data quickly and accurately.
    • Rationale: Assumption that the ability to look up the prices of articles in a table quickly and accurately is a valid indication of success in numerous clerical tasks.
differential aptitude test dat 12
DIFFERENTIAL APTITUDE TEST (DAT) (12)
  • Test 8: Spatial Visualisation (3-D)
    • Aim: To measure the three-dimensional spatial perceptual ability.
    • Rationale: Assumption that the ability to:
      • Manipulate mentally a cube whose sides are marked in a certain way and which is presented three dimensionally in such a way that the relative position of a certain cube to that of a given cube can be determined.
      • Recognise and indicate certain sides of a flat figure that has been folded to make a three-dimensional figure.
      • Visualise what the three-dimensional result will be if a flat figure is rolled up or folded;
      • is a valid criterion of three-dimensional spatial perceptual ability.
differential aptitude test dat 13
DIFFERENTIAL APTITUDE TEST (DAT) (13)
  • Test 9: Mechanical Insight
    • Aim: To measure mechanical ability (insight).
    • Rationale: Assumption that the ability to make correct visual representation of the result of the operation of a mechanical apparatus or a physical principle depicted in a drawing, is a valid criterion for the measurement of mechanical ability.
differential aptitude test dat 14
DIFFERENTIAL APTITUDE TEST (DAT) (14)
  • Test 10: Memory
    • Aim: To measure an aspect of the memory factor by using meaningful material.
    • Rationale: Assumption that the ability to memorise meaningful material summarised in written paragraphs and then to correctly answer questions on the content of the paragraphs, is a valid criterion for measuring an aspect of memory.
differential aptitude test dat 15
DIFFERENTIAL APTITUDE TEST (DAT) (15)
  • Reliability
    • Overall reliability in the SANDF: 0.68 to 0.74
differential aptitude test dat 17
DIFFERENTIAL APTITUDE TEST (DAT) (17)
  • Implementation
    • Army Musterings.
    • Navy Musterings.
    • Explosive Device Disposal Operator.
ravens progressive matrices 1
RAVENS PROGRESSIVE MATRICES (1)
  • ORIGIN
    • United Kingdom.
    • International application.
    • Minimise cultural influences.
    • Non-verbal test.
ravens progressive matrices 2
RAVENS PROGRESSIVE MATRICES (2)
  • AIM
    • To measure the candidate’s capacity to apprehend meaningless figures presented for his/her observation, see the relations between them, conceive the nature of the figure completing each system of relations presented, and by so doing, develop a systematic method of reasoning.
    • Suitable for comparing candidates with respect to their immediate capacities for observation and clear thinking.
ravens progressive matrices 3
RAVENS PROGRESSIVE MATRICES (3)
  • DESCRIPTION
    • Consists of 60 problems, which are divided into 5 sets of 12 each.
    • In each set the first problem is as nearly as possible self-evident.
    • The problems that follow become progressively more difficult.
    • The order of the items provides standard training in the method of working.
    • Five sets provide five opportunities for grasping the method and five progressive assessments of a candidate’s capacity for intellectual activity.
ravens progressive matrices 4
RAVENS PROGRESSIVE MATRICES (4)
  • DESCRIPTION (cont)
    • Test is developed to evaluate the full spectrum of a candidate’s intellectual development.
    • Test can be applied to any age group.
    • Scale is intended to cover the whole range of intellectual development from the time a child is able to grasp the idea of finding a missing piece to complete a pattern to the stage of intellectual maturity through a process of comparison and reasoning.
    • The scores for adults tend to be above average, but the scale provides sufficient discriminating value.
    • Where more differentiation is needed, the Advanced Ravens must be used.
ravens progressive matrices 5
RAVENS PROGRESSIVE MATRICES (5)
  • IMPLEMENTATION
    • Test is included in some selection batteries and provides a basis for evaluation of general abilities.
    • No time limits for administration of the test.
    • Time taken to complete the test must be indicated.
  • Reliability:
    • Reliability in the SANDF: 0.80
ravens progressive matrices 7
RAVENS PROGRESSIVE MATRICES (7)
  • IMPLEMENTATION
  • Special Forces Selection.
  • Apprentices.
  • Explosive Device Disposal Operator.
  • VIP Protector.
potential index batteries 1
POTENTIAL INDEX BATTERIES (1)
  • Potential Index Batteries (PIB)
    • Job Profiling Expert.
    • Comprehensive Structured Interviewing for Potential.
    • Situation Specific Evaluation Expert.
    • Performance Appraisal Scoring Scale.
potential index batteries 2
POTENTIAL INDEX BATTERIES (2)
  • ORIGIN
    • South Africa.
    • Based on ongoing research that dates back to 1964.
    • Applied research done by reputable, independent institutions.
    • Situation-specific norms and state-of-the-art, computerised standardisation procedure.
    • Generic standardisation done on a population of approximately 31 000 respondents.
potential index batteries 3
POTENTIAL INDEX BATTERIES (3)

PROCESS

  • Job profiling.
  • Determining job-related competencies.
  • Determining NQF level and job grade.
  • Job description.
  • Critical cross-field education and training outcomes.
  • Comprehensive structured interviewing for potential.
  • Job profiling expert (basic competencies)
  • Performance appraisal scoring scale.
    • Ongoing feedback on workers’ performance.
    • Ongoing identification of training and development needs.
    • Career pathing.
potential index batteries jp expert 6
POTENTIAL INDEX BATTERIES: JP EXPERT (6)
  • IMPLEMENTATION
  • Post Profiling for Specific Musterings.
  • Explosive Device Disposal Operator.
  • VIP Protector.
psychological risk inventory 1
PSYCHOLOGICAL RISK INVENTORY (1)
  • ORIGIN
    • South Africa.
    • Developed by SANDF Psychologists.
psychological risk inventory 2
PSYCHOLOGICAL RISK INVENTORY (2)
  • AIM
    • To scan for self-reported symptoms of psychopathology.
    • To determine the need for an interview.
    • To recommend the candidate for deployment or not.
    • Utilised for concurrent health assessment processes.
    • To confirm the mental health status in adhering to set standards for deployment.
psychological risk inventory 3
PSYCHOLOGICAL RISK INVENTORY (3)
  • DESCRIPTION
    • Consists of 92 multiple-choice items.
    • Each item consists of a short statement with three possible answers.
    • Screening is focused on the identification of psychopathology (a toy scoring sketch follows below).
  • Psychological fitness
    • The concurrent health assessment defines psychological fitness as the absence of diagnosable psychopathology.
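As a toy sketch of how screening items can be turned into a recommendation, the snippet below sums invented three-point item responses into a few of the scales named later in this presentation and flags any scale that exceeds a hypothetical cut-off. The item-to-scale mapping, responses and cut-offs are all invented; the actual PRI scoring key is not given in this presentation.

```python
# Toy scale scoring and interview flagging; mapping and cut-offs are invented.
scales = {
    "D1_mood":    [2, 1, 2, 1],   # invented responses (0, 1 or 2) per scale item
    "D2_anxiety": [0, 1, 0, 0],
    "R2_suicide": [0, 0, 1, 0],
}
cutoffs = {"D1_mood": 5, "D2_anxiety": 5, "R2_suicide": 1}  # hypothetical thresholds

flags = {name: sum(items) >= cutoffs[name] for name, items in scales.items()}
needs_interview = any(flags.values())

for name, flagged in flags.items():
    print(f"{name}: raw score = {sum(scales[name])}, flagged = {flagged}")
print(f"Recommend clinical interview: {needs_interview}")
```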
psychological risk inventory 4
PSYCHOLOGICAL RISK INVENTORY (4)
  • Psychopathology
    • SANDF mental health standards are based on the Diagnostic and Statistical Manual of Mental Disorders (Revised) (DSM-IV-R).
    • The United Nations (UN) indicates that members should not deploy if they have a history of substance dependence, situational maladjustment, anxiety disorder or are on chronic medication.
psychological risk inventory 5
PSYCHOLOGICAL RISK INVENTORY (5)
  • CATEGORIES OF SCALES
    • C-Coping scales: Less serious pathology scales, consisting of:
      • C1-Stress indicator: The experience of pressure from the environment, including work pressure, work environment pressure, financial pressure, family problems and interpersonal pressure.
      • C2-Coping indicator: Reflects the subjective experience of negative emotions indicating that the candidate is not emotionally coping well.
psychological risk inventory 6
PSYCHOLOGICAL RISK INVENTORY (6)
    • C3-Ego strength: Provides an indication of the candidate’s stress tolerance and inner resources to deal with daily challenges.
  • D-Pathology scales: More serious pathology scales.
    • D1-Mood disorder: Indicates symptoms of depression.
    • D2-Anxiety: Indicates symptoms of anxiety.
    • D3-Psychotic features: Indicates thought process and content disorder and other symptoms related to psychosis.
psychological risk inventory 7
PSYCHOLOGICAL RISK INVENTORY (7)
    • D4-Somatic disorder: Indicates pre-occupation with symptoms of a physical nature.
  • P-Interpersonal scale:
    • P1-Interpersonal conflict: Indicates symptoms of interpersonal conflict, lack of interpersonal trust and unstable relationships.
  • R-Psychological risk scales: Indicate specific risks that must be noted w.r.t. deployment.
    • R1-Control risk: Indicates tendencies to be unstable or impulsive and not well controlled by self and authority.
    • R2-Suicide risk: Indicates suicidal ideation and negativity about life in general.
psychological risk inventory 8
PSYCHOLOGICAL RISK INVENTORY (8)
  • R3-PTSD risk: Indicates that the candidate has been exposed to one or more traumatic event(s) which has not been resolved.
  • R4-Substance abuse risk: Indicates self-reported excessive drinking over a recent period of time.
  • R5-Aggression risk: Indicates tendencies to express aggressive behaviour due to frustration or interpersonal conflict.
psychological risk inventory 10
PSYCHOLOGICAL RISK INVENTORY (10)
  • IMPLEMENTATION
    • Pre-deployment mental health assessments.
challenges 1
CHALLENGES (1)
  • Development of a SANDF Competency Assessment Test
    • Explore the feasibility and the requirements of instituting a competency assessment test for enlisted soldiers.
    • Develop and sustain a competency assessment programme for evaluating soldiers’ technical and tactical proficiency in the military occupational specialty and leadership skills for their rank.
    • Include situational judgment test items.
challenges 2
CHALLENGES (2)
  • Sample performance dimensions to be assessed (benchmarking):
    • Problem Solving and Decision Making Skills
    • Motivating, Leading and Supporting Subordinates
    • Directing, Monitoring and Supervising Work
    • Training Others
    • Relating to and Supporting Peers
    • Team leadership
    • Concern for Soldier Quality of Life
    • Cultural Tolerance
    • Computer-based testing
scope108
SCOPE
  • ASSESSMENT CENTRE DEFINED
  • HISTORY AND BACKGROUND
  • LEGAL AND STATUTORY REQUIREMENTS
  • METHODOLOGY
  • VALIDITY AND RELIABILITY
  • DIVERSE APPLICATIONS
  • APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN
  • MILITARY COUNCIL DECISIONS: ROLE OF SAMHS
assessment centre defined
ASSESSMENT CENTRE DEFINED
  • An Assessment Centre consists of a standardised and validated evaluation of behaviour and competencies based on multiple inputs.
  • Multiple trained observers and techniques are used.
  • Judgments about behaviour and competencies are made from specifically developed assessment simulations.
  • Judgments are pooled in a meeting among assessors or by a statistical integration process (a simple example of such integration is sketched below).
  • Integration discussions result in evaluations of the performance of the assessees on the competencies and/or dimensions or other variables which the assessment centre is designed to measure.
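As an illustration of what a statistical integration process can look like (the presentation does not specify the SANDF's actual procedure), the sketch below averages each competency's ratings across assessors and exercises and then combines the competency means with hypothetical weights into an overall assessment rating.

```python
# Simple statistical integration of assessor ratings; all values are invented.
from statistics import mean

# ratings per competency, one value per assessor/exercise combination (1-5 scale)
ratings = {
    "problem_solving": [4, 3, 4],
    "leadership":      [3, 3, 2],
    "communication":   [5, 4, 4],
}
weights = {"problem_solving": 0.40, "leadership": 0.35, "communication": 0.25}

competency_means = {name: mean(values) for name, values in ratings.items()}
overall = sum(competency_means[name] * weights[name] for name in competency_means)

for name, score in competency_means.items():
    print(f"{name}: {score:.2f}")
print(f"Overall assessment rating: {overall:.2f}")
```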
history and background 1
HISTORY AND BACKGROUND (1)
  • First conceptualised on a large scale by the German High Command in World War I to select officers with exceptional command or military abilities.
  • British Army War Office Selection Board (WOSB) developed a similar process.
  • During World War II it was used by the Office of Strategic Services (OSS) to select spies.
  • The OSS’s 3½-day assessment centre involved an intensive evaluation: a sentence completion test, health questionnaire, work conditions survey, vocabulary test, personal history evaluation, a projective questionnaire and various simulations.
history and background 2
HISTORY AND BACKGROUND (2)
  • In the early 1950s, the American Telephone and Telegraph Company adapted the OSS concept to the selection and identification of management personnel.
  • By the late 1960s, a number of major corporations were using the AC for selecting managers.
  • In the early 1970s, law enforcement agencies began experimenting with AC, with the fire service following shortly thereafter.
  • ACs are now used internationally.
legal and statutory requirements
LEGAL AND STATUTORY REQUIREMENTS
  • White Paper on Public Service Training and Education
  • Government Gazette
  • Employment Equity Act
  • Health Professions Act
  • Military Council Decision
white paper on public service training and education 1
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (1)
  • All public service institutions will be required to conduct job evaluations or re-evaluations of all posts, with the purpose of ensuring that they are expressed in terms of the essential competencies required for effective job performance.
  • This will involve both functional or sector-specific competencies and core transversal competencies.
white paper on public service training and education 2
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (2)
  • In the case of transversal competencies, the definition of competence will encompass a broad range of skills, knowledge and attitudes, including:
    • The ability to carry out effectively the routine tasks of the job.
    • The ability to transfer skills, knowledge and attitudes to new situations within the same occupational area.
    • The ability to reflect on one’s work, learn from one’s actions, and innovate and cope with non-routine activities.
    • The personal effectiveness to deal effectively with co-workers, managers and customers.
white paper on public service training and education 3
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (3)
  • The introduction of a competency-based approach will assist the development of an outcomes-led model of training and education in a number of important ways. This will include forming an effective and measurable basis:
    • For the objective evaluation of current performance, and the effective assessment of current and future needs.
    • For the design and delivery of training programmes and courses, as well as other staff development interventions, targeted at the achievement of specific and meaningful competencies.
white paper on public service training and education 4
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (4)

    • For the standardisation and accreditation of such programmes and courses through the NQF framework.
    • For the subsequent evaluation of the effectiveness of such programmes and courses.

  • The introduction of a competency-based approach will also form the basis for improvements in the current systems of performance appraisal, recruitment and selection, and promotion.
white paper on public service training and education 5
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (5)
  • COMPETENCIES:
    • Knowledge
    • Skills
    • Abilities
    • Attributes

That employees develop through formal, informal and on the job training, continuing education, details and other employee development opportunities.

white paper on public service training and education 6
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (6)

Diagram: Link between competencies and job performance. Competencies (knowledge, skills, abilities, attributes), combined with experience, are expressed as observable behaviour, which in turn determines job performance.

white paper on public service training and education 7
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (7)

COMPETENCIES: SANDF PERFORMANCE APPRAISAL

  • Visioning
  • Conceptualisation
  • Insight
  • Judgement
  • Analytical Thinking
  • Strategic Planning
  • Leadership
  • Evaluating
white paper on public service training and education 8
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (8)

COMPETENCIES: SAMHS

  • Problem Solving
  • Planning and Organisation
  • Delegation
  • Control
  • Sensitivity
  • Negotiation
  • Leadership
  • Assertiveness
  • Communication
white paper on public service training and education 8121
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (8)

COMPETENCIES: SA NAVY

  • Communication
    • Reading
    • Writing
    • Oral
    • Non-verbal
    • Formal Research
white paper on public service training and education 9
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (9)

COMPETENCIES: SA NAVY (cont)

  • Management
    • Planning
    • Effective Thinking
    • Quantitative Problem Solving
    • Qualitative Problem Solving
    • Directing
    • Organising
    • Control
white paper on public service training and education 10
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (10)

COMPETENCIES: SA ARMY (SHL)

  • Planning
  • Reviewing/Evaluating
  • Deciding
  • Implementing/Coordinating
  • Interpreting
  • Controlling/Directing
  • Motivating
  • Supervising/Directing
  • Investigating/Observing/Searching
  • Informing/Discussing/Interviewing
  • Problem Solving/Designing
  • Assessing/Evaluating
white paper on public service training and education 11
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (11)

COMPETENCIES: JSCSP

  • Cognitive
    • Problem Solving & Analysis
    • Planning & Organising
    • Leadership/Coordinating
    • Decisive/Action Orientated
  • Affective
    • Integrity
    • Persuasiveness
    • Self-confidence
    • Personal Motivation
    • Resilience
    • Flexibility
    • Interpersonal Sensitivity
white paper on public service training and education 12
WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (12)

COMPETENCIES: SENIOR MANAGEMENT SYSTEM (SMS)

  • Strategic Capability & Leadership
  • Programme & Project Management
  • Financial Management
  • Change Management
  • Knowledge Management
  • Service Delivery & Innovation
  • Problem Solving & Analysis
  • People Management & Empowerment
  • Client Orientation & Customer Focus
  • Communication
  • Honesty & Integrity
government gazette 1
GOVERNMENT GAZETTE (1)
  • Establishment of category of fitness:
    • The Surgeon General or a medical officer designated by him or her for that purpose shall, from time to time, in consultation with the Chief of the Service or Corporate Division concerned, determine the standard of physical and mental fitness required in peace or war time for the efficient work performance of a member in every Service or Corporate Division in each branch, corps, or unit thereof and in each mustering, appointment, post or job classification in the SANDF, taking into account requirements laid down by the relevant Code of Remuneration or Personnel Management Code and the Chief of the SANDF.
government gazette 2
GOVERNMENT GAZETTE (2)

  • No member shall be appointed, enrolled, mustered or employed in any post or mustering of the SANDF or be required to serve or to undergo training in such post or mustering unless the allotted fitness category of such member equals or exceeds the category designated to such post or mustering.

employment equity act 1
EMPLOYMENT EQUITY ACT (1)
  • Addresses the elimination and prohibition of unfair discrimination in employment.
  • Prohibition of various forms of employment testing.
  • Elimination of unfair discrimination in the policies and practices of the organisation.
  • Scrutinise policies, practices and procedures, particularly, pre-employment, job assignments, training and development and promotional selection.
employment equity act 2
EMPLOYMENT EQUITY ACT (2)
  • Psychological testing and other similar assessments of an employee are prohibited unless the test or assessment being used:
    • Has been scientifically shown to be valid and reliable;
    • Can be applied fairly to all employees; and
    • Is not biased against any employee or group.
  • Psychological testing and other similar assessment of an employee must only be done on the basis of the inherent requirements of the job.
methodology 1
METHODOLOGY (1)
  • OBSERVING AND RECORDING BEHAVIOUR
  • ASSESSOR DISCUSSION
  • SITUATIONAL EXERCISES
  • PSYCHOMETRIC TESTING
methodology 2
METHODOLOGY (2)
  • OBSERVING AND RECORDING BEHAVIOUR:
    • Observing and recording behaviour exhibited by participants during simulation exercises.
    • Behaviour observed and recorded during oral presentation, group discussion, interview, etc.
    • Behaviour is recorded according to dimension and/or competency.
    • Recorded behaviour is transferred onto rating form.
methodology 3
METHODOLOGY (3)
  • ASSESSOR DISCUSSION:
    • Each participant’s performance is discussed by exercise, dimension and competency.
    • Assessors may ask for clarification or additional recorded examples of behaviour or competency.
    • Attempts must be made to reach consensus.
methodology 4
METHODOLOGY (4)
  • SITUATIONAL EXERCISES:
    • Theoretical exercises (Case Studies).
    • Practical or dynamic exercises.
    • In-basket.
    • Written case analysis.
    • Interview.
    • Leaderless group discussion.
    • Assigned leader group task.
    • Fact finding.
    • Oral presentation.
    • Integrated exercises.
methodology 5
METHODOLOGY (5)
  • PSYCHOMETRIC TESTING:
    • The evaluation of behaviour or mental processes or personality adjustments or adjustments of individuals or groups of persons, through the interpretation of tests for the determination of intellectual abilities, aptitude, interests, personality make-up or personality functioning.
    • The development and control over the development of questionnaires, tests, techniques, apparatus or instruments for the determination of intellectual abilities, aptitude, personality make-up, personality functioning, psycho-physiological functioning or psychopathology.
validity and reliability 1
VALIDITY AND RELIABILITY (1)
  • Validity: The degree to which a method results in a measure that accurately reflects the concept it is intended to measure.
    • A synonym for validity is accuracy.
  • Reliability: The degree to which repeated or alternative measurements of the same concept yield the same results.
    • A synonym for reliability is consistency.

NB: Low reliability leads to low validity; a measure cannot be valid unless it is also reliable.

validity and reliability 2
VALIDITY AND RELIABILITY (2)
  • 3 Types of reliability:
    • Inter-rater reliability.
    • Test-retest reliability.
    • Internal consistency.

NB: Research indicates high correlation coefficients across these three methods (a minimal illustration of two of these indices follows below).

    • Strong indications that the AC is a valid predictive instrument for leadership and management potential.
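As a minimal illustration of two of the reliability indices listed above, the Python sketch below computes test-retest reliability as a Pearson correlation and internal consistency as Cronbach's alpha; inter-rater reliability would be the analogous correlation between two assessors' ratings. All scores are invented for illustration and are not SANDF data.

# Minimal illustration of two reliability indices (invented data).
import numpy as np

def test_retest_reliability(scores_time1, scores_time2):
    """Pearson correlation between two administrations of the same measure."""
    return np.corrcoef(scores_time1, scores_time2)[0, 1]

def cronbach_alpha(item_scores):
    """Internal consistency; item_scores is a (candidates x items) matrix."""
    item_scores = np.asarray(item_scores, dtype=float)
    n_items = item_scores.shape[1]
    sum_item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - sum_item_var / total_var)

time1 = [52, 47, 60, 55, 49]                      # five candidates, first administration
time2 = [50, 45, 62, 57, 48]                      # the same candidates, retested
items = [[3, 4, 3, 5], [2, 2, 3, 3], [5, 5, 4, 5],
         [4, 4, 4, 5], [3, 3, 2, 4]]              # item scores, candidates x items

print("Test-retest r:", round(test_retest_reliability(time1, time2), 2))
print("Cronbach alpha:", round(cronbach_alpha(items), 2))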
validity and reliability 3
VALIDITY AND RELIABILITY (3)
  • FAILURES OF ASSESSMENT CENTRES:
    • Never implemented.
    • Results misused.
    • Failure to predict success.
    • Lack of support from the Top.
    • Lack of Efficiency.
validity and reliability 4
VALIDITY AND RELIABILITY (4)
  • ASSESSOR TRAINING:
    • Thorough knowledge of the organisation and job being assessed.
    • Thorough knowledge and understanding of the assessment techniques, relevant dimensions, etc., to be observed, expected or typical behaviours, examples or samples of actual behaviour, etc.
    • Thorough knowledge and understanding of the assessment dimensions, etc., definitions of dimensions, relationship to job performance.
validity and reliability 5
VALIDITY AND RELIABILITY (5)
  • Demonstrated ability to record and classify behaviour in dimensions, including knowledge of forms used by AC.
  • Thorough knowledge and understanding of evaluation and rating procedures, including how data are integrated.
  • Thorough knowledge and understanding of assessment policies and practices of the organisation, including restrictions on how assessment data are to be used.
validity and reliability 6
VALIDITY AND RELIABILITY (6)
  • Thorough knowledge and understanding of feedback procedures, where appropriate.
  • Demonstrated ability to give accurate oral and written feedback.
  • Demonstrated knowledge and ability to play objectively and consistently the role called for in the interactive exercises.
diverse applications of assessment centres
DIVERSE APPLICATIONS OF ASSESSMENT CENTRES
  • Recruitment
  • Selection
  • Placement
  • Performance appraisal
  • Training and development
  • Organisational development
  • Human resource planning
  • Promotion and transfer
  • Separation and layoffs (Exit)
application of principles to assessment centre design 1
APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN (1)
  • A job analysis of relevant competencies must be conducted to determine the dimensions, attributes, characteristics, qualities, skills, motivation, knowledge, or tasks that are necessary for effective job performance and to identify what should be evaluated by the assessment centre.
  • Competency observations must be classified into some meaningful and relevant categories, such as dimensions, attributes, characteristics, aptitudes, qualities, skills, abilities, knowledge, or tasks.
application of principles to assessment centre design 2
APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN (2)
  • The techniques used in the assessment centre must be designed to provide information for evaluating the dimensions, etc. previously determined by job analysis.
  • Multiple assessment techniques must be used.
  • The assessment techniques must include sufficient job-related simulations to allow multiple opportunities to observe the candidate’s competencies related to each dimension, etc. being assessed.
  • Multiple assessors must be used for each assessee.
application of principles to assessment centre design 3
APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN (3)
  • Assessors must receive thorough training and demonstrate assessor performance guidelines.
  • Some systematic procedure must be used by assessors to record accurately specific competency observations at the time of their occurrence; this might involve handwritten notes, competency observation scales, competency checklists, etc.
  • Assessors must prepare some report or record of the observations made in each exercise in preparation for the integration discussion.
application of principles to assessment centre design 4
APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN (4)
  • The integration of competencies must be based on a pooling of information from assessors and techniques at a meeting among the assessors or through a statistical integration process validated in accord with professionally accepted standards (a minimal illustrative pooling sketch follows below).
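One way to picture the statistical-integration route mentioned above is the minimal sketch below: assessor ratings are pooled per dimension across exercises and then combined into an overall assessment rating. The exercises, dimensions, rating scale and the unweighted averaging are all assumptions made purely for illustration.

# Illustrative pooling of assessment-centre ratings (assumed exercises, dimensions and scale).
from statistics import mean

# ratings[exercise][dimension] -> one rating per assessor (1-5 scale assumed)
ratings = {
    "in_basket":        {"communication": [3, 4], "decision_making": [4, 4]},
    "group_discussion": {"communication": [4, 5], "decision_making": [3, 4]},
}

def dimension_scores(ratings):
    """Pool assessor ratings per dimension across all exercises."""
    pooled = {}
    for exercise in ratings.values():
        for dimension, assessor_ratings in exercise.items():
            pooled.setdefault(dimension, []).extend(assessor_ratings)
    return {dim: mean(vals) for dim, vals in pooled.items()}

def overall_rating(dim_scores):
    """Unweighted mean of dimension scores as the overall assessment rating."""
    return mean(dim_scores.values())

scores = dimension_scores(ratings)
print(scores)                            # {'communication': 4.0, 'decision_making': 3.75}
print(round(overall_rating(scores), 2))  # 3.88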
military council decisions role of samhs 1
MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (1)
  • MC DECISION JUNE 2001:
    • Establishment of the Defence Institute for Assessment and Development.
    • Transfer of the SANDF Assessment and Development Services currently vested at C Joint Training Formation (as agreed during transformation principle decisions) to the SAMHS for functional control purposes and integration into the Defence Institute for Assessment and Development.
    • Assessment and development services and personnel structure of the Defence Institute for Assessment and Development.
    • Phases of implementation for establishing the Defence Institute for Assessment and Development.
military council decisions role of samhs 2
MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (2)
  • Surgeon General is mandated and designated to be the controlling authority for all statutory psychological and other similar assessment processes to ensure compliance of all policies and practices with legal and statutory acts and regulations.
  • HPCSA’s Professional Board of Psychology is the controlling statutory body with the authority to classify and legalise the use of psychological tests, prescribed questionnaires, apparatus and instruments for the determination of intellectual ability, aptitude, personality functioning and the like.
military council decisions role of samhs 3
MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (3)
  • Employers who make use of psychological tests, in terms of the EE Act, may be required to ensure not only that the tests meet the standards of the Professional Board of Psychology, but also that the tests and testers meet the requirements of the Health Professions Act 56 of 1974.
  • In accordance with SAQA it has become imperative to apply the guidelines of the National Qualifications Framework wrt outcome-based education/performance.
  • The AC service must support the DOD Human Resource Strategy 2010.
military council decisions role of samhs 4
MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (4)
  • AC will aspire towards maintaining competency levels and enhancing productivity across/within the DOD, by means of identifying strengths and potential as well as gaps in competencies.
  • AC will make an essential contribution to the human resource management function in the DOD and will assist the DOD in its objective to develop and sustain optimal skills and competencies in all its members, entrenching a learning culture, and to enhance equal opportunity.
military council decisions role of samhs 5
MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (5)
  • Scientifically and legally acceptable to relevant stakeholders.
  • Acceptable to relevant stakeholders in the DOD as an effective and applicable tool in their respective environment.
  • Encompass the entire scope of assessment/development service from entry to exit level.
military council decisions role of samhs 6
MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (6)
  • Rendered on the premise of an integrated human resource management system based on a competency framework.
  • Functional control should be vested with the Surgeon General as mandated, in order to comply with statutory regulations and acts.
  • Affordable (no duplications), sustainable, easily accessible to its users and satisfactory to the needs of its clients.
military council decisions role of samhs 7
MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (7)
  • Defence Assessment and Development Advisory Board, chaired by Director Psychology, must determine policy and guidelines for all kinds of Assessment and Development in the SANDF.
  • Advisory Board will consist of all relevant stakeholders from all arms of services and divisions in the SANDF and will primarily be responsible for controlling all assessments in the SANDF in line with legal, statutory and scientific requirements.
military council decisions role of samhs 8
MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (8)

MIGRATION OF ARMY ASSESSMENT CENTRE TO SAMHS (MPI)

  • An Assessment Centre is a specialised function and remains the responsibility of D Psychology.
  • The function should be with MPI (DOD Assessment Centre) with a Service Agreement between MPI and the SA Army to deliver an on-site service to the SA Army according to requirements.
  • The current structure of the SA Army should be translated to a structure as proposed by D Psychology, to be aligned with the DOD Assessment Centre.
  • The SA Army will remain responsible for providing the facilities and infrastructure.
slide154

THE APPLICATION OF SPECIALIST PSYCHOLOGICAL MEASURES IN THE RECRUITMENT AND SELECTION OF PILOTS IN THE SANDF

scope 1155
SCOPE (1)
  • Pilot Selection Process Flow.
  • Recruitment and Selection Process Task Organisation.
  • Academic Requirements.
  • Physical Requirements.
  • Medical Requirements.
  • Psychological Requirements.
scope 2156
SCOPE (2)
  • Potential Index Battery Functional Analysis (Competencies).
  • Final Selection.
  • Dover Systems Computerised Skills Assessment.
  • Benchmarking.
  • Vienna Research.
selection process flow
SELECTION PROCESS FLOW
  • Job Classification
  • Recruitment
  • Screening
  • Selection
  • Final Selection
  • Selection Board
  • Foundation Training
  • Basic Military Training
  • Officer Forming
  • Military Academy
recruitment and selection process task organisation 1
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (1)
  • PHASE 1: ADVERTISING
    • CHRS (SAAF): Internally in DOD
    • CHRS (D PACQ): National and local newspapers
  • PHASE 2: PAPER SELECTION
    • CHRS (D PACQ): Minimum academic screening
    • CHRS (D PACQ): Provide SAAF with list of applicants who meet the minimum academic requirements
    • CHRS (SAAF): Divide applicants into groups for assessment and selection process
recruitment and selection process task organisation 2
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (2)
  • PHASE 3: CALL-UP
    • DHRS (D PACQ): Notify qualified candidates
    • DHRS (SAAF): Provide specified arrival times of applicants
    • DHRS (SAAF): Responsible for transport, accommodation, meals and other logistic requirements of the applicants in Pretoria from the time they arrive
recruitment and selection process task organisation 3
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (3)
  • PHASE 4: ORIENTATION
    • DHRS (SAAF): In conjunction with SAAF’s Dir Education, Training and Development; Dir Combat System’s Group; Dir Heli Systems Group and Dir Air Transport and Maritime System’s Group responsible for orientation programme for applicants
    • DHRS (SAAF): Responsible for briefing all applicants on assessment and selection process
recruitment and selection process task organisation 4
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (4)
  • PHASE 4: ORIENTATION (cont)
    • OC MPI: Responsible for briefing applicants on psychometric assessments
    • OC IAM: Responsible for briefing applicants on aviation medical examination
recruitment and selection process task organisation 5
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (5)
  • PHASE 5: PSYCHOMETRIC ASSESSMENTS
    • SAMHS (D PSYCH/MPI): Responsible for conducting and administering all psychometric assessments and interpreting the results
    • External Service Providers: MPI executes its responsibilities in cooperation with Dr Landman (16PF) and Ms Coetzee (Vienna) as agreed upon with SAAF
recruitment and selection process task organisation 6
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (6)
  • PHASE 5: PSYCHOMETRIC ASSESSMENTS (cont)
    • DHRS SAAF: Responsible for financial compensation and logistic support for external service providers
    • DHRS (SAAF): Responsible for ensuring that applicants are available at assessment locations
recruitment and selection process task organisation 7
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (7)
  • PHASE 5: PANEL INTERVIEW
    • CHRS (D PACQ): Chair panel interview for all candidates from each group who meet the set requirements for psychometric assessments
    • Other members of panel: CHRS (SAAF), DETD (SAAF), qualified pilot (SAAF), CHRS (D PACQ) and MPI
recruitment and selection process task organisation 8
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (8)
  • PHASE 6: AVIATION MEDICAL EXAMINATION
    • D MEDICINE/IAM (SAMHS): Responsible for all administration, data collecting and record keeping of aviation medical examinations
    • D MEDICINE/IAM (SAMHS): Responsible for all arrangements wrt comprehensive aviation medical examinations
    • DHRS (SAAF): Ensure transport for candidates to IAM premises
recruitment and selection process task organisation 9
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (9)
  • PHASE 7: COLLATION OF MEDICAL RECORDS
    • IAM (SAMHS): Provide DHRS (SAAF) with relevant medical reports of all candidates
    • DHRS (SAAF): Arrange meetings with D Medicine (SAMHS) and D Psychology (SAMHS) to discuss borderline health reports
recruitment and selection process task organisation 10
RECRUITMENT AND SELECTION PROCESSTASK ORGANISATION (10)
  • PHASE 8: CONSOLIDATED SELECTION BOARD
    • Consolidated Selection Board: Identify most suitable applicants according to SAAF requirements
    • Members of Consolidated Selection Board: CHRS (SAAF); CHRS (D PACQ); ETD (SAAF); Qualified Pilot (SAAF); D Med/IAM (SAMHS) and D Psych (MPI)
    • DHRS (SAAF): Submit final selected list to CHRS (D PACQ)
academic requirements
ACADEMIC REQUIREMENTS
  • Maths - D(HG)
  • Science - D(HG)
  • English - Passed
  • Matric - University Exemption
  • M-Score - Rank Order
physical requirements
PHYSICAL REQUIREMENTS
  • Height - 166 – 192 cm
  • Mass - 55 – 100 kg
basic medical
BASIC MEDICAL
  • Vision
  • Colour blindness
  • Hearing
  • Balance/Coordination
  • Physical condition (Fat content)
  • SAMHS Comprehensive Medical
medical requirements
MEDICAL REQUIREMENTS
  • Normal physical condition – all limbs and all senses functioning normally
  • No spectacles
  • No speech impairment
  • No colour blindness
  • No hearing aids, pacemakers, etc.
  • No diabetes
  • No epilepsy
  • No diseases wrt lungs, heart, kidneys, etc.
  • No stuttering
  • No drug usage/dependency
  • No HIV/AIDS
psychological requirements
PSYCHOLOGICAL REQUIREMENTS
  • Goal-specific selection
    • High risk environment with little space or time for error
    • Determine baseline aptitude and profile
    • Minimise risk
    • 80% of fatal aircraft accidents are caused by human error
    • High cost of flying training
    • Screen out high risk candidates
    • Imperative that right candidates are being selected to be trained as military pilots
potential index battery functional analysis
POTENTIAL INDEX BATTERY FUNCTIONAL ANALYSIS
  • Hand-eye co-ordination
  • Conceptualisation (Spatial orientation)
  • Judgement
  • Decisiveness
  • Observance
  • Adaptability
  • Calculations
  • Memory
  • Linguistic proficiency
  • Analytical thinking
aptitude tests
APTITUDE TESTS
  • Intellectual ability
    • Non-verbal reasoning
    • Verbal reasoning
  • Mathematical ability
  • Spatial orientation
  • Language Test
  • Psycho-motor Test (Dover Vienna System)
  • Determination Test
  • Mental Health Screening Test
final selection
FINAL SELECTION
  • Flight Medical
  • Fitness
  • Leadership
  • Personality
  • Motivation for Flying
  • Selection Board Criteria
  • Career Orientation
  • Foundation Training Course
  • Scholastic Achievement
  • Psycho-motor abilities
dover systems computerised skills assessments vienna
DOVER SYSTEMSCOMPUTERISED SKILLS ASSESSMENTS (VIENNA)
  • Flight Crew Selection
  • Military Officer/Driver
  • Air Traffic/Combat control
  • Naval Operations
  • Weapon Delivery Skills
skills testing proper selection
SKILLS TESTINGPROPER SELECTION
  • Provides most suitable candidates
  • Identifies candidates with the required skills
  • Lessens the learning time
  • Reduces expenditure on poorly trainable candidates
test administration
TEST ADMINISTRATION
  • Test Administration
    • Computer-aided implementation of tests guarantees that all instructions as well as the presentation of stimuli are equal for all subjects and independent of the test administrator.
  • Evaluation
    • Registration of data and comparison of norm samples is carried out automatically by the computer, thus the possibility of miscalculation is eliminated.
  • Interpretation
    • As the test is a standardised performance test, interpretation objectivity is evident.
application objective evaluations
APPLICATIONOBJECTIVE EVALUATIONS
  • Reaction Times
  • Perception Skills (Both Visual and Auditory)
  • Stress Coping
  • Decision-making Abilities
  • Communication Skills
  • Short and Long-term Memory
  • Co-ordination Skills
  • Learning Curve / Trainability
  • Attitudes
  • Levels of Aggression
validity
VALIDITY
  • Culture-free and low-cost
  • Highly valid and reliable assessment of piloting skills, driver skills as well as other military-related skills
  • Training (military) and development (civil) can create an ideal pool of candidates as well as identify special needs of candidates
  • The Dover system has 20 years' experience in psychomotor selection in developing countries
  • Based in South Africa with assessments done worldwide by arrangement
validity cont
VALIDITY (cont.)
  • The system has been widely used within the African context for both military and civil purposes
  • Lesotho, Rwanda and Malawi are among the countries that have used the system in the selection of flight and other military personnel
  • The system is also employed in conjunction with premier air schools such as 43 Air School and the Flight Training Centre
  • These schools select and train candidates from SADC countries as well as other African countries such as Egypt and Kenya
validity cont183
VALIDITY (cont.)
  • The system has been involved in the selection of previously disadvantaged candidates for pilot training sponsored by the Aviation Training and Development Foundation (ATDF)
  • This was done on a nationwide basis and candidates were selected for the feeder airlines SA Express and Airlink
  • The system was also successful in a selection programme for previously disadvantaged community members and in the British Airways Pilot Development Program
benchmarking 1
BENCHMARKING (1)
  • UNITED KINGDOM
    • Officers and Aircrew Selection Centre.
    • Part 1 Selection Procedure:
    • Aptitude Testing
      • Verbal Reasoning: Interpretation and use of written or spoken information.
      • Numerical Reasoning: Interpretation and use of numerical information.
      • Capacity: Dealing with multiple tasks involving aural and/or visual information, concentrating, noting changes, paying attention to detail.
      • Spatial Ability: Mental visualisation and orientation.
benchmarking 2
BENCHMARKING (2)
    • Work Rate: Performing tasks quickly and accurately
    • Psychomotor: Co-ordination of eye-hand-foot.
    • English Test: Written English Language skills.
  • Medical Examination: Determine fitness.
  • Interview:
    • Appearance and bearing.
    • Manner.
    • Speech and Powers of Expression.
    • Activities and Interests.
    • Academic Level/Potential
    • Physical Level/Potential.
    • Awareness.
    • Motivation.
    • Overall Impact.
benchmarking 3
BENCHMARKING (3)
  • Part 2 Selection Procedure:
    • Discussion Exercise.
    • “Leaderless Exercise”.
    • Group Planning Exercise.
    • Individual Problem Exercise.
    • Command Situation Exercise.
  • Part 3 Selection Procedure:
    • Debrief
  • Part 4 Selection Procedure:
    • Integration of rating scores.
    • Feedback.
benchmarking 4
BENCHMARKING (4)
  • INDIA
    • Non-verbal test.
    • Personality test.
    • Clinical Screening.
    • Pilot Aptitude Test.
    • Simulation Exercises.
    • Physical Fitness Test.
    • Group Exercises.
    • Interview.
benchmarking 5
BENCHMARKING (5)
  • ISRAEL
    • Pilot Evaluation System (PES).
    • Standardised mass screening of pilot training candidates.
    • Simulated “real” conditions.
    • Identify “pilot training” candidates who possess the ability to focus their attention on a multitude of competing tasks while prioritising a steady stream of incoming data.
    • Scientifically based quantitative evaluation of performance of pilot candidates.
    • Simulated cockpit generates flight scenarios.
benchmarking 6
BENCHMARKING (6)
  • PAKISTAN
    • Selection System
      • Intelligence Test.
      • Academic Test.
      • Psychological Test.
      • Physical Test.
      • Interview.
      • Flying Aptitude Test.
    • Psychological Dimensions
      • Sentence completion.
      • Word association.
      • Thematic Apperception.
      • Self Description.
benchmarking 7
BENCHMARKING (7)
  • Group Exercises
    • Group discussion
    • Group planning
    • Half group task
    • Command task
    • Progressive group task
    • Individual obstacles
benchmarking 8
BENCHMARKING (8)
  • Flying Aptitude Test (FAT)
    • Cognitive component
      • Instrument Comprehension (Spatial Orientation)
      • Vigilance (Cognition, Concentration, Attention)
      • Digit recall (Mental sharpness, Memory, Speed)
      • Attention Diagnostic Method
      • Defence Mechanism Test
    • Psychomotor component
      • Fly through (sharpness, Eye-hand coordination)
      • Target flying (Reflex action, Eye-hand-foot coordination)
benchmarking 9
BENCHMARKING (9)
  • GERMANY
    • Comprehensive Test Battery
      • Inductive reasoning
      • Spatial ability
      • Attention
      • Reactive capacity
      • Verbal and Visual Memory
      • Sensomotor coordination
    • Flight-simulator
benchmarking 10
BENCHMARKING (10)
  • SINGAPORE
    • Computerised Aptitude Test Battery
    • Psychomotor tasks
      • Hand-eye-foot coordination
      • Pursuit tracking
    • Cognitive tasks
      • Numerical and mechanical reasoning
      • System operation
challenges
CHALLENGES
  • Targeted recruitment
  • Identification of candidates at earlier stages
psychomotor ability vienna test system vts

PSYCHOMOTOR ABILITY:VIENNA TEST SYSTEM (VTS)

Research Overview on the use of the VTS in the SAAF Pilot Selection Test Battery

presentation agenda
PRESENTATION AGENDA
  • Background
    • SAAF Pilot Selection Battery – The Role of MPI
    • Vienna Test System
  • Practical Issues wrt SAAF Pilot selection
    • Recruitment
    • Representation
    • High failure rate on VTS
  • Practical Demonstration of the VTS
    • Interpreting Results
presentation agenda197
PRESENTATION AGENDA
  • Exploratory Research
    • The Three Clusters of Applicants
    • Information Processing: Coping Strategy, Audio Deficits and Concerns
    • Age Differences in VTS performance
    • Other Findings of Interest
the role of mpi
THE ROLE OF MPI
  • To make recommendations to the SAAF as to the psychological status of potential SAAF pilots in terms of
    • Basic Aviation related aptitudes
    • Mental status of applicants
saaf pilot profile
SAAF PILOT PROFILE
  • Based on a scientific Job Profile Analysis (JPI)
    • Professional Pilot Profile
      • Intellect, Aptitude, Language proficiency, Cognitive functioning under different STRESS situations
    • Professional Soldier Profile
      • Officer in the SANDF
    • Needs to be updated
      • Leadership, endurance (concentration ability) and realistic perceptions of flying
the process
THE PROCESS
  • Seven different Psychological tests administered over two days
    • Aptitude:
      • Intellectual ability
        • Non-Verbal
        • Verbal
      • Mathematical Ability
      • Spatial Orientation Ability
the process 2
THE PROCESS (2)
  • Language Proficiency
  • Personality Test
  • Psychomotor Test (Computerised)
    • Cognitive Functioning under Stress
    • Time and movement anticipation
    • Two Hand Coordination
  • Biographical Questionnaire
  • Clinical (psychopathology) Screening Test
  • Structured Clinical Interview (IAM)
cut off stages
CUT-OFF STAGES
  • Stage 1: Aptitudes and Language Proficiency
  • Stage 2: Psychomotor Test (Dover-Vienna Tests)
  • Stage 3: Clinical Assessment (a minimal staged-filter sketch follows below)
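To show how staged cut-offs work in principle, here is a minimal sketch that passes only surviving candidates to the next stage. The stage names follow the slide above, but the cut-off values, score names and candidates are invented.

# Illustrative staged cut-off filtering (invented thresholds and candidates).
candidates = [
    {"name": "A", "aptitude": 65, "language": 70, "psychomotor": 58, "clinical_ok": True},
    {"name": "B", "aptitude": 48, "language": 66, "psychomotor": 61, "clinical_ok": True},
    {"name": "C", "aptitude": 72, "language": 75, "psychomotor": 39, "clinical_ok": True},
    {"name": "D", "aptitude": 68, "language": 71, "psychomotor": 63, "clinical_ok": False},
]

stages = [
    ("Stage 1: aptitude and language", lambda c: c["aptitude"] >= 50 and c["language"] >= 50),
    ("Stage 2: psychomotor (DT)",      lambda c: c["psychomotor"] >= 40),
    ("Stage 3: clinical assessment",   lambda c: c["clinical_ok"]),
]

remaining = candidates
for stage_name, passes in stages:
    remaining = [c for c in remaining if passes(c)]    # only survivors reach the next stage
    print(stage_name, "->", [c["name"] for c in remaining])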
the vienna test system vts204
THE VIENNA TEST SYSTEM (VTS)
  • Consists of numerous subtests (27+)
  • For Selection purposes:
    • Determination Unit (DT)
  • For Research purposes:
    • Cognitrone Test
    • Two-hand coordination Test
    • Pilot Spatial Test
    • Time-Movement Anticipation Test
  • Need to include Multi-tasking subtest
description of the test dt
DESCRIPTION OF THE TEST (DT)
  • Definition 1:
    • The Determination Test measures behavior under different levels of psychological and physiological stress, since the high frequency of signals puts almost everyone into an overcharge situation (Kisser, 1986:226)
description of the test dt206
DESCRIPTION OF THE TEST (DT)
  • Definition 2:
    • Hoyos (1969) defines stress as the incapacity of a highly motivated individual to find correct responses in a situation of extreme stimulus constellation (sic)
description of the test dt207
DESCRIPTION OF THE TEST (DT)
  • Definition 3:
    • Stress tolerance is the capacity of a person to resist a stimulus, i.e. to activate reactions in a certain situation in order to cope with it in the best way possible (Kisser et al., 1986, p226)
  • The Key Issue:
    • Coping Strategies for the Information Overload created in a Stress situation, whether natural or artificially induced
application of determination test
APPLICATION OF DETERMINATION TEST
  • Measurement of reactive stress tolerance
  • Ability to give sustained multiple-choice reactions to rapidly changing stimuli (a minimal response-classification sketch follows this list)
  • Detect attention deficit disorders and colour blindness
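To make "sustained multiple-choice reactions to rapidly changing stimuli" concrete, the sketch below classifies each simulated response into the four categories used in the interpretation slides further on (correct on time, delayed, omitted, incorrect). It is an illustration only, not the Vienna Test System software, and the assignment rule is simplified to "press the button named after the stimulus".

# Illustrative classification of responses in a DT-like task (invented trials).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Trial:
    stimulus: str              # e.g. a colour or an acoustic signal
    response: Optional[str]    # button pressed, or None if no response was given
    reaction_ms: float         # time from stimulus onset to the response
    window_ms: float           # presentation time allowed for this stimulus

def classify(trial: Trial) -> str:
    if trial.response is None:
        return "omitted"
    if trial.response != trial.stimulus:      # simplified assignment rule
        return "incorrect"
    return "correct on time" if trial.reaction_ms <= trial.window_ms else "delayed"

trials = [
    Trial("red", "red", 620, 800),
    Trial("blue", "blue", 950, 800),
    Trial("tone_high", None, 0, 800),
    Trial("green", "yellow", 540, 800),
]

counts = {}
for t in trials:
    label = classify(t)
    counts[label] = counts.get(label, 0) + 1
print(counts)   # {'correct on time': 1, 'delayed': 1, 'omitted': 1, 'incorrect': 1}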
theoretical background
THEORETICAL BACKGROUND
  • The DT measures Reactive Stress Tolerance and related reaction speed ito:
    • Discrimination of colours and acoustic signals,
    • The memorization of the relevant characteristics of stimulus configurations and response buttons as well as assignment rules,
    • The selection of the relevant reactions according to assignment rules
    • Continuous, sustained rapid and varied reactions to rapidly changing stimuli
objectivity
OBJECTIVITY
  • Test Administration
    • The computer-aided implementation of tests guarantees that the instructions as well as the presentation of stimuli are equal for all subjects and independent of the test administrator
  • Evaluation
    • Registration of data and comparison of norm samples is carried out automatically by the computer, thus the possibility of miscalculation is eliminated
  • Interpretation
    • As the test is a standardized performance test, interpretation objectivity is evident (Lienert, 1961)
evaluation
EVALUATION
  • Reliability
    • Internal Consistency (Cronbach Alpha) ranges between 0.86 and 0.99
    • Split-Half
      • Between 0.86 and 0.99
  • Validity
    • Construct Validity
      • Correlation between 0.3 and 0.8 with similar tests
    • Predictive Validity
      • Correlates with SAAF pupil pilot flying scores (a minimal correlation sketch follows below)
  • Norms
    • International
    • Local (Pilots)
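The predictive-validity claim above is, in essence, a correlation between DT results and later training outcomes. The sketch below shows that calculation in its simplest form; the scores are invented and bear no relation to actual SAAF data.

# Illustrative predictive-validity check: correlate test scores with training outcomes.
import numpy as np

dt_scores = np.array([52, 61, 45, 70, 58, 49])       # invented DT results
flying_scores = np.array([60, 72, 55, 78, 66, 52])   # invented pupil-pilot flying marks

r = np.corrcoef(dt_scores, flying_scores)[0, 1]
print(f"Predictive validity (Pearson r): {r:.2f}")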
interpretation of test results
INTERPRETATION OF TEST RESULTS
  • Correct reactions on time
    • This indicates how well a subject adapts to a pre-set presentation duration of stimuli.
    • This ability (to adapt) depends on two factors:
      • Subjects have to pace their reaction time in such a way that they do not get off the track
      • They have to make sure that there is enough time in between each stimulus to make the right decision
    • T-scores above 60 and below 40 (or percentiles above 84 and below 16) demonstrate a development of this ability above or below average, respectively (a small conversion sketch follows this slide)
    • Poor performance is indicated by:
      • A low score of correct responses on time (compared to the norm sample)
      • A proportional decrease in the number of correct responses on time when the presentation time of the stimuli is diminished
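As a rough illustration of how a raw "correct reactions on time" count is set against a norm sample to give the T-scores and percentiles quoted above, the sketch below assumes a normally distributed norm sample with an invented mean and standard deviation; it is not the Vienna scoring routine.

# Illustrative raw-score to T-score/percentile conversion against an assumed norm sample.
from statistics import NormalDist

NORM_MEAN, NORM_SD = 145.0, 20.0    # invented norm-sample parameters for "correct on time"

def to_t_score(raw):
    z = (raw - NORM_MEAN) / NORM_SD
    return 50 + 10 * z               # T-scale: mean 50, standard deviation 10

def to_percentile(raw):
    z = (raw - NORM_MEAN) / NORM_SD
    return 100 * NormalDist().cdf(z)

def band(t_score):
    if t_score > 60:
        return "above average"
    if t_score < 40:
        return "below average"
    return "average range"

for raw in (170, 145, 118):
    t = to_t_score(raw)
    print(f"raw={raw}  T={t:.0f}  percentile={to_percentile(raw):.0f}  ({band(t)})")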
interpretation
INTERPRETATION …..
  • Delayed and Omitted reactions
    • Usually, when the presentation time of the stimuli is decreased, a growing number of reactions are first delayed, then omitted.
      • This results from the fact that the speed at which the stimuli are presented accounts for the most difficult condition of the test
    • The initial increase in the number of delayed reactions versus omitted reactions is a normal function of our attention
      • This function guarantees that a reaction is screened from external distractions (in this case the interruption of the stimulus presentation) and thus is carried out even though a new stimulus appears.
    • A high number of omitted reactions (T-score below 40 due to reversed scale) combined with a low number of delayed reactions (T-score under 40) would therefore indicate attention deficits (see the small flagging sketch below).
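The rule in the last bullet can be written down directly. The sketch below assumes T-scores for the two variables are already available; the thresholds are the ones quoted on the slide and the example values are invented.

# Illustrative flag for the omitted/delayed pattern described above (invented T-scores).
def attention_deficit_flag(t_omitted: float, t_delayed: float) -> bool:
    """Many omissions (T < 40 on the reversed scale) combined with few delayed
    reactions (T < 40) suggests an attention deficit."""
    return t_omitted < 40 and t_delayed < 40

print(attention_deficit_flag(t_omitted=35, t_delayed=37))   # True  - many omissions, few delays
print(attention_deficit_flag(t_omitted=35, t_delayed=62))   # False - omissions accompanied by delays (normal attention function)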
interpretation214
INTERPRETATION …
  • Incorrect Reactions
    • Incorrect reactions indicate the tendency to confuse stimuli
    • The Response Matrix can locate where such confusions accumulate
    • In contrast to delayed and omitted reactions, incorrect reactions are not so much an indicator of the difficulty of the test.
    • Usually, the number of incorrect responses increases only slightly when the presentation time of the stimuli is decreased
    • Incorrect reactions occur mainly because the subjects are unable to screen appropriate responses from concurrent and irrelevant external distractions
      • Thus, the variable “Incorrect Reactions” is closely linked to any attention deficits
    • The number of incorrect reactions indicates the subject’s tendency to give a rapid response at the very last moment under the pressure of limited presentation time
interpretation the three phases

INTERPRETATION: THE THREE PHASES

[Chart: DT results are interpreted across three phases – Potential, Stress and Recovery – plus an Overall score; stimulus presentation time is less than a second.]

correct reactions on time

CORRECT REACTIONS ON TIME

[Chart: number of correct reactions on time for the Potential, Stress, Recovery and Overall phases, by selection outcome group; y-axis: number of stimuli (0–200).]

Group          On Time    Interval 1   Interval 2   Interval 3
Total group    139.324    166.838      106.676      144.459
Accepted       165.778    177.000      151.583      168.750
Reservation    150.333    173.167      121.500      156.333
Not accepted   119.140    158.421       73.632      125.368

omitted responses a flight safety risk

OMITTED RESPONSES: A FLIGHT SAFETY RISK

[Chart: number of omitted responses (number of stimuli) across the Potential, Stress, Recovery and Overall phases.]

delayed responses

DELAYED RESPONSES

[Chart: number of delayed responses (number of stimuli) across the Potential, Stress, Recovery and Overall phases.]

incorrect responses

INCORRECT RESPONSES

[Chart: number of incorrect responses (number of stimuli) across the Potential, Stress, Recovery and Overall phases.]
