
Page 1

Insuring Content Validity in a Litigious Environment

2010 IPAC Conference – Making the Case for Assessment

Newport Beach, CA

Kyle E. Brink

St. John Fisher College

Jeffrey L. Crenshaw

Brian L. Bellenger

Centrus Personnel Solutions

Page 2

Types of Validity Evidence

Uniform Guidelines
• Types of validity studies
  – Content
  – Criterion‐oriented
    • Predictive
    • Concurrent
  – Construct
    • Convergent
    • Discriminant

APA Standards & SIOP Principles
• Sources of validity evidence
  – Test content
  – Relations to other variables
    • Test‐criterion relationship (predictive vs. concurrent)
    • Convergent & discriminant
    • Validity generalization
  – Response processes
  – Internal structure of the test
  – Consequences of testing

Page 3

Content Validity

• Data showing that the content of the selection procedure is representative of important aspects of performance on the job for which the candidates are to be evaluated. (UGESP)

• Evidence based on test content may include logical or empirical analyses that compare the adequacy of the match between test content and work content, worker requirements, or outcomes of the job. Test content includes the questions, tasks, format, and wording of questions, response formats, and guidelines regarding administration and scoring of the test. (Principles)

Page 4

Content Validation Steps

• Job analysis
• Test development
• Test administration
• Test assessment
• Data analysis & scoring

Page 5

Required Documentation (UGESP)

• Users of selection procedures should maintain and have available for each job information on adverse impact (a computational sketch follows this list)
• If the selection process has adverse impact, must maintain and have available validity evidence including:
  – User(s), location(s), and date(s) of study
  – Problem and setting
  – Job analysis – content of the job
  – Selection procedure and its content
  – Relationship between the selection procedure and the job
  – Alternative procedures investigated
  – Uses and applications
  – Contact person
  – Accuracy and completeness
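The adverse-impact trigger above is typically evaluated with the four-fifths rule described in the Uniform Guidelines: a selection rate for any group that is less than 80% of the rate for the group with the highest rate is generally regarded as evidence of adverse impact. A minimal sketch of that computation (the function name and example figures are hypothetical):

```python
def adverse_impact_ratio(hired_focal: int, applicants_focal: int,
                         hired_reference: int, applicants_reference: int) -> float:
    """Ratio of the focal group's selection rate to the reference group's
    (the group with the highest selection rate)."""
    focal_rate = hired_focal / applicants_focal
    reference_rate = hired_reference / applicants_reference
    return focal_rate / reference_rate

# Hypothetical example: 30 of 100 focal-group applicants hired vs. 50 of 100
ratio = adverse_impact_ratio(30, 100, 50, 100)  # 0.60
print(ratio < 0.80)  # True -> the four-fifths rule flags potential adverse impact
```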

Page 6

Required Documentation (Principles)

Reports of validation efforts should include enough detail to enable a researcher competent in personnel selection to know what was done, to draw independent conclusions in evaluating the research, and to replicate the study. The following information should be included:

• Identifying Information
• Statement of Purpose
• Analysis of Work
• Search for Alternative Selection Procedures
• Selection Procedures
• Relationship to Work Requirements
• Criterion Measures (When Applicable)
• Research Sample
• Results
• Scoring and Transformation of Raw Scores
• Normative Information
• Recommendations
• Caution Regarding Interpretations
• References

Page 7

Job Analysis Defined

• Systematic
• process of discovery of the nature of a job
• by dividing it into smaller units,
• where the process results in one or more written products with the goal of describing what is done in the job or what capabilities are needed to effectively perform the job
  – Brannick, Levine & Morgeson (2007)

Page 8

Job Analysis Overview

• Methods of collecting data
• Sources of data
• Content & unit of analysis

Page 9

Job Analysis Methods

• Methods of gathering job analysis data
  – Review of prior information
  – Observation
  – Interview
  – Panel/focus group
  – Questionnaire

Page 10

Job Analysis Methods

• What are the advantages and disadvantages of using multiple methods of job analysis for a particular job?
• Utilizing multiple methods strengthens the job analysis and increases legal defensibility (Thompson & Thompson, 1982; Veres, Lahey & Buckly, 1987)

Page 11

Job Analysis Methods

Job description/specification
• Product of job analysis
  – Do not provide the depth or breadth of information needed to support content validity
• Are often out of date (Barrett, 1998; Noe, Hollenbeck, Gerhart & Wright, 2008)
• Rarely have documentation regarding their development
• Not a substitute for job analysis!

Page 12

Job Analysis Sources

• Job analyst
• Job incumbents
• Supervisors
• Subject matter experts (SMEs)
• How many?

Page 13

Job Analysis Content & Unit of Analysis

• Work behaviors (duties) and/or tasks
  – Not too general, not too specific, but just right
• Knowledge
  – Operationally defined as a body of learned information required for performing work behaviors
• Skills & abilities
  – Operationally defined in terms of work behavior

Page 14

Identifying KSAOs

• Always start job analysis with O*NET
  – See O*NET Content Model
    • http://www.onetcenter.org/dl_files/ContentModel_DetailedDesc.pdf
  – Abilities: based on Fleishman taxonomy
  – Knowledge & skills: not sufficiently specific
• Fill in gaps with job analysis methods/sources

Page 15

Rating Behaviors & KSAs

• Work behaviors and/or tasks
  – Importance/criticality
  – Frequency
  – Needed upon entry
• Knowledge, skills & abilities
  – Importance/criticality
  – Needed upon entry
  – Distinguishing value
  – Memorization (knowledge only)
• Behavior to KSA linkage

Page 16

Common Issues in Job Analysis

• Jobs evolve
• Assuming the job description/specification is accurate, current, and sufficient
• Failure to sufficiently customize off‐the‐shelf job analyses
• Failure to rely on multiple methods/sources
• Failure to gather linkage, needed-at-entry, and memorization ratings
• SME capabilities
• SME motivation & resistance

Page 17

Test Development

• What do we mean by a “valid” test?
  – Accuracy of prediction
    • The extent to which performance on the measure is associated with performance on the job
  – Accuracy of measurement
    • Degree to which a measure truly measures the attribute it is intended to measure

Source: Heneman & Judge (2009)

Page 18

Accuracy of Measurement

Source: Heneman & Judge (2009)

Page 19

Test Development

• What do we mean by a “content valid” test?
  – Representative of the job
  – Resembles the job
• How do we develop a content valid test?
  – Review job analysis results
  – Develop a selection plan
  – Gather critical incidents
  – Develop exercise/questions, benchmarks, and rating scale

Page 20

Test Development

• Review job analysis results (duties/tasks & KSAs)
  – Only assess what is important and needed upon entry
  – If assessing knowledge, appropriately use open vs. closed book testing depending on whether knowledge is referenced vs. memorized
    • Failing to assess knowledge that can be referenced is a systematic bias in content

Page 21

Test Development

• Develop the selection plan
  – What are the most important KSAs?
  – What is the best method (alternative measures) for assessing these KSAs?
    • Adverse impact
    • Validity
    • Face validity/applicant reactions
    • Correlations with other predictors/incremental validity
    • Utility
    • Relationship to the job duties
    • Others?

Page 22

Test Development

• Gather critical incidents
  1. Choose a duty or KSA upon which to base the critical incident (e.g., customer service)
  2. What was the critical incident?
     – While waiting tables at a restaurant, a server did not place a customer’s order into the computer system right away. After 15 minutes the customer questioned the server as to why their food had not arrived. At this point the server realized the mistake. As a consequence, the customer left no tip for the server.

Page 23

Test Development

• Gather critical incidents (continued)
  3. What was the situation leading up to the critical incident?
     – The server was waiting tables at a restaurant. The restaurant was very busy and two of the scheduled servers had called in sick that morning.
  4. What were the actions or behaviors of the person of interest in the incident?
     – The server did not place the customer’s order immediately.
  5. What were the results or outcomes of those actions?
     – The server angered the customer and received no tip.
  6. What level of performance does this event represent?

Page 24

Test Development

• Gather critical incidents (continued)
  7. Develop a question/exercise based on this critical incident
     – Suppose you are waiting tables at a restaurant and you forgot to place a customer’s order. After 15 minutes, the customer asks you why the food has not arrived. At this point you realize your mistake. What would you do? What would you say to the customer?
  8. Develop a scoring rubric/benchmarks based on this question and critical incident

Page 25

Test Development

• Develop questions, benchmarks, and rating scales
  – Make sure they are: related to the job, relevant to the scenario/question, related to the KSA being measured, and behaviorally‐based
  – Examples:
    • Apologized for the error
    • Confirmed that they had the order correct
    • Arranged for a free appetizer to be brought out while the customers waited
• Once developed:
  – Have SMEs review the test
  – Have SMEs document (e.g., via ratings) their judgments regarding the validity of the exam (one option is sketched after this list)
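One established way for SMEs to document such validity judgments, offered here as an illustration rather than as the presenters' method, is Lawshe's content validity ratio, computed per test question from "essential" ratings:

```latex
% Lawshe's content validity ratio for a single test question, where
% n_e = number of SMEs rating the question essential to the job and
% N = total number of SMEs on the panel.
CVR = \frac{n_e - N/2}{N/2}
% CVR ranges from -1 to +1; values near +1 indicate broad SME agreement
% that the question measures essential job content.
```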

Page 26

Test Administration

• Lack of consistency in treatment of applicants is a major factor contributing to discrimination
• Must be standardized
  – Equal opportunity to prepare for test
  – Equal access to preparation/orientation materials
  – Identical test content
  – Identical test administration
  – Opportunity for appeal
• Problems
  – Unstructured interview
  – Orientation only during 1 shift
  – Variability in role‐players
  – Variability in interviewers/administrators

Page 27

Test Administration

• Video and computer interviews offer
  – Cost savings
  – Increased standardization

Page 28

Situational Interview Question

• Suppose you are a Bailiff and the trial you are working at is on a short recess. You hear voices beginning to get louder and louder outside the back of the courtroom, so you walk over to see what is going on. You see that a man and woman are arguing loudly. The woman is accusing the man of lying and trying to get the kids. The man is accusing the woman of trying to take his money. The exchange is very heated and they are saying mean things. You also observe the woman shove the man.

• After observing this situation, what would you do?

Page 29

Page 30

Test Assessment

• Ensure standardized assessment
  – Identical scoring rubric/evaluation criteria across all candidates
• Ensure reliable and accurate ratings
  – Assessor training
    • How to assess candidates
    • Familiarity with questions, rating scales
    • Consistency in evaluation
  – Procedural issues
    • Panels (number of assessors)
    • Internal v. external
    • Demographics of assessors?
    • Rotation of assessors?
    • Quality checks?

Page 31

Data Analysis & Scoring

• Minimize human error
  – Data entry, data analysis, data interpretation
• Report
  – Central tendency & dispersion
  – Item analyses (difficulty, discrimination; sketched after this list)
  – Group differences & adverse impact
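As a concrete illustration of the item analyses listed above, a minimal sketch assuming dichotomously scored (0/1) items; the function names and data layout are illustrative, not from the slides:

```python
import numpy as np

def item_difficulty(responses: np.ndarray) -> np.ndarray:
    """Difficulty (p value) of each item: the proportion of candidates
    answering correctly, for a (candidates x items) matrix of 0/1 scores."""
    return responses.mean(axis=0)

def item_discrimination(responses: np.ndarray) -> np.ndarray:
    """Discrimination as the corrected item-total correlation: each item
    against the total score computed from the remaining items."""
    total = responses.sum(axis=1)
    return np.array([
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(responses.shape[1])
    ])
```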

Page 32

Data Analysis & Scoring

• Reliability: definition
  – The degree to which a measure of physical or cognitive abilities, or traits, is free from random error
    • Actual score = true score + error (formalized after this list)
  – Consistency, dependability, or stability of measurement of an attribute
    • A measure is reliable to the extent it provides a consistent set of scores to represent an attribute
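In standard classical test theory notation (a textbook formulation, not spelled out on the slide), the "actual score = true score + error" decomposition and the resulting definition of reliability are:

```latex
% Observed score decomposes into true score and random error:
X = T + E, \qquad \sigma_X^2 = \sigma_T^2 + \sigma_E^2
% Reliability is the proportion of observed-score variance that is
% true-score variance:
r_{xx} = \frac{\sigma_T^2}{\sigma_X^2}
```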

Page 33

Data Analysis & Scoring

• Sources of unreliability
  – Test content
    • Item sampling
  – Test structure
    • Chance response tendencies, guessing, item format
  – Standardization of test administration
    • Standardization of measure
    • Situational or environmental variables
    • Person variables
  – Standardization of assessment
    • Rater error

Page 34

Data Analysis & Scoring

• Estimating reliability (see the sketch after this list)
  – Test‐retest reliability
    • Concerned with stability of measurement
  – Coefficient alpha
    • Concerned with interrelationships among items
  – Interrater agreement
    • Concerned with agreement among raters
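A minimal sketch of how two of these estimates might be computed; the function names and data layout are illustrative assumptions:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for a (candidates x items) matrix of item scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

def test_retest(scores_time1: np.ndarray, scores_time2: np.ndarray) -> float:
    """Test-retest reliability: correlation of the same candidates'
    total scores across two administrations."""
    return float(np.corrcoef(scores_time1, scores_time2)[0, 1])
```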

Page 35

Data Analysis & Scoring

• Reliability standards (Nunnally & Bernstein, 1994)
  – Early stages of research: rxx > .70
  – If making important decisions with respect to specific test scores (“high stakes testing”)
    • rxx > .90 is the bare minimum
    • rxx > .95 should be considered a desirable standard

Page 36

Data Analysis & Scoring

• Reliability implications
  – As reliability increases,
    • More confidence in individuals’ obtained scores
    • More confidence in differences between individuals’ scores
• Proof (FYI):
  – Standard error of measurement (formula after this list)
    • Since only one score is obtained from an applicant, the critical issue is how accurate the score is as an indicator of an applicant’s true level of knowledge/ability
    • As reliability increases, SEM decreases
    • There is a 95% chance a person’s true score falls within ± 2 SEM
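The standard error of measurement behind these points takes the standard textbook form (the slide states the implications but not the formula):

```latex
% SD_X is the standard deviation of observed scores and r_{xx} the
% reliability of the test:
SEM = SD_X \sqrt{1 - r_{xx}}
% As r_{xx} -> 1, SEM -> 0. An approximate 95% confidence band for a
% candidate's true score is the observed score X \pm 2 \cdot SEM
% (more precisely, \pm 1.96 \cdot SEM).
```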

Page 37

Data Analysis & Scoring

• Reliability implications
  – Relationship to validity
    • Reliability of a measure places an upper limit on the possible validity of a measure
    • Reliability does not guarantee validity – it only makes it possible
    • A highly reliable measure is not necessarily valid
  – Proof (FYI; formula after this list):
    • rxy represents the empirically determined validity coefficient
    • rx∞y∞ represents the correlation between theoretical true predictor scores (i.e., the test) and true criterion scores (e.g., job performance)
    • rxx represents the reliability of the predictor
    • ryy represents the reliability of the criterion
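With those symbols, the standard correction-for-attenuation relationship (the formula itself appears to have been lost in extraction) is:

```latex
r_{x_\infty y_\infty} = \frac{r_{xy}}{\sqrt{r_{xx}\, r_{yy}}}
% Since the true-score correlation cannot exceed 1, rearranging gives the
% upper bound the slide describes: r_{xy} <= sqrt(r_{xx} * r_{yy}).
```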

Page 38

Reliability & Validity

Page 39

Data Analysis & Scoring

• Methods for combining and weighting items and predictors
  – When should a compensatory model be used? When should a multiple hurdles model be used? (see the sketch after this list)
• Critical vs. cut score
  – What are the positive and negative consequences of using a high predictor cut score?
• Top‐down vs. random vs. banding
  – What are the advantages of ranking vs. random selection?
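To make the compensatory vs. multiple-hurdles contrast concrete, a minimal sketch; the predictors, weights, and cut scores are hypothetical:

```python
import numpy as np

# Hypothetical standardized scores for four candidates on two predictors
written = np.array([1.2, 0.4, -0.3, 0.9])
interview = np.array([-0.5, 1.1, 0.8, 0.2])

# Compensatory model: candidates are ranked on a weighted composite, so a
# strong interview can offset a weak written score (illustrative weights).
composite = 0.6 * written + 0.4 * interview
rank_order = np.argsort(-composite)  # top-down ranking, best candidate first

# Multiple-hurdles model: a cut score must be cleared on every predictor;
# no compensation across predictors (illustrative cut score of 0.0).
passed_both = (written >= 0.0) & (interview >= 0.0)

print(rank_order)   # candidate indices in rank order
print(passed_both)  # which candidates survive both hurdles
```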

Page 40

Questions?

Page 41

References & Source Materials

• American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Psychological Association.
• Barrett, R. S. (1996). Fair employment strategies in human resource management. Westport, CT: Quorum Books.
• Barrett, R. S. (1998). Challenging the myths of fair employment practices. Westport, CT: Quorum Books.
• Biddle, D. (2006). Adverse impact and test validation (2nd ed.). Hampshire, England: Gower Publishing Limited.
• Brannick, M. T., Levine, E. L., & Morgeson, F. P. (2007). Job and work analysis: Methods, research, and applications for human resource management (2nd ed.). Thousand Oaks, CA: Sage Publications.
• Cascio, W. F. (1998). Applied psychology in human resource management (5th ed.). Upper Saddle River, NJ: Prentice Hall.
• Equal Employment Opportunity Commission, Civil Service Commission, Department of Labor, & Department of Justice. (1978). Uniform guidelines on employee selection procedures. Federal Register, 43, 38290‐38315.
• Gatewood, R. D., Feild, H. S., & Barrick, M. (2008). Human resource selection (6th ed.). Mason, OH: South‐Western Cengage Learning.

Page 42

References & Source Materials

• Guion, R. M. (1998). Assessment, measurement, and prediction for personnel decisions. Mahwah, NJ: Lawrence Erlbaum Associates.
• Heneman, H. G., & Judge, T. A. (2009). Staffing organizations (6th ed.). Madison, WI: Mendota House/Irwin.
• Noe, R. A., Hollenbeck, J., Gerhart, B., & Wright, P. (2008). Human resource management: Gaining a competitive advantage (6th ed.). New York: McGraw‐Hill.
• Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York: McGraw‐Hill.
• Phillips, J. M., & Gully, S. M. (2009). Strategic staffing. Upper Saddle River, NJ: Pearson Prentice Hall.
• Society for Industrial and Organizational Psychology, Inc. (2003). Principles for the validation and use of personnel selection procedures (4th ed.). Bowling Green, OH: Society for Industrial and Organizational Psychology, Inc.
• Thompson, D. E., & Thompson, T. A. (1982). Court standards for job analysis in test validation. Personnel Psychology, 35, 865‐874.
• Veres, J. G., Lahey, M. A., & Buckly, R. (1987). A practical rationale for using multi‐method job analyses. Public Personnel Management, 16, 153‐157.